The Jordan B. Peterson Podcast


367. You Might Already Be A Member | Dr. James Lindsay


Summary

Dr. James Lindsay is a mathematician, author, political commentator, and frequent public speaker. In this episode, Dr. Lindsay talks about the origins of Marxist ideas, how they spread across modern culture, and their impact on our understanding of the world. He also shares stories from a year of near-constant travel, including the dinner before his EU Parliament speech, when he realized his jokes were falling flat because his audience of MEPs was not fully fluent in English. He discusses the Grievance Studies Affair, the academic hoax he co-authored, and explains how Marxist-derived ideas can be used to justify the acquisition of power, and why it's important to understand how these ideas are deployed to achieve political power and influence in the modern world. If you're struggling with anxiety, depression, or another mental health condition, or simply don't know where to turn, go to Daily Wire Plus now and start watching Dr. Jordan B. Peterson's new series on depression and anxiety. With decades of experience helping patients, Dr. Peterson offers a unique understanding of why you might be feeling this way, provides a roadmap toward healing, and shows that, while the journey isn't easy, it's absolutely possible to find your way forward. If you're suffering, please know you are not alone: there is hope, and there is a path to feeling better. Let this be the first step toward the brighter future you deserve. Thank you for listening to this episode of the Jordan B. Peterson Podcast, brought to you by Daily Wire Plus.


Transcript

00:00:00.960 Hey everyone, real quick before you skip, I want to talk to you about something serious and important.
00:00:06.480 Dr. Jordan Peterson has created a new series that could be a lifeline for those battling depression and anxiety.
00:00:12.740 We know how isolating and overwhelming these conditions can be, and we wanted to take a moment to reach out to those listening who may be struggling.
00:00:20.100 With decades of experience helping patients, Dr. Peterson offers a unique understanding of why you might be feeling this way in his new series.
00:00:27.420 He provides a roadmap towards healing, showing that while the journey isn't easy, it's absolutely possible to find your way forward.
00:00:35.360 If you're suffering, please know you are not alone. There's hope, and there's a path to feeling better.
00:00:41.780 Go to Daily Wire Plus now and start watching Dr. Jordan B. Peterson on depression and anxiety.
00:00:47.460 Let this be the first step towards the brighter future you deserve.
00:00:57.420 Hello everyone watching and listening.
00:01:10.700 Today I'm speaking with author, mathematician, and political commentator Dr. James Lindsay.
00:01:16.360 We discuss Marxism, how it evolved from a singular ideology into a genus,
00:01:22.280 spawning many oppressor slash oppressed dogmas across modern culture.
00:01:27.420 Ideas such as equity, critical race theory, queer theory.
00:01:31.440 We trace these sub-Marxist doctrines back past fundamental narrative into the theological realm
00:01:37.360 and detail their utility in, what would you say, justifying the acquisition of power.
00:01:43.320 We also discuss the Grievance Studies Affair, of which Dr. Lindsay was a co-author.
00:01:48.160 So James, I kind of feel like I know you.
00:01:50.760 I follow you on Twitter and watch all the trouble you cause, or some of it anyways.
00:01:54.300 And we did do a podcast a number of years ago with Helen and Peter.
00:02:00.120 But I actually don't know you.
00:02:02.800 And so let's start with that.
00:02:04.840 I'd like to know a little bit more about you.
00:02:06.600 And let's let everybody who's watching and listening know too.
00:02:09.400 So let's say, when you're not causing trouble on Twitter, what are you involved in at the moment?
00:02:16.720 I think causing...
00:02:17.320 What does your actual life look like?
00:02:20.340 Causing trouble on the road is what I'm involved in.
00:02:22.840 Okay, okay.
00:02:23.120 Yeah.
00:02:23.360 Well, I mean, Twitter overlaps with the road because it lives in my pocket.
00:02:26.620 So, you know, that's convenient.
00:02:27.940 But I do a lot of speaking.
00:02:30.240 I speak a lot.
00:02:31.020 I've had the privilege this year of getting to speak all over the world.
00:02:33.740 I got to speak at the EU Parliament.
00:02:35.120 That was great.
00:02:36.160 Right, right, right.
00:02:37.180 Because it's really funny.
00:02:38.740 Everybody's telling me that.
00:02:39.840 And I think it's true.
00:02:40.840 This is the best public speech you've ever given, probably.
00:02:43.100 And so what a setting to hit the one that you hit.
00:02:46.060 Oh, well, congratulations on that front.
00:02:47.800 Well, it turns out that the night before, it's a silly story.
00:02:50.300 The night before, I had a smaller meeting in a restaurant with a number of the MEPs from different countries.
00:02:54.720 And I'm trying to talk.
00:02:56.200 And I tend to be humorous and speak quickly.
00:02:58.540 And this is just my style.
00:03:00.560 And then all my jokes were landing flat.
00:03:02.760 And I got kind of awkward.
00:03:03.780 And I realized they don't speak English.
00:03:05.500 Because English is not even, you know, they're not even all fully fluent.
00:03:09.500 Yeah, that's particularly rough on the joke front.
00:03:11.700 So then I had to start slowing down.
00:03:13.440 And I got really conscious of slowing down and enunciating and trying to use simpler terminology to talk about, you know, neo-Marxism and postmodernism.
00:03:22.980 It's very difficult.
00:03:23.900 So I gave this speech, and it just happened to work out.
00:03:27.200 But I've had this privilege.
00:03:28.000 So, you know, 175 flights I took last year to give you a feel for how much on the road there is.
00:03:36.060 Getting around, meeting mostly with grassroots organizations, some bigger political stuff or legislators.
00:03:41.480 But mostly moms and dads that are trying to do something in this country and around Canada as well to try to change.
00:03:49.660 Are you mostly in North America when you're traveling?
00:03:51.840 Overwhelmingly just in the U.S.
00:03:53.740 Uh-huh.
00:03:54.640 Overwhelmingly.
00:03:55.040 Uh-huh.
00:03:55.700 How did the EU invitation come about?
00:03:58.020 They reached out to me.
00:03:59.440 So I'm not exactly sure how they got connected with me.
00:04:02.540 But one of the MEPs, they put together a conference at the EU parliament to talk about—
00:04:08.440 Was that in Brussels?
00:04:09.340 It was in Brussels.
00:04:09.760 It was in Brussels.
00:04:10.920 Were you in their terrible building in Brussels?
00:04:12.600 I was.
00:04:13.400 A huge airport?
00:04:14.640 Yeah, a huge airport.
00:04:16.180 You've been, right?
00:04:17.280 Yeah.
00:04:17.640 So you know that—
00:04:18.380 The ultimate Tower of Babel.
00:04:19.540 So there's that park back behind it that has the ostriches.
00:04:22.720 You know about these statues.
00:04:23.660 So there's a small park.
00:04:25.140 It's, you know, not very big.
00:04:26.520 However many square meters.
00:04:27.460 Very small.
00:04:28.340 Grassy area.
00:04:29.340 And there's all these strange statues.
00:04:30.840 And you come on it and you say three-legged statues.
00:04:32.780 It looks like aliens.
00:04:34.080 And you get closer and you realize they're all ostriches with their heads stuck in the ground.
00:04:37.700 And I thought, how poetic is this building?
00:04:40.720 Oh, yeah.
00:04:41.100 You know?
00:04:42.800 But yeah, it was in Brussels.
00:04:44.240 And so how were you received there?
00:04:46.540 Well, it was a conservative group.
00:04:48.280 It was a center-right European party called the Identity and Democracy Foundation.
00:04:52.000 So they were very warm to me.
00:04:54.180 I actually had a handful of—there was a student group from the foundation that was also present at the EU at the same time.
00:05:02.060 And so many of them came.
00:05:03.380 Maybe over 100 of them came.
00:05:04.980 Right.
00:05:05.180 And so I got a lot of feedback, actually.
00:05:07.560 They sent me messages, as one does, on Instagram or whatever and said, you know, this was amazing.
00:05:11.800 This was great.
00:05:12.080 Yeah, well, the EU speech kind of went viral, didn't it?
00:05:14.660 It did.
00:05:15.440 Yeah, it really took off.
00:05:16.780 Actually, it kind of lurked for a month.
00:05:18.620 And then it exploded about a month later.
00:05:20.800 And it's still just going crazy right now.
00:05:24.280 And so you said you slowed down and you enunciated more carefully.
00:05:28.280 But what do you think you hit on that made it so attractive as a speech?
00:05:33.180 Well, the question of the whole conference, which is a three-meeting session that I was at the first one, was what is woke and what does it mean for Europe?
00:05:43.760 And so I tried to give, in a sense, a genealogy of woke.
00:05:48.140 And actually, a taxonomy is more accurate.
00:05:50.780 I started off by saying, well, I think that woke is, in fact, Marxism that's evolved to attack the West.
00:05:56.740 And the techniques it's using are reminiscent of Mao's cultural revolution.
00:06:00.480 And so you can say that it's Marxist or Maoist.
00:06:03.680 But then I said we can't understand that unless we understand Marxism in a bigger way.
00:06:08.800 If we focus on his economic analysis and capital, we miss the entire picture.
00:06:13.380 If we take a step back and say that he outlined an entire theory of man and the world and our behavior in it and the meaning of life and purpose, telos for our being, which is to transform the world into the socialist utopia, to advance history to its intended end, then you can see that the particular mode of analysis becomes fungible.
00:06:34.960 If it's economic analysis for Marx, then you get classical Marxism.
00:06:39.880 If it's race analysis for the critical race theorists, it's almost, you have to massage around the edges, but it's almost the exact same architecture.
00:06:46.480 Well, that's certainly what it seemed to me to be.
00:06:48.620 You know, one of the things that's been disturbing, I suppose, on the gaslighting front is that whenever I draw a relationship between postmodernism and neo-Marxism, people say two things: first, that I don't know what I'm talking about, which, by the way, is rarely the case.
00:07:02.940 And second, that, you know, that's a conspiratorial misreading of the relationship, that most postmodernism has nothing to do with Marxism.
00:07:11.920 And, you know, I've taken that criticism seriously because it happens a lot.
00:07:15.760 I think, well, you know, is there some manner in which I have this wrong?
00:07:19.400 And then I go back as much as I can to the source documents, including Foucault and Derrida, and I think, well, they said they were Marxists.
00:07:27.920 That seems like, you know, proof, and the entire intellectual milieu at that time in France was Marxist, including people who should have known better, like Jean-Paul Sartre.
00:07:38.780 So it's like that was the water in which those particular fish swam, and the postmodernists, when they themselves say that their, what would you say, that their intellectual effort is tending in the Marxist direction or is an extension of Marxism, I'm pretty much inclined to believe them.
00:07:59.760 And so I don't understand how this notion that those two concepts are separate has come about.
00:08:09.360 Do you have any idea about that?
00:08:11.380 I do.
00:08:12.000 I've thought tremendously on this question, and I believe I have an answer.
00:08:16.920 Kind of like yourself, if I open my mouth, usually I've thought about something before I spout off.
00:08:22.000 And in this case, it's the nature of the way these theories evolve.
00:08:26.660 They evolve through what technically is called dialectical critique.
00:08:30.880 And so each descendant theory, say if we use Marxism as the common ancestor, and that's what I did, by the way, in the EU, as I said.
00:08:38.740 Think of Marxism as a genus, and then you have all these species.
00:08:43.120 Well, postmodernism is a species, but they evolve through dialectical critique.
00:08:46.500 So for each new derivative that comes out, say postmodernism, they have to create themselves by giving a critique of the thing that they were before.
00:08:57.360 So they start by saying, here's where Marxism is wrong.
00:09:01.100 And academics hyper-focus on these distinctions, and they say, look.
00:09:04.860 I see.
00:09:05.320 So that's what you think it is.
00:09:06.360 They say, just like you say, well, they say that they're Marxists, that looks like proof.
00:09:10.380 They say, well, they said we're criticizing Marx, so that's proof that they're different.
00:09:14.740 And the neo-Marxists are no exception, and you'll find literal Marxists today.
00:09:18.020 So you think it's narcissism of small differences, to use the Freudian term.
00:09:21.840 I do.
00:09:22.000 Yeah, so there's a level of analysis at which these, I think your genus and species metaphor is a good one.
00:09:28.180 So there's a level of analysis at which these are all variations on a theme, and there's another level of analysis where they're, well, no, they're distinctly different, which is exactly what does happen in academic micro-arguments.
00:09:40.580 Right.
00:09:40.800 So you think part of that's, well, I think part of it's just the attempt to sow confusion as well.
00:09:45.020 Oh, probably, yes.
00:09:45.900 And then also ignorance on the part of the critics, because they just don't know enough about what they're talking about to even know that there's a relationship between post-modernism and neo-Marxism and Marxism.
00:09:55.820 I guess the other issue, too, is that, in principle, the post-modernists were skeptical of metanarratives, and it does seem not unreasonable to point out that Marxism is a grand metanarrative.
00:10:10.480 And so if you're skeptical about metanarratives, you know, you might start out by being skeptical about Marxism, and if you just focused on the post-modern critique of metanarrative, then you'd say, well, it couldn't be allied with Marxism because Marxism is a metanarrative.
00:10:24.140 But my response to that would be, what makes you think that incoherence ever bothered a post-modernist?
00:10:29.520 Right.
00:10:29.960 In fact, they specialize in incoherence, and I think because it can sow discord and chaos most effectively.
00:10:36.200 This is why this metaphor, the genus species, is so important.
00:10:39.700 And for me, this, you know, while Marxism is a grand metanarrative, et cetera, this is almost like saying, imagine that the animal clade that we're talking about has something to do with cats, right?
00:10:50.020 And so now we have cats, tigers.
00:10:52.860 Well, tigers have a tail, right?
00:10:54.760 And then lions, and lions have a tail, and house cats, house cats have a tail.
00:10:59.360 So cats have tails, right?
00:11:00.720 Well, not bobcats, not lynxes.
00:11:02.080 And so if we think of the tail as being a grand metanarrative, in fact, the broad historicism of classical Marxism, you find both neo-Marxism or critical Marxism and post-modernism are becoming skeptical of this kind of grand trajectory of history narrative that was kind of the early modern thought.
00:11:21.500 And as we shift from modern thinking to post-modern thinking, away from the scientific and into the kind of blatantly mystical and romantic, which the post-modernists are wholly characterized by, you can just imagine it.
00:11:34.620 It's a cat without a tail.
00:11:35.900 Yeah, yeah.
00:11:36.240 I see, if we do this, you know, I said Marxism is economic, and critical race theory is race, and we can say that queer theory is the concept of who defines what's normal.
00:11:44.440 Post-modernism is really a Marxist analysis of who gets to say what things mean.
00:11:48.800 Yeah.
00:11:49.040 Well, it seems to me the fundamental core around which these concepts circulate is, well, one core is resentment and bitterness.
00:11:58.280 There's an envy.
00:11:59.240 There's no doubt about that on the motivational front.
00:12:01.380 Yes.
00:12:01.760 But the other core, more ideological and intellectual, would be the notion that every social interaction is best viewed through the lens of oppressor and oppressed.
00:12:12.660 And so, then you can do that with economics, which is essentially what Marx did.
00:12:16.640 But once you've established that pattern, well, it's all about victimization and power.
00:12:21.420 So, I think it's actually the same claim that it's like a neo-Christian claim that emerged out of the Middle Ages because there was a doctrine in the Middle Ages among some strands of Christian thinkers that the secular world, the earthly world, let's say, was the domain of Satan himself.
00:12:40.220 It was ruled by the prince of power.
00:12:42.960 And I think that's exactly what the Marxists claim, except they're, you know, are they in favor of that or against it?
00:12:48.620 It's very difficult to say.
00:12:49.820 But their fundamental claim is something like all human relationships can be understood through the lens of power and oppression.
00:12:57.460 I mean, that's Foucault in a nutshell, right?
00:12:59.260 Yeah.
00:12:59.440 Because his whole theory is that everything is carceral power.
00:13:02.340 Every time I say that word, I have to stop and tell this.
00:13:04.660 It means prison.
00:13:05.560 Yeah.
00:13:05.700 I mean, incarcerated is the derivative for people.
00:13:09.300 But it's all about carceral power.
00:13:12.300 So, these sects that you're referring to in the Middle Ages of kind of bizarre Christianity were actually Gnostic heresies that were developing.
00:13:20.760 And I think that actually by means of Hegel coming down through Marx who inverted it, I believe we actually are looking at a Gnostic heresy that got hidden inside of economics and social.
00:13:33.820 In fact, if we read Hegel's Phenomenology of Spirit, published in 1807, you get distinctly the sense that what he means by spirit is what he says he means by spirit.
00:13:44.220 It's a spirit of society.
00:13:45.840 It's a social phenomenon.
00:13:47.620 It's kind of the seed of sociology in a sense.
00:13:53.240 And this social spiritual realm is, for Hegel quite literally, because he was a heretical theologian, is the working of the Holy Spirit in the world.
00:14:04.720 It's not this transcendent third person of the Godhead.
00:14:09.380 It is the functioning of human beings in the collective all and how that's moving through history.
00:14:19.000 And so, if we relocate as a modern transformation of kind of this heretical Christian Middle Ages, you know, almost New Age movement of the time, mystical movement of the time, we have a very clear shift from the transcendental to the social, to the social universe representing the spirit.
00:14:38.940 And so, then Marx, he actually figures out the code.
00:14:42.860 He says, no, Hegel's got it upside down.
00:14:44.240 We focus on the idea and the state will follow and the spirit will follow the state.
00:14:48.220 And he said, no, no, no.
00:14:49.200 And then the spirit will sublate and raise to – Aufheben in German – and raise to a higher level and we'll have a new idea and blah, blah, blah.
00:14:55.840 That's his trinity cycle, his dialectical cycle for Hegel.
00:14:59.740 Well, Marx says, no, it's upside down.
00:15:01.240 We start on the ground.
00:15:02.820 We do the work.
00:15:03.760 We do the praxis.
00:15:05.040 Do the work is the modern phrasing.
00:15:06.900 We do the praxis.
00:15:07.780 We do the activism.
00:15:09.040 And we change society directly.
00:15:12.240 And then that will cause, as society changes, what he called the inversion of praxis, the social conditioning to rain down on people and actually reify the transformation of society.
00:15:23.720 So this, I think, is where Marx had inverted Hegel.
00:15:27.700 And this is where we have a shift from the pre-modern transcendental spiritual to the modern social spiritual.
00:15:35.040 And this just becomes the playground of romantics and eventually the postmodernists who throw up their hands and say this whole thing is just this gigantic dynamic of power to where you and I converse.
00:15:46.600 I mean, at one point, I remember maybe 10 years ago, some feminists, it didn't go very far, but they posited this very postmodern argument that there was no possibility ever for a woman to consent to sex with a man because there's always a patriarchal power dynamic.
00:16:03.660 So there's always, no matter what, no matter how much, you know, she says she's interested or whatever, there is always, always her being coerced.
00:16:14.860 So this is, I think, you know, kind of this huge shift.
00:16:19.040 And you say, you know, academics get mired in these micro distinctions.
00:16:23.040 Yeah.
00:16:23.260 And that is partly their job.
00:16:24.860 So, okay, fine.
00:16:26.460 It's also how they carve out territory, though.
00:16:28.240 That is.
00:16:28.900 There's an incentive structure there, yes.
00:16:31.120 But this is, it's so important to realize that if we don't take a step back and understand this bigger picture, that this is a fundamentally theological architecture.
00:16:42.520 Going online without ExpressVPN is like not paying attention to the safety demonstration on a flight.
00:16:48.060 Most of the time, you'll probably be fine.
00:16:50.120 But what if one day that weird yellow mask drops down from overhead and you have no idea what to do?
00:16:55.820 In our hyper-connected world, your digital privacy isn't just a luxury.
00:16:59.420 It's a fundamental right.
00:17:00.780 Every time you connect to an unsecured network in a cafe, hotel, or airport, you're essentially broadcasting your personal information to anyone with the technical know-how to intercept it.
00:17:10.260 And let's be clear, it doesn't take a genius hacker to do this.
00:17:13.460 With some off-the-shelf hardware, even a tech-savvy teenager could potentially access your passwords, bank logins, and credit card details.
00:17:20.860 Now, you might think, what's the big deal?
00:17:22.960 Who'd want my data anyway?
00:17:24.500 Well, on the dark web, your personal information could fetch up to $1,000.
00:17:28.360 That's right, there's a whole underground economy built on stolen identities.
00:17:33.180 Enter ExpressVPN.
00:17:34.940 It's like a digital fortress, creating an encrypted tunnel between your device and the internet.
00:17:39.620 Their encryption is so robust that it would take a hacker with a supercomputer over a billion years to crack it.
00:17:45.220 But don't let its power fool you.
00:17:46.940 ExpressVPN is incredibly user-friendly.
00:17:49.400 With just one click, you're protected across all your devices.
00:17:52.140 Phones, laptops, tablets, you name it.
00:17:54.660 That's why I use ExpressVPN whenever I'm traveling or working from a coffee shop.
00:17:58.780 It gives me peace of mind knowing that my research, communications, and personal data are shielded from prying eyes.
00:18:04.760 Secure your online data today by visiting expressvpn.com slash jordan.
00:18:09.520 That's E-X-P-R-E-S-S-V-P-N dot com slash jordan, and you can get an extra three months free.
00:18:15.900 ExpressVPN dot com slash jordan.
00:18:17.760 Well, we could even go back farther than that, I think.
00:18:23.660 So tell me what you think about this, this set of ideas that I've been working on more recently.
00:18:28.720 So I mentioned earlier that you could think about Marxism in two ways.
00:18:36.180 And the new Marxist variants that we've been talking about, you can think about them as expositors of the doctrine of power.
00:18:45.540 That every social relationship between human beings can be understood as a function of domination and compulsion.
00:18:53.080 So that would be marriage, that would be friendship, family, economic relationships, history itself, every single social relationship.
00:19:00.740 Now, that has the advantage of extreme simplicity.
00:19:03.700 And so even if you're not very bright, you can understand it.
00:19:06.400 It's comprehensively explanatory.
00:19:08.880 So that's attractive on a psychological front.
00:19:10.740 And then there's a nefarious element.
00:19:12.800 And the nefarious element is that, well, if it's all about power, then not only am I thoroughly justified in my use of power, because after all, that's just what you're doing.
00:19:23.440 But it also justifies any accusation possible.
00:19:27.640 Because you might come up with a claim, for example, that you're a proponent of free speech.
00:19:32.840 And I can easily say, well, no, like as a white colonialist benefiting from the privilege of your position, you just use the argument that such a thing as free speech exists to justify your position in the power hierarchy.
00:19:46.680 Right.
00:19:46.800 And that's a universal criticism.
00:19:48.200 And so then I can take anything that you might claim as positive and just transform it with intellectual jujitsu into a manifestation of the power drive.
00:19:58.540 You know, and it's fair to say that the drive to power is a human motivation, but it's not the only human motivation.
00:20:05.800 It's not the only one.
00:20:06.680 That's right.
00:20:06.980 And it's also fair to say that when social relations are corrupted, they become corrupted in the direction of power.
00:20:17.200 But that's a completely 100% different proposition than the claim that all human social relationships are predicated on power.
00:20:26.100 I mean, you mean all, do you?
00:20:27.860 Yeah.
00:20:28.160 Okay.
00:20:28.700 So I know where, okay.
00:20:29.660 So there's the power claim, but then there's this undercurrent of bitterness and resentment.
00:20:36.320 And I see this system of ideas that's playing out as an extension, actually, of the battle between Cain and Abel.
00:20:44.400 Yes.
00:20:44.780 Because, okay, so why yes?
00:20:47.820 This bitterness, this envy, there's the, you know, the very devout, or devout maybe is not the right word, but the very obedient Abel.
00:20:56.920 And then he has these advantages.
00:20:59.640 Cain gets very upset.
00:21:00.700 Yes.
00:21:01.100 Well, that's—
00:21:01.640 And comes to murder him.
00:21:02.560 This is, I think, exactly the motivation.
00:21:04.340 And this is, how do they proceed?
00:21:06.480 You know, so if we take seriously the concept that this is a Gnostic heresy, and you look back, whether it's Jews pre-Christ—
00:21:12.920 How did you come to that conclusion that it was a Gnostic heresy?
00:21:15.660 Because I didn't know you had pursued this all the way back into the religious realm.
00:21:19.140 Well, I stumbled on a recommendation onto a philosopher named Eric Voegelin, who's got quite the reputation for having named Marx as a Gnostic.
00:21:28.300 And then I went down a rabbit hole reading about Gnosticism and reading about Hegel and his relationship to this kind of mystical reinterpretation of Christianity.
00:21:37.580 You know, Glenn Alexander, Mickey, Eric Foglin, these analysis—I read a little bit of Evola.
00:21:42.740 I'm not a big fan of Julius Evola's writing, but I read a little.
00:21:45.980 He makes these claims quite strongly as well.
00:21:48.800 And then I just started to read the Gnostic texts myself and their hermetic texts.
00:21:54.560 And so I stumbled—I mean, I've been claiming for a decade that this is a religious phenomenon, posing as sociology in some fashion, but this is what really finally allowed me to put it together.
00:22:09.720 And it doesn't matter.
00:22:11.240 Gnosticism is, by its nature, parasitical.
00:22:14.660 It's that we have discovered through whatever means divine, you know, revelation, whatever it happens to be, the secret salvific knowledge that they don't want you to know.
00:22:25.700 So there's some higher truth that's hidden, and maybe there's a code written in the Bible that if you have the secret means of divining it, then you can determine what the Bible really means.
00:22:35.560 And when you go and you talk to the priest and the priest says, you know, that's a heresy, they turn around and say, he just doesn't want you to know that.
00:22:41.600 Which, when you say it goes back to Cain and Abel, it goes back further than that.
00:22:46.280 It goes back to the serpent in Eve.
00:22:47.700 God hath not said, you know, that you've got this emblem of all authority that has declared this thing.
00:22:55.260 Then you have the subversive element that comes in and say, was that what he really said?
00:22:59.040 So it seems to me that this is a way of conceptualizing the relationship between the religious and the philosophical and the sociological.
00:23:07.560 So if you delve deeply enough into the battle between two idea sets and you keep going down, as you go down to more and more fundamental layers, you approach the religious because the religious is by definition the most fundamental.
00:23:25.240 And so I think when you're looking at something like the culture war that's going on, you can see it as a battle between ideas.
00:23:37.220 But then when you trace the ideas back, you see it as a battle between narratives.
00:23:41.060 And when you trace the narratives back, you see it as a battle between fundamental narratives.
00:23:44.840 And as you approach the most fundamental narrative, you are treading on religious grounds.
00:23:49.860 So what happens in the story of Cain and Abel, of course, is that Cain makes second-rate sacrifices.
00:23:55.840 Right.
00:23:56.100 And he knows they're second-rate.
00:23:57.620 And because they're second-rate and he's not all in, his sacrifices aren't accepted.
00:24:02.200 And that's just a phenomenological truth, which is life is so difficult that unless you make the proper sacrifices, you're not going to succeed.
00:24:10.040 And then he calls God out on that and says the cosmos is constituted in an ill-gotten manner because I'm not successful.
00:24:19.040 And God basically says, well, if you did things right, things would work out for you.
00:24:23.760 And instead of Cain accepting that as corrective information, his countenance falls, so goes the story.
00:24:30.680 And he flies into a murderous rage and destroys his own ideal because he really wants to be able.
00:24:36.400 And then the descendants of Cain become, he's murderous, Cain, obviously, because he kills Abel.
00:24:42.620 And then the descendants of Cain become genocidal.
00:24:46.180 Right.
00:24:47.000 And so the way the biblical narrative essentially opens, because Cain and Abel are the first two real human beings, right?
00:24:54.180 Because Adam and Eve are made by God.
00:24:55.900 So the biblical narrative portrays the battle between the spirit of Cain and Abel as the fundamental battle that rages in the human heart.
00:25:03.800 So it's the battle between the spirit of proper sacrifice, which is what Abel represents, the spirit of improper sacrifice, that's Cain, and the cascading consequences of improper sacrifice.
00:25:16.000 And then a metaphysical battle between those two spirits that characterizes, well, that's when history starts, right?
00:25:22.120 So that's the central battle in history.
00:25:24.400 And I think of Marxism as, I think, the French Revolution was a manifestation of the spirit of Cain, and that Marxism itself is a manifestation of the spirit of Cain.
00:25:34.960 And then the postmodern enterprise that's besetting us now, another manifestation of the spirit of Cain.
00:25:41.520 And it's the proclamation.
00:25:43.040 I don't exactly understand the relationship between that and the claim that it's power that's the ruler of all things.
00:25:51.720 But I do know that the spirit of Cain is indistinguishable, say, in the biblical corpus, from the spirit of Satan.
00:25:59.680 And Satan is the, the satanic ruler is definitely the ruler that uses power and compulsion and deception to control everything.
00:26:07.920 I mean, I think your connection that you're looking for boils down, I mean, Foucault gives us the hint, right?
00:26:13.880 Everything is a prison, this carceral power.
00:26:16.440 So what is power in this analysis?
00:26:18.900 It's the power to compel, to extort, to force behaviors, to paraphrase Larry Fink.
00:26:26.460 And so—
00:26:27.520 Right, no kidding.
00:26:28.640 Yeah.
00:26:29.200 To paraphrase Larry Fink and his bloody BlackRock.
00:26:31.600 Again, if we take this Gnostic concept seriously, the Gnostics believe that there is an all-good, transcendent God behind everything that's so good that he's completely pure spirit, completely uncorruptible.
00:26:44.020 And therefore, anything material must not be of that.
00:26:48.280 It must be, in fact, evil.
00:26:49.880 And so, where did it come from?
00:26:52.300 And they've got a mythology for it, but it doesn't matter.
00:26:54.420 This character called the Demiurge comes into being through a series of kind of cosmic accidents in the Pleroma, as they call this.
00:27:03.560 And Demiurge comes from the Greek Demiurgos.
00:27:07.240 Demiurgos means artisan or builder.
00:27:09.880 He's the architect of the world.
00:27:11.680 So he builds out the world, but in fact, he's a demon.
00:27:14.580 And so he builds out the world as a prison.
00:27:16.460 So God in Genesis, in Genesis 3, with the fruit, has imprisoned Adam and Eve in the garden.
00:27:23.500 And the snake is saying, did you know?
00:27:25.160 He just doesn't want you to know that you're like him.
00:27:28.600 This same system, this is what Cain's rejecting.
00:27:31.880 He's giving his second-rate sacrifices.
00:27:33.900 He's not doing what he should.
00:27:36.160 God's telling him if he does what he should, things will work out.
00:27:39.120 And he's like, no, this system's corrupt.
00:27:40.960 Right, exactly.
00:27:41.840 And so this is the same.
00:27:43.020 Yeah, he calls God out on his misbehavior, essentially.
00:27:45.520 That's right.
00:27:45.940 If you don't think that's the sin of pride, there's definitely something wrong with the
00:27:49.100 way that you're thinking.
00:27:50.060 Right.
00:27:50.540 And so I think that this is ultimately the Gnostic motivation.
00:27:55.720 It must have been—I wonder if it's as surprising for you as it is for me that this
00:28:00.400 is the rabbit hole that you've ended up going down.
00:28:03.200 You know, I had no idea when I started investigating these theories that, you know, that the root
00:28:07.880 consequence of that investigation would be to move down levels of analysis into the religious
00:28:13.480 domain.
00:28:14.000 So I did start to understand that as you move down levels of analysis, you inevitably end
00:28:19.500 up in the religious domain because the religious domain is the deepest level of analysis.
00:28:23.560 Right.
00:28:23.920 So—but I mean, has it surprised you that you're sitting here talking about Gnosticism, for
00:28:29.120 example, while trying to diagnose the ills of the modern world?
00:28:32.220 Yes, it has surprised me.
00:28:34.220 It's very curious as well because I was this character.
00:28:38.600 I had this—it cuts through every human heart, as Solzhenitsyn very eloquently put—I was
00:28:44.900 this character.
00:28:45.640 I was a very frustrated academic, and I think it's typical.
00:28:48.380 What is the most common—I mean, maybe there are others, but one of the more common psychiatric
00:28:52.280 disorders that academics complain about is imposter syndrome.
00:28:55.200 Right?
00:28:56.200 Oh, I've got this degree, but I'm actually stupid because the PhD earning process is quite
00:29:01.420 difficult.
00:29:02.420 And you're always surrounded by people who know far more than you do, who remind you
00:29:05.920 of it on a daily basis.
00:29:07.420 And so you end up with this massive amount of imposter syndrome.
00:29:10.540 I'm not really, you know, good at the thing—I'm not as good as they think I am, this kind
00:29:15.460 of delusional complex.
00:29:17.080 And rather than taking the certification and saying, you know, well, okay, I've earned this.
00:29:22.960 And so you have this baseline where you—like Marx, what do you do with your time?
00:29:28.560 You dig into some area.
00:29:30.340 You finally see the secret, you know, truth nobody else saw within that.
00:29:34.500 And I'm just talking as an impulse.
00:29:35.780 I'm not getting religious yet.
00:29:36.820 And you see this, and then you write, and you write, and write, and seven people read
00:29:40.320 it.
00:29:40.500 Nobody cares.
00:29:41.840 And you start to think to yourself, why am I not getting career advancement?
00:29:45.820 Why am I not getting the accolades?
00:29:47.460 Why isn't society—or if you're Marx, why isn't everybody else just paying my bills?
00:29:51.780 Don't they see how important my social theory that I'm writing is?
00:29:54.580 So you're doing something that's not particularly useful.
00:29:58.060 But it's Cain.
00:29:58.940 This is a second—
00:29:59.600 That's also Lucifer, by the way.
00:30:00.900 It is.
00:30:01.400 You're making a second-rate sacrifice and expecting to get first-rate results.
00:30:06.080 And that jealousy grows there.
00:30:10.000 Well, yeah.
00:30:10.540 Well, that intellectual pride is a big part of that, too.
00:30:13.000 You know, I saw—
00:30:13.500 I worked so hard.
00:30:14.060 Where's mine?
00:30:14.480 Well, it's worse than that, even.
00:30:16.660 It's not even that I worked so hard.
00:30:18.220 Like, I had clients, for example, now and then, who had a Luciferian problem.
00:30:23.040 And they were often very smart people who hadn't put in the work.
00:30:27.480 That's certainly the case.
00:30:28.520 But they were very annoyed because it was clear to them that they were smart as or smarter than everyone else.
00:30:34.540 Yes.
00:30:34.760 And yet the world hadn't unfolded at their feet.
00:30:37.440 Yes.
00:30:37.720 You know, and so they were very bitter and resentful about that.
00:30:40.560 And, like, it's definitely the case that in Milton, like, Lucifer, who's the bringer of light, is definitely an envious intellect, right?
00:30:48.240 Yes.
00:30:48.520 And he's the angel who, in God's heavenly hierarchy, rose the highest and fell the furthest.
00:30:55.180 And that's definitely something that can characterize intellect because the human intellect is a remarkable spirit, you might say, capable of the greatest good.
00:31:05.800 But it is also the thing that can fall the farthest.
00:31:08.240 And that wounded intellect is the most vicious of spirits.
00:31:12.980 And so that's sort of that combination of Cain and Lucifer.
00:31:16.540 And it's also, you know, it's also the case in the biblical corpus, if you take the stories apart, that the spirit that raises the Tower of Babel is the wounded spirit of Lucifer and Cain, right?
00:31:28.300 And that's erecting a technological alternative to God, but partly in an attempt to worship intellect instead of, well, instead of whatever God might be.
00:31:38.260 The highest, it's something like the highest spirit of genuine self-sacrifice.
00:31:42.780 Right.
00:31:42.980 Something like that.
00:31:43.880 Well, the fact is that the wounded intellect or the wounded narcissist doesn't humiliate itself in front of anything.
00:31:50.540 But the world or God demands that you constantly humble yourself in front of what's happening to you.
00:31:56.120 You know, right now, you get this weather thing, or it's not weather, these fires and the smoke.
00:32:01.040 Well, guess what?
00:32:01.840 Everybody in Washington, D.C., New York City has to deal with that.
00:32:04.780 Reality has imposed itself.
00:32:07.000 The smoky air is here.
00:32:08.560 You don't have a choice about whether it's here or not.
00:32:10.660 You have to humble yourself before what's happening around you and make adjustments accordingly.
00:32:16.200 You don't get to just assert, well, I think the world should be this way.
00:32:18.840 I think God should honor my sacrifices.
00:32:20.380 Yeah, right.
00:32:20.820 I think I should have the Tower, however it is.
00:32:22.860 No, if your sacrifices aren't being honored, the first question to ask is, what have you done wrong?
00:32:28.100 Yes.
00:32:28.400 Right, and that's more or less by definition, right?
00:32:30.720 Because this is also one of the things I see problematic in the so-called manosphere online.
00:32:37.140 It's because all the men who are unsuccessful are clattering on about what's wrong with women.
00:32:41.380 It's like, by definition, there's nothing wrong with women.
00:32:44.900 Yeah.
00:32:45.240 Right?
00:32:45.460 If you're not adapting yourself to women, it's not the women's problem.
00:32:49.360 Yeah, right.
00:32:49.840 It's your problem.
00:32:50.920 And that's by definition.
00:32:52.280 And it's the same thing in relationship to your relationship to the world, is that if your sacrifices aren't being rewarded, the right question to ask is, how am I prideful and blind?
00:33:02.740 Not, how is the world constituted in an ill-gotten manner?
00:33:06.060 Yes.
00:33:06.400 Okay, so now you said something interesting biographical.
00:33:09.560 You just touched on it.
00:33:10.920 Sure.
00:33:11.600 But you said that there was a time in your life where you were bitter, I think I've got that right,
00:33:17.480 and feeling that you were marginalized and that the world wasn't laying itself at your feet as a consequence of your sacrifices.
00:33:25.200 So tell me a little bit more about that, when that happened, and why do you think that the same thing isn't true now?
00:33:32.280 Or do you, you know what I mean?
00:33:34.180 Because if something like that has got you in its grip, you think you've escaped from it, but that doesn't necessarily mean that you've escaped from it.
00:33:40.380 No, right, right, right, right.
00:33:41.220 So let's go to, like, so how long ago was, tell me a little bit more about that.
00:33:46.240 How long ago was that?
00:33:47.300 And what were you doing?
00:33:49.140 What was your career at that point?
00:33:51.400 Starting a business can be tough, but thanks to Shopify, running your online storefront is easier than ever.
00:33:57.380 Shopify is the global commerce platform that helps you sell at every stage of your business.
00:34:01.620 From the launch your online shop stage, all the way to the did we just hit a million orders stage, Shopify is here to help you grow.
00:34:08.380 Our marketing team uses Shopify every day to sell our merchandise, and we love how easy it is to add more items, ship products, and track conversions.
00:34:16.000 With Shopify, customize your online store to your style with flexible templates and powerful tools, alongside an endless list of integrations and third-party apps like on-demand printing, accounting, and chatbots.
00:34:27.760 Shopify helps you turn browsers into buyers with the internet's best converting checkout, up to 36% better compared to other leading e-commerce platforms.
00:34:36.560 No matter how big you want to grow, Shopify gives you everything you need to take control and take your business to the next level.
00:34:42.360 Sign up for a $1 per month trial period at shopify.com slash jbp, all lowercase.
00:34:48.740 Go to shopify.com slash jbp now to grow your business, no matter what stage you're in.
00:34:54.200 That's shopify.com slash jbp.
00:34:56.680 So this is right after I left academia, which I didn't get chased out.
00:35:03.720 There's an urban legend online, of course, from the people who hate me that I couldn't get a job.
00:35:07.760 No, I took 100% of the job offers that I applied for.
00:35:12.100 In other words, zero.
00:35:13.040 Okay, and so what was your academic history?
00:35:16.900 Well, I got a bachelor's in physics, a master's in math, and I finished a PhD in math in 2010.
00:35:22.680 It's a PhD in math.
00:35:23.980 Yeah.
00:35:24.200 Okay, okay, so you were, those are very difficult disciplines, right?
00:35:28.020 So if you rank order disciplines by IQ, math is actually at the top.
00:35:31.620 It's hard.
00:35:31.960 I think it's math first and then physics.
00:35:33.940 Right, so you got a PhD in math.
00:35:35.520 Yes.
00:35:35.840 Okay, so you can clearly think on the quantitative side of things.
00:35:39.160 Yes.
00:35:39.520 So, okay, okay.
00:35:40.420 And so then you decided not to apply for academic jobs.
00:35:43.800 Yes.
00:35:44.200 Okay, and so why did you decide not to apply for academic jobs?
00:35:47.200 So at the time, the last, so this is 2007 or 2008, I don't remember which year was first.
00:35:52.140 The last couple of years that I was doing my PhD, we had all these, you know, teaching meetings, or I don't want to say faculty meetings because I was a grad student, but equivalent to a faculty meeting.
00:36:01.640 And the new rule of the university was fail the smallest number of kids possible, one per course, no more.
00:36:08.820 And I'm thinking, I teach math, what are we doing?
00:36:10.340 Wow, this was what university?
00:36:11.240 University of Tennessee.
00:36:12.780 Wow.
00:36:13.120 That was actually a rule that was established.
00:36:16.500 Mostly informally.
00:36:18.040 The convention.
00:36:18.520 Yes.
00:36:19.100 It was, we need to focus on student retention.
00:36:21.740 Don't alienate students with bad grades.
00:36:23.380 Oh yeah, that's that consumerist approach to it.
00:36:25.440 Correct, yeah.
00:36:26.220 The unholy alliance of bloody managerial speak and unholy woke-ism.
00:36:31.340 Exactly.
00:36:31.640 God, then you have the university.
00:36:33.540 That's right.
00:36:33.880 The bastard child of the two worst monsters you could possibly put together.
00:36:37.500 So I didn't want to participate in that.
00:36:39.660 So I chose not to apply for academic jobs at all.
00:36:42.860 I didn't want to participate.
00:36:43.960 If I can't teach where the students who succeed get treated for success and the students who fail, fail and have to try again or go somewhere else or do something different, I didn't want to participate in that.
00:36:56.600 Even, you know, having come from physics, I cared very much.
00:37:00.180 If I certify somebody as being competent in calculus and they go on in an engineering program and they're not competent in calculus, I'm doing actually a grave evil.
00:37:09.680 And if the university is telling me I have to do this or show up and talk to the dean and explain why I failed a third kid in the class, I don't want to participate in this system anymore.
00:37:20.020 So I just left.
00:37:20.720 Okay, and so how have you kept body—we'll go.
00:37:24.460 How have you kept body and soul together since then?
00:37:27.580 And because your career is kind of mysterious.
00:37:29.880 And then also, let's go back to this issue of bitterness.
00:37:33.160 Yes, yes, yes.
00:37:33.700 So after leaving the university, I just got a job.
00:37:38.140 Actually, I didn't.
00:37:39.240 I became a massage therapist, which is—the internet thinks this is just hilarious.
00:37:44.300 It's not a mysterious story how or why.
00:37:46.480 I have two components to this story.
00:37:48.600 My wife is one, so I had an in to the profession, you know, and a direct line where I understood what it is and what it does.
00:37:55.080 And secondly, I had injured myself doing jujitsu early in my 20s, and it turned out that massage therapy was what actually fixed it where all these doctors I had seen weren't able to figure out how to sort out that what I had actually done is messed up my psoas muscle.
00:38:10.140 And so the massage therapist was able to massage the problem out of my psoas and the subsequent problems out of my lower back muscles and glutes and thighs, and it fixed it.
00:38:19.580 And I no longer had, you know, these terrible episodes of lower back pain, so I said, I want to do this for people, so let's go.
00:38:26.540 And so I started studying simultaneously medical textbooks on muscle pain and going to massage school, which is a little less rigorous, and did this for a number of years.
00:38:37.120 Well, being academic, I became a little academically bored.
00:38:41.380 This is a fine career, and I enjoyed what I did and was very helpful and rewarding, but I needed some academic stimulation.
00:38:47.480 So I started to read philosophy of science.
00:38:49.520 I started to get in discussion forums online.
00:38:51.180 This is where I discovered kind of the feminism, you know, explosion that was kind of happening in blog spaces all over the internet, and everybody's getting accused of this.
00:39:00.360 And I got involved in, a little bit regretfully now, with the new atheism movement and got caught up in all of that for a few years.
00:39:07.720 But the resentment came.
00:39:08.620 I think most of the new atheists actually regret becoming involved in that.
00:39:12.360 Yeah, I think so.
00:39:12.820 I even see echoes of that in Dawkins now because he's seen what sort of children he produced.
00:39:16.920 Right, yeah, very petulant, very, very, very not healthy.
00:39:21.140 Yeah, well, one of the things I learned from you was Catholic was about as sane as you get, right?
00:39:27.020 Yeah.
00:39:27.280 So you destroy that.
00:39:28.440 You think, well, you think the Catholics are insane.
00:39:30.460 You wait till you see what that's protecting you from.
00:39:33.040 Right.
00:39:34.260 Polytheistic paganism.
00:39:35.860 Totally insane.
00:39:36.560 Right.
00:39:37.100 Oh, boy.
00:39:37.960 With some child sacrifice thrown in and some nature worship.
00:39:40.940 Right.
00:39:41.680 But, of course, in the woke era we live in now, we don't have as much, you know, these pagan gods as they were construed in the pre-modern.
00:39:50.380 We have power dynamics, racism, systemic racism.
00:39:54.460 Well, the Irish are going to sacrifice 200,000 cows to Gaia.
00:39:58.320 Yeah, that's right.
00:39:59.140 So, you know, that's nature worship at its finest.
00:40:00.800 That is, exactly.
00:40:00.980 To change the weather.
00:40:02.080 To change the weather.
00:40:02.460 I think it's really pretty damn funny.
00:40:04.200 It is.
00:40:04.540 It's so appallingly comical that it is definitely a form of cosmic joke.
00:40:11.860 Well, we do need a few of those sometimes.
00:40:13.780 Yeah, yeah, yeah.
00:40:14.640 Okay, so you started working as a massage therapist.
00:40:17.080 This was around 2007, 2008.
00:40:19.480 What the hell did your wife think you were doing?
00:40:21.660 Oh, she loved the idea that we were working together.
00:40:23.840 She did?
00:40:24.440 Yeah.
00:40:25.000 Yeah.
00:40:25.640 Boy, that's quite the detour out of the math realm.
00:40:27.660 It was.
00:40:28.380 But, you know, she's not of that world, and so she didn't really care one way or the other if I stayed in it or not.
00:40:33.200 And so she thought it was, you know, noble that I was picking up something different and working with my hands and working with people.
00:40:40.520 Yeah.
00:40:41.000 And trying to help people with chronic pain that, you know, for whatever sort of reasons.
00:40:45.680 How long did you practice as a massage therapist?
00:40:47.200 I was licensed for 10 years, so I probably practiced regularly for about eight.
00:40:52.440 Were you good at it?
00:40:52.960 Were you able to attempt?
00:40:53.640 Quite good, yeah.
00:40:54.300 You can really get good at touch healing.
00:40:57.140 Yeah.
00:40:57.420 It's so interesting.
00:40:58.840 My wife's a massage therapist.
00:41:00.140 Yeah.
00:41:00.280 And it's so interesting to see how much wisdom you have in your fingers.
00:41:04.200 You know, if you place your hands on people, you can feel, weirdly enough, where there are whatever a knot is.
00:41:12.180 You know, it must be a place where circulation isn't optimal or where the muscles have tightened up for some reason.
00:41:17.640 But it's so interesting that you can feel that out and that you can fix it, too.
00:41:21.600 Yeah.
00:41:21.960 I was actually quite good at it.
00:41:23.660 And it was pretty nice.
00:41:25.160 But, again, academic boredom.
00:41:27.740 And so, I think maybe just in general that trying to get engaged in these academic activities, starting to try to write books and trying to get, you know, paid attention to—it wasn't coming.
00:41:39.980 The world's not laying itself out at my feet.
00:41:42.120 My sacrifice isn't being honored, if you will.
00:41:45.040 But I got really, like, I remember telling my wife a couple of times, like, I work so hard.
00:41:49.400 I know I'm smart.
00:41:50.520 Yeah.
00:41:50.760 Why don't people recognize it?
00:41:53.020 And it's so tempting to say, well, it must be something wrong with them.
00:41:55.620 Do you think that was one of the things that tempted you into the New Atheist movement?
00:41:59.500 I think that was generally frustration with growing up largely areligious in the broadly Southern Baptist South.
00:42:08.680 Okay.
00:42:09.000 I felt the New Atheism movement was like a breath of fresh air for me because it was the first place I felt that I found in the world where I could say what I thought without, you know, whatever long set of problems it would create for me.
00:42:21.980 You know, everybody when I grew up in the South.
00:42:24.220 Well, the thing about people like Dawkins, too, and Harris, for that matter, is they are characterized by a certain clarity of thinking.
00:42:31.000 Right.
00:42:31.320 Right, and, you know, I have some, what would you say, sympathy, certainly for both Dawkins and Harris.
00:42:37.820 I think they're both good people in some fundamental sense.
00:42:41.220 I mean, I know Harris was trying to ground a transcendental ethic in what he regarded as unshakable truth, and he thought the unshakable truth was essentially objective reality.
00:42:50.480 And I also have some sympathy for that perspective because it's the postmodern critique of that, in part, that's led us to where we are.
00:42:57.000 Right.
00:42:57.200 But the problem is, is that, well, we can't go into that.
00:43:01.920 There's lots of problems, and you've started to stumble into that, obviously, looking at the strange tangle of religious ideas that have produced the conundrum that we see before us.
00:43:11.580 Okay, so you were keeping body and soul together, you and your wife, by working on the massage front.
00:43:15.620 Yeah.
00:43:15.880 You said that that didn't provide enough stimulation for your ravenous intellect, and you weren't getting a lot of purchase on that front, and that was making you bitter.
00:43:26.880 And so, how did you, when did you realize that?
00:43:31.800 What did you do about it?
00:43:32.800 What makes you think you have done something about it?
00:43:35.160 Well, I didn't realize it until after the fact, so that helps me think that maybe I have done something about it, because it's one of these things I look back at, and I say, wow, I really was a different person.
00:43:44.580 And now—
00:43:45.820 What made you learn?
00:43:48.920 Getting beat up by life, not physically, losing over and over and over again, and finally accepting that maybe I have to do something different.
00:43:59.100 Well, that's interesting, because another thing that can happen if you get beat up over and over again and lose is that you can get more and more bitter.
00:44:07.100 Well, this is—
00:44:07.720 You don't have to learn.
00:44:08.860 I had the same lesson, though.
00:44:10.240 I used to fight.
00:44:11.320 I used to fight, you know, sport fight, for real.
00:44:15.280 Fourth-degree black belt, all this stuff back in when I was young.
00:44:18.500 I mean, young adult, not a child, not one of these children black belts, you know, in my early 20s and my mid-20s.
00:44:24.860 And I fought a lot, and frankly, sometimes I won, but a lot of times, especially on the way up, I got my ass kicked.
00:44:32.760 Right.
00:44:33.040 And you start learning real fast that you can get as mad as you want at somebody who can beat you up.
00:44:38.720 And it doesn't let you—
00:44:39.780 And they'll still beat you up.
00:44:39.980 They'll still beat you up.
00:44:41.180 They'll beat you up worse.
00:44:41.980 Worse.
00:44:42.880 Yeah, yeah.
00:44:43.180 That's right.
00:44:43.700 It'll beat you up much worse.
00:44:45.360 Right, and that—and there's nothing to argue about on that.
00:44:48.020 There's nothing.
00:44:48.760 There's no debate.
00:44:49.480 No, no, no.
00:44:49.500 Sit down on the floor with your nose on the ground.
00:44:51.240 That's right.
00:44:51.740 Right.
00:44:52.020 That's right.
00:44:52.400 You can't abstract yourself out of that.
00:44:54.200 But I think the kind of magic secret sauce, besides getting beat up with all the repeated, you know, attempts and failure, attempt and failure, was that I actually just got really busy.
00:45:05.460 I didn't have time to ruminate on this because I picked up the grievance studies affair.
00:45:10.720 Peter and Helen and I started to—
00:45:12.260 Yeah.
00:45:12.800 That was all consuming.
00:45:14.560 I mean, every waking moment that I had for—
00:45:17.360 Do you want to walk people through that a little bit?
00:45:19.060 Sure.
00:45:19.720 So Peter and I had this wild-brained idea to write as many fake academic articles for feminist theory, gender studies, you know, all of these kind of woke postmodern journals.
00:45:33.020 Put them in the highest ones we could get and see how many we could get published.
00:45:38.420 And we had planned roughly two years.
00:45:41.220 Our goal was that we were going to start in August of 2017 writing these things as fast as we could write them, and we were going to write until sometime around a year later, some summer 2018.
00:45:52.200 We would stop writing and finish the academic process on wherever we got, and we'd see what happened.
00:45:57.020 So when we wrote the 20th paper, which we finished in June of 2018, we called it.
00:46:01.560 No more new ones.
00:46:03.120 Let's just finish these, do what we have to do to get them, you know, into the world and see where it goes.
00:46:09.580 But then the article got published on us in October, so we didn't see all the way to the end.
00:46:15.800 So we wrote 20.
00:46:17.440 Six of them were a learning process.
00:46:19.400 They're utter failures.
00:46:20.460 I think they prove lots of interesting things.
00:46:22.300 It's maybe a discussion for another day.
00:46:25.340 But then there were these 14 others.
00:46:28.800 Seven of those had been accepted.
00:46:31.060 Seven more were under peer review.
00:46:33.000 A sociologist wrote an article.
00:46:34.540 I wish I could remember who it was, but it came out soon after this happened, and he said that, based on what the peer reviews had said and the quality of the papers, he anticipated that four or five of those seven probably would have been accepted as well.
00:46:46.700 But the ones that wouldn't were also the earliest among that batch of 14.
00:46:51.460 So you got good at writing fake articles.
00:46:53.160 We cracked the code.
00:46:54.740 At one point—
00:46:56.260 I wonder if ChatGPT could do that.
00:46:58.300 Yes.
00:46:58.800 No question.
00:46:59.680 Absolutely no question.
00:47:00.660 I'm 100% confident that it could write papers that—
00:47:03.520 And read them.
00:47:04.200 We don't need the academics at all.
00:47:05.840 The ChatGPT can write the papers and read them.
00:47:08.300 That's right.
00:47:08.680 And grade them, for that matter.
00:47:10.300 Yeah.
00:47:10.580 Yeah, yeah.
00:47:11.280 Yeah.
00:47:11.620 So this project just consumed my life.
00:47:19.140 You know, an academic article, a new full 8,000-word, fully cited academic article.
00:47:25.340 We covered 15 subdomains of academic pursuit across the 20 papers.
00:47:31.200 So we're all over the place having to read as fast as we can read and write as fast as we can write and cobble these things together.
00:47:36.840 It was all we did.
00:47:38.380 And so I couldn't think.
00:47:39.320 Plus, I knew I was doing something productive.
00:47:41.360 And then, of course—
00:47:42.320 Why did you think it was productive?
00:47:43.600 We thought that, you know, once we started to get success, it was very clear that we had figured something out that was proved against the real world.
00:47:55.740 I mean, academic peer review isn't exactly the real world qua the real world.
00:48:00.800 But it is the system, the actual system, that certifies knowledge, or does whatever we pretend this corrupt system does.
00:48:12.340 And we were in.
00:48:14.300 We cracked the code.
00:48:15.980 We said at one point—I said at one point, I am convinced now that I could have a 100% success rate.
00:48:22.000 Every paper I write, I can get in.
00:48:23.360 I can pick a topic, write whatever I want, and I can get it in.
00:48:27.180 Anything.
00:48:28.220 Gentrification of cornbread was the next one I had planned.
00:48:31.340 Just something ridiculous.
00:48:32.880 Right.
00:48:33.140 And this was a humanities focus.
00:48:34.960 Humanities focus, that's right.
00:48:35.100 When you were doing your PhD work and the previous work on that in the STEM fields,
00:48:40.200 had you had any philosophical interest or interest in the humanities at that point?
00:48:45.440 Or did this all develop after you stopped pursuing them?
00:48:49.360 I mean, very little.
00:48:50.240 I took a philosophy class as an elective as an undergraduate and had a tremendously good time with it.
00:48:54.960 But that was it.
00:48:56.340 Oh, right.
00:48:56.840 So, this really wasn't so—that's so interesting.
00:48:59.320 You hopped out of that mathematical world into massage therapy and into the humanities.
00:49:05.860 Into the humanities, yeah.
00:49:06.840 Right, and you were really coming at it from the perspective of a STEM mind.
00:49:12.080 You had a STEM mind.
00:49:13.180 Right.
00:49:13.860 That's interesting.
00:49:14.860 You know, a lot of the greatest psychologists in the early part of the 20th century were engineers, hey?
00:49:19.500 They established all the statistical techniques that all the social scientists use.
00:49:24.440 I'm an interesting mathematician, though.
00:49:26.540 I'm a different kind.
00:49:28.320 There are 13 different branches of mathematics.
00:49:30.820 I'm what's called an enumerative combinatoricist—that's a lot of syllables.
00:49:36.360 Enumerative combinatorics gives these very kind of Baroque counting arguments.
00:49:40.520 A combinatoricist will be upset that I called it Baroque.
00:49:43.640 All the rest of the mathematicians will cheer that I said this, that I've confessed this.
00:49:48.220 But we give these things that are called counting arguments so that we say that an equation is true, an identity is true, because both sides of it count the same thing in two different ways.
00:49:57.920 And so you describe—it counts on this side of the equation, it counts it this way.
00:50:02.240 On this side of the equation, it counts it that way.
00:50:03.800 A simple example, without doing a bunch of math, if you don't know: the square numbers—1, 4, 9, 16, 25, and so on—obviously count the number of squares in a square grid, n by n.
00:50:18.720 Well, it turns out that the square numbers equal the sum of the first n odd numbers.
00:50:23.100 So 1, then 1 plus 3 is 4, 1 plus 3 plus 5 is 9, 1 plus 3 plus 5 plus 7 is 16, and add 9, get 25, and you can see how it goes.
00:50:30.400 But what that is, is you count the corner, that's 1, then you count the 3 that go around it, that's 3 more.
00:50:35.280 Then you count the next layer: the square is now 2 by 2, so there's 2, then there's 2, and there's 1, and that's 5, the next odd number.
00:50:40.800 And then it's 3, 3, and 1, so two 3s and a 1: that's 7.
00:50:43.620 The next odd number, that's it.
00:50:45.640 That's a combinatorial argument.
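[Aside: the identity being described here, written out; nothing in this note goes beyond what the conversation itself states.

    \( n^{2} \;=\; \sum_{k=1}^{n} (2k - 1) \;=\; 1 + 3 + 5 + \dots + (2n - 1) \)

Counting the n-by-n grid layer by layer out from the corner, the k-th L-shaped layer contains (k - 1) + (k - 1) + 1 = 2k - 1 cells, so both sides count the same n² cells in two different ways.]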
00:50:46.760 Is that akin to two different measurement techniques, would you say?
00:50:48.900 Well, yeah, in a sense, in a sense.
00:50:50.800 Because that's one of the ways we triangulate on truth, right, is we use multiple measurement systems to abstract out the same pattern.
00:50:57.760 They call that construct validation in psychology.
00:51:01.840 Well, psychologists have tried to wrestle with the idea of how you know when a concept is real.
00:51:07.660 And the reason they wrestled with that was because of the issue of diagnosis.
00:51:12.600 Like, for example, is anxiety and depression, are they the same thing, or are they different?
00:51:16.940 Well, they overlap to some degree.
00:51:18.760 And then when you're starting to ask about whether two things are the same or different, you're asking about the nature of reality, and you're also asking about the nature of measurement.
00:51:28.540 And what psychologists concluded, essentially, was that to establish something as real—imagine there's a pattern there—you needed a set of qualitatively different measurement techniques, all of which converged on an emphasis of the same pattern.
00:51:41.480 It's what your senses do, right?
00:51:43.260 Because your senses provide five qualitatively distinctive reports, and if they converge, you presume—it's like a definition of real—five converging reports using qualitatively distinct measurement processes constitutes reality.
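[Aside: a minimal sketch of the convergence idea in Python; the sample size, noise level, and variable names are illustrative assumptions, not anything from the construct-validation literature being referenced.

    import numpy as np

    rng = np.random.default_rng(0)
    latent = rng.normal(size=10_000)              # the underlying pattern ("the construct")

    # Two independent, noisy attempts to measure the same latent pattern.
    # (Here only the noise differs; real construct validation wants qualitatively
    # different measurement methods, which is the stronger test described above.)
    measure_a = latent + rng.normal(scale=0.7, size=latent.size)
    measure_b = latent + rng.normal(scale=0.7, size=latent.size)

    # If independent measurements converge like this, that convergence is the
    # evidence that the construct is tracking something real.
    print(np.corrcoef(measure_a, measure_b)[0, 1])  # roughly 0.67 with these settings

]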
00:51:56.780 Yeah, sure.
00:51:57.500 Right, right, right.
00:51:58.080 So it seems to me there's an analog there of the equation issue, that it's true if one counting method produces this result, and another counting method produces this result, that constitutes an equation.
00:52:07.840 And those two things are, what would you say, is equal the same as real there?
00:52:14.300 I don't know if that's a fair thing to say.
00:52:16.580 So, okay, well, anyways, you're—
00:52:17.940 Equivalent is the word, but—
00:52:18.760 Equivalent, right, equivalent.
00:52:19.900 Equal is an equivalence relation.
00:52:21.940 Right, right, right.
00:52:23.180 So they're the same.
00:52:24.180 Yeah, yeah.
00:52:25.140 So as it turns out, this gave me a tremendous amount of background.
00:52:28.760 I think, how does math help what I do?
00:52:30.760 A lot of background in detecting a pattern and being able to articulate it in a less abstract way as to what it is.
00:52:37.860 So I go find—I would find patterns and say, how can I describe this pattern, not just in one way, but in two ways?
00:52:43.720 Right, right.
00:52:44.240 At the same time.
00:52:45.360 And you were doing that in the papers?
00:52:46.600 So I read the papers, and I detect a pattern of how they use language and how they cite and who they think are important.
00:52:51.500 And then I just go reproduce this in another way, and of course—
00:52:54.400 You're building a little chat GPT in your imagination.
00:52:56.540 More or less, yes.
00:52:57.800 Yes, definitely.
00:52:58.220 Absolutely, and it was extraordinarily successful.
00:53:01.280 Another, by the way, thing that math helps with is mathematicians are pretty particular about definitions.
00:53:06.780 Yes.
00:53:07.400 The most.
00:53:08.220 Right, of course.
00:53:08.380 Because whatever we say is a definition, all of the logical conclusions of that definition in the axiomatic system are necessarily true by consequence forever universally.
00:53:18.360 Right, right.
00:53:18.900 So if you get it wrong a little bit, if you say, for example, a prime number is a number that's divisible by one and itself.
00:53:25.220 That's a very common, you know, elementary school definition.
00:53:28.740 That's not adequate because it leaves open the question, is one prime?
00:53:33.280 And the answer to that is no, one is not a prime number.
00:53:36.220 So the actual definition of prime, when you get very cautious, is it is a number with exactly two factors.
00:53:40.980 Which sounds like the same thing, but it's not the same thing, because it removes that one question of ambiguity, upon which all expressions of things like the fundamental theorem of arithmetic depend.
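[Aside: a minimal sketch of the distinction in Python; the function name and the printed range are illustrative only, but the "exactly two factors" definition is the one stated above.

    def is_prime(n: int) -> bool:
        # "Exactly two factors": count every divisor of n and require the count to be 2.
        divisors = [d for d in range(1, n + 1) if n % d == 0]
        return len(divisors) == 2

    # 1 has exactly one divisor (itself), so this definition excludes it automatically,
    # which the looser "divisible by 1 and itself" phrasing does not.
    print([n for n in range(1, 20) if is_prime(n)])  # [2, 3, 5, 7, 11, 13, 17, 19]

]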
00:53:50.360 Right, so get your axioms right.
00:53:51.180 Well, so, okay.
00:53:51.800 So partly what you're doing as you're diving into the underlying religious substrate is to go farther and farther down into the axioms.
00:53:58.380 Well, yes, but it also, I have the ability to read them, and when they misuse words, I can figure out what they must mean by the word they're using.
00:54:09.140 And then I can go start to check that to see.
00:54:11.500 Right, right, right.
00:54:12.640 Like equity, for example.
00:54:14.300 Like equity, or diversity, or democracy, or actually literally almost every word, or power, yeah.
00:54:20.420 Oppression, yeah, yeah.
00:54:21.420 It's really useful to figure out what those words mean.
00:54:23.820 So that STEM mind ends up having been, it was trained in kind of these two particular skills, and I think I had a proclivity.
00:54:33.480 There's definitely, why does anybody go into a kind of fringe branch of mathematics where it's hard to get a job if you apply, because there's just not that many of them?
00:54:43.080 Why would you do that?
00:54:43.860 It's because you have a proclivity for it.
00:54:45.220 There's a selection bias into that.
00:54:47.020 You're interested in it.
00:54:48.000 You're good at it.
00:54:48.780 You're talented in it.
00:54:50.360 I thought it was the most fun thing in mathematics.
00:54:53.020 I could have been an algebraist.
00:54:54.360 I was good at algebra.
00:54:55.280 Algebra is very employable.
00:54:56.680 It's very necessary.
00:54:57.700 It's very useful.
00:54:59.680 I didn't want to be an algebraist.
00:55:01.100 I wanted to be a combinatoricist, which, why?
00:55:03.960 Because I really enjoyed it.
00:55:05.540 I really enjoyed getting to think that way, challenging myself to think that way about patterns.
00:55:09.980 And you were able to, you think you were able to take that proclivity and then apply it to what you were doing in the humanities.
00:55:15.720 And oddly enough, what you ended up doing in the humanities was producing parodies of humanities papers.
00:55:23.780 And so you found that intellectually compelling.
00:55:28.240 What was your motivation outside of the intellectual compulsion?
00:55:33.540 Now, you talked a little bit about the fact that you were annoyed about the fact that when you were teaching at the university, you were called on essentially to falsify the teaching process.
00:55:42.220 So that must have been lurking around in there in the background somewhere.
00:55:44.960 But what were you and Pluckrose and Boghossian conspiring about, so to speak, when you were producing these false papers?
00:55:50.960 Like, why the hell were you doing it?
00:55:52.160 Because people have asked that.
00:55:53.040 Are you just causing trouble?
00:55:54.700 No, it's a very simple answer.
00:55:56.160 We had seen some of these things because we were involved in the new atheism movement.
00:55:59.640 And it got attacked by this, you know, woke virus very early on before anybody knew what to call it.
00:56:04.400 We were all saying third wave radical feminism back then.
00:56:07.360 That was the phrase, and I think you were the only person saying something like postmodern neo-Marxism or something like this.
00:56:13.540 And so we were looking into this, and we would criticize.
00:56:18.220 You know, this deviates from, you know, standards of, you know, free liberal society.
00:56:22.700 This is oppressive.
00:56:23.520 This is against free speech.
00:56:24.720 We would offer these criticisms, and you know what we would get back?
00:56:27.780 If we didn't get called white or male or something stupid, the most substantive criticism we would get is that you're not credentialed.
00:56:34.320 You don't have a PhD in this.
00:56:36.100 You can't criticize it.
00:56:37.440 So we thought, well, you can delegitimize a fraudulent enterprise.
00:56:42.860 We started to read the papers and thought that they were fraudulent, and it was an emergency because they were dipping into the sciences.
00:56:47.240 Yeah, well, that made you weird to begin with, that you were reading the papers, because I think 80 percent, is it 80 percent, of humanities papers are never cited once.
00:56:56.600 I asked a question at one point.
00:56:58.540 You said this is sexist.
00:56:59.960 What's sexist?
00:57:00.500 And she said, well, sexism is systemic.
00:57:01.880 This feminist woman was talking to me.
00:57:03.480 And I said, what does that mean?
00:57:05.420 And she sent me a feminist theory paper about sexism.
00:57:09.480 And I read it, and I came back to her, and I said, okay, I kind of get this concept, but why don't you say this is systemic sexism and distinguish from what most people think of as sexism?
00:57:21.000 She said, no, it is sexism.
00:57:22.140 It's the same thing.
00:57:23.020 But they're clearly not the same thing.
00:57:25.020 So this made me curious.
00:57:26.220 What's going on?
00:57:27.280 And then I started to read some of their papers here and there.
00:57:29.640 I wasn't that invested in it yet.
00:57:31.280 This is 2014 and 15.
00:57:32.400 You know, it's interesting too. I've noticed this tendency among creative liberal types, right, that they're very, very good at producing ideas.
00:57:45.360 Yes.
00:57:46.120 But they're not very good at editing them.
00:57:47.920 Yes.
00:57:48.320 And those are actually separate neurological functions, by the way.
00:57:52.220 Yes.
00:57:52.540 So the two different brain areas do that.
00:57:54.860 And so one's a producer and the other is an inhibitor, an editor.
00:57:58.300 And so if you're in dialogue with someone, true dialogue, you produce your ideas, but the other person can act as a critic.
00:58:07.380 That's what peer review is supposed to do.
00:58:08.780 But if no one's reading your papers, there's no editing function.
00:58:12.300 And so that creativity can just go everywhere.
00:58:15.060 Everywhere, yeah.
00:58:15.500 It can produce false positives, which is what unconstrained creativity does.
00:58:18.960 Sure.
00:58:19.400 It's like, well, that's a new idea.
00:58:21.100 It's like, yeah, but it's stupid.
00:58:22.540 Why is it stupid?
00:58:23.540 Because if you act that out in the world, you'll die.
00:58:25.740 Yes.
00:58:26.040 That's like the definition of stupid.
00:58:27.480 It is.
00:58:28.000 Right, right.
00:58:28.620 And you're supposed to kill your stupid ideas before you act them out.
00:58:31.840 And you do that with critical thinking.
00:58:34.040 And if these papers aren't being read, or they're not being criticized, that also means that the people who are producing the ideas don't get to hone their ideas.
00:58:41.240 Right.
00:58:41.400 Because they don't get to learn how to distinguish between the smart ideas and the ideas that aren't so good.
00:58:46.100 Right, yeah.
00:58:46.740 Right.
00:58:46.900 There's no iron to sharpen the iron whatsoever.
00:58:48.980 Right, right, right, right, right.
00:58:50.460 There's nothing to separate the wheat from the chaff.
00:58:52.500 Right, right.
00:58:53.060 Exactly.
00:58:53.400 And there's no spirit of truly logical inquiry.
00:58:57.320 Logos-based inquiry.
00:58:58.300 Yeah, exactly.
00:58:59.060 Right, okay, okay.
00:58:59.920 So you are bringing that to bear.
00:59:01.240 It makes sense, too, because of the discipline that you came from.
00:59:04.120 You're bringing that sharp eye to bear on this collection of, what would you call it, mixed creative overproduction, hyperproduction.
00:59:15.720 Exactly.
00:59:16.180 So we thought we could expose that problem and simultaneously solve the problem that our most annoying critics were telling us.
00:59:23.060 You have no authority to speak on this because you clearly don't have a degree in it.
00:59:27.360 Well, we thought, well, I'm not going to go to school and go get a degree.
00:59:30.380 Why go back to school for the more—I've been in school forever.
00:59:33.220 I'm not doing more school.
00:59:34.280 But if we're publishing at the PhD or the research level, surely we know something about it.
00:59:39.240 Right, right, right, right.
00:59:40.260 Well, it turns out that they did not accept this as a credential.
00:59:43.160 That did not satisfy them.
00:59:45.860 I'm still someone who lacks a PhD.
00:59:46.700 Right, well, we can point out for everyone who's watching and listening, too, that you could—this is a rule of thumb, and it's a rough one.
00:59:53.380 But it's not so bad if you're trying to understand this: what's a PhD equivalent to?
00:59:58.240 And essentially, a PhD is equivalent to three published papers.
01:00:02.860 So because in most universities, certainly in the social sciences, if you publish three papers in reasonably well-regarded journals,
01:00:12.120 and then you aggregate them into a single document, add an introduction and a discussion, you have a PhD that will be acceptable to your committee.
01:00:19.500 Right.
01:00:19.760 And so if you can produce three published papers, you have demonstrated by the standards of the field.
01:00:25.560 Yeah.
01:00:25.720 In fact, in the fields that are less rigorous, one or even zero published papers will often do it.
01:00:33.120 Because lots of times, you know, the median number of publications for a PhD graduate is one.
01:00:39.580 Yeah.
01:00:40.040 Median.
01:00:40.600 Right, exactly.
01:00:41.380 So when you guys publish three, not only have you mastered the discipline, you've exceeded the norm by a substantial margin.
01:00:48.560 Sure.
01:00:48.940 Right, right.
01:00:49.540 Yeah.
01:00:49.660 Okay, so that's—so partly you were trying to indicate—I see.
01:00:52.260 So partly you were trying to indicate that you knew the lingo.
01:00:55.940 Exactly.
01:00:56.420 This is important to criticize.
01:00:58.060 We have this barrier to being able to criticize it and be taken seriously doing so.
01:01:01.700 We have an authority gap, a recognized authority gap, a credential gap, and so let's fill it, was the other motivation.
01:01:09.500 This needs to be exposed, and we need to credential ourselves as authorities that can criticize this from the outside.
01:01:16.440 That was it.
01:01:17.660 Because we saw that there were few or no authorities on the inside who were willing to criticize it.
01:01:22.200 And so walk me through again why you came to the conclusion that it needed to be criticized.
01:01:27.020 You touched a little bit on the fact that you were being attacked when you put forward the new atheism arguments by the postmodern horde.
01:01:34.820 But were there other reasons that you felt that there was like a corruption at the core of this that needed to be exposed?
01:01:40.620 Well, we started to read lots of the papers, as a matter of fact.
01:01:43.440 A lot of these very silly academic papers.
01:01:45.700 Oh, I see. So you started to learn what was actually happening.
01:01:47.940 Yes.
01:01:48.240 Oh, that's nasty.
01:01:49.600 Yeah. And so—and then there was one, though, that tipped us over the edge.
01:01:53.120 It was actually quite famous.
01:01:54.600 It was operating on a half a million dollars of National Science Foundation money.
01:01:59.200 It was a paper that was published in 2016 out of the University of Oregon, or some of the professors were from there, anyway.
01:02:04.760 The four authors were from the University of Oregon.
01:02:06.880 And it's about needing to bring feminism into the science of glaciology in order to successfully combat climate change.
01:02:13.920 Right, right.
01:02:14.540 They've got a TED Talk out of this.
01:02:16.080 I mean, this is the most—I read this thing, and it was so shocking to me as somebody with a background in science.
01:02:22.220 I actually shut down psychologically for almost three days.
01:02:25.260 I kind of stayed in a dimly lit room.
01:02:27.440 I wouldn't interact.
01:02:28.220 I barely ate.
01:02:29.000 I was so depressed at this attack on—that a journal with an impact factor so high—it was a seven, for those who know what that means—would publish a paper this off course about what the sciences are about.
01:02:42.180 I mean, it was suggesting that the sciences are sexist unless they bring in feminist art projects.
01:02:46.560 It was saying that—and these are true; if I tell you what's in this paper, nobody believes it—
01:02:51.100 in addition to studying satellite photography of glaciers, which are the God's eye view from nowhere and are literally called pornographic pictures, because, you know, the satellite is a pornographer staring down at—
01:03:04.200 Mother Earth?
01:03:05.120 Gooning at Mother Earth, yes.
01:03:06.940 And I use that G word very intentionally.
01:03:10.840 They went on to say that unless we take paintings of glaciers done specifically by women and study those as well, alongside the satellite photographs, then it's not a comprehensive science.
01:03:23.540 It shuts out all these other perspectives, other means of knowing.
01:03:27.140 If we don't include indigenous perspectives and mythologies about why ice is the way that it is and why it moves, then we're obviously being colonialist and masculinist and all these horrible things.
01:03:37.000 And I was shocked that a high—not some fringe, goofy little, you know, qualitative studies journal, but a high-impact factor journal would publish this brazen of an assault on the scientific methodology at all.
01:03:51.180 I had no idea how corrupt it was until I saw that.
01:03:54.320 I see.
01:03:54.780 And it did shock you.
01:03:55.840 That was it.
01:03:56.080 I think about it.
01:03:56.860 When you were speaking about that, it's an image that's come to mind a lot lately for me,
01:04:01.900 is that, you know, what we're seeing is the invasion of whale carcasses by snow crabs.
01:04:08.800 I don't know if snow crabs is the right word, but—
01:04:11.040 I think I know what you mean.
01:04:11.900 Yeah, you bet, man.
01:04:12.980 It's like once something stores up value, the universities have stored up value, right?
01:04:17.320 Historical value, right?
01:04:19.520 Credentialing value.
01:04:20.380 Yeah.
01:04:20.820 And unbelievable financial resources.
01:04:23.720 Yes.
01:04:24.020 Like they are whale carcasses.
01:04:25.840 And so now everyone on the fringe is saying, you know, I'd like access to that.
01:04:30.480 And they make these arguments about why they should be included, right, into the enterprise
01:04:34.800 because what they're after is access to those stored resources.
01:04:37.920 That's right.
01:04:38.340 Exactly.
01:04:38.880 That's exactly right.
01:04:39.900 So that was the moment where I got on the phone.
01:04:42.140 Why do you think it shocked you so badly?
01:04:44.320 I think I treated science as sacred.
01:04:47.820 Oh, yeah.
01:04:48.220 Well, you were part of the New Atheist Movement, right?
01:04:50.760 Well, this is one of the things that's so interesting, I think, is that—and I don't know what you make of this exactly,
01:04:55.420 but one of the things I see happening is that—so the Enlightenment critique was one of the reasons for the death of God.
01:05:03.400 Now, and the New Atheist types, Dawkins and Harris, Dawkins in particular, would celebrate the death of God
01:05:10.840 because that would free the scientific enterprise from the superstitious overlay that was interfering with clear rationality.
01:05:17.760 Right.
01:05:17.860 But then that begs a question, which is, well, what is the relationship, let's say, between the Judeo-Christian tradition and science?
01:05:23.800 And one answer is antagonistic.
01:05:26.420 Yeah.
01:05:26.540 And the other is, no, the Judeo-Christian tradition established the monasteries.
01:05:32.260 For example, the universities grew out of the monasteries and the scientific tradition grew out of the universities
01:05:36.500 and that there's actually—that the scientific project is actually embedded in the Judeo-Christian project.
01:05:43.220 And the reason for that is the Judeo-Christian project is predicated on the Greek idea that there's a logos in the world
01:05:50.300 and the Jerusalem idea that there's a logos in the intellect.
01:05:54.020 And all of that's a precondition for science.
01:05:56.740 And if we lose God, we'll lose science too.
01:05:59.000 And I think that—I think that's what's happening.
01:06:01.180 I think that the new atheists, because I think the historical notion that there is an antagonism between science and religion
01:06:09.620 is actually a misreading of history.
01:06:11.360 I don't think that is how it laid itself out.
01:06:13.600 And I do believe that the—and it's an oddly postmodern argument in some ways—
01:06:18.020 is that the scientific enterprise is embedded in the broader Judeo-Christian narrative.
01:06:22.860 So when I look at someone like Dawkins, for example, I think, okay, here's what you believe, Dr. Dawkins.
01:06:27.840 You believe there's logos in the world, because otherwise there's an intelligible order in the world.
01:06:34.900 You believe that studying that intelligible order is redemptive, right?
01:06:39.880 No, first you believe that you are constituted so that you could understand that order.
01:06:44.400 Yes, yes, yes.
01:06:44.960 And then you believe that if you understood that order, that would be redemptive.
01:06:48.600 It's like every single one of those axioms is religious.
01:06:51.160 That's right.
01:06:51.900 That is the fundamental construction, in fact, of a religion.
01:06:55.160 And in fact, if we boil it down into legalese, what the Supreme Court recognizes
01:06:58.980 as constituting an established religion for Establishment Clause purposes
01:07:03.820 is a comprehensive system of belief and practice, this is their definition,
01:07:08.580 that answers fundamental questions about the world and man's role in it
01:07:12.160 such that it gives rise to duties of conscience.
01:07:13.760 And this is precisely, you know, the legalese kind of practical version of those same fundamental axioms.
01:07:23.260 Because these things, there is that multidimensional convergence of what's really going on here.
01:07:28.080 Yeah, well, it might be, you know, that the most fundamental axioms of a conceptual system
01:07:32.220 are religious claims by definition.
01:07:35.440 You have to say, we hold these things to be self-evident.
01:07:38.420 And I would say, well, Dawkins has to hold the three things that I laid out as self-evident.
01:07:42.900 And I think that those are part and parcel of the original Judeo-Christian argument.
01:07:47.000 Sure.
01:07:47.400 I mean, the fact that the entire scientific enterprise, as hardcore as physics might be,
01:07:53.600 utterly depends on a complete and irrational faith that there is a logical structure to the world
01:08:01.720 that doesn't change.
01:08:03.300 Yeah.
01:08:03.620 Or that only changes in ways that are intelligible.
01:08:05.820 Right, because what you could do, you could also easily set up as an axiom
01:08:09.660 that it is immoral to analyze the transformations of the material world
01:08:14.180 because all that will produce is danger.
01:08:16.240 And, you know, the Frankenstein story, the Golem story, the Tower of Babel story,
01:08:20.720 for that matter, are variations of that axiom, right?
01:08:23.720 Look out, be careful what you study.
01:08:26.600 You know, you might open Pandora's box, for example.
01:08:29.040 Right.
01:08:29.340 Have your liver torn out forever because you're Prometheus.
01:08:32.380 And, you know, there's certainly, you can make a case for that.
01:08:35.060 So, the axiom that investigating the transformations of the material world in good spirit,
01:08:40.360 let's say, in the proper spirit, will be redemptive, that isn't a factual statement, right?
01:08:45.620 It's an a priori claim.
01:08:46.800 Yeah, it's an axiom.
01:08:47.580 Yeah, okay.
01:08:48.320 It's like the axiom of infinity.
01:08:49.680 Is there infinity?
01:08:50.480 Who knows?
01:08:51.240 But mathematicians generally accept that, okay, we're going to use this concept of infinity.
01:08:55.120 We're going to say there's infinity.
01:08:56.680 But technically—
01:08:57.340 And then we're going to see what happens.
01:08:58.660 It is unknowable if there is infinity.
01:09:00.920 Right, right.
01:09:01.400 By definition, it's unknowable.
01:09:03.340 But there's an axiom called the axiom of infinity.
01:09:05.480 Yeah, yeah, yeah.
01:09:05.780 Infinity exists.
01:09:06.880 Yeah.
01:09:07.480 It more or less is what it states.
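[Aside: for reference, a standard textbook (ZFC) formulation of the axiom of infinity; the notation is not quoted from either speaker.

    \( \exists I \,\big( \varnothing \in I \;\wedge\; \forall x \,( x \in I \rightarrow x \cup \{x\} \in I ) \big) \)

It asserts that some set contains the empty set and is closed under the successor step x ↦ x ∪ {x}, which is what "infinity exists" amounts to when taken as an axiom rather than a theorem.]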
01:09:09.980 Yeah, well, I've started to understand that the most fundamentally religious and the most fundamentally axiomatic, that's the same thing.
01:09:20.040 Sure, sure.
01:09:20.740 Okay, and that makes sense to you.
01:09:22.040 That makes complete sense to me, yeah.
01:09:23.560 Yeah, okay.
01:09:24.180 So now you go down the axiom hierarchy, and the farther down you go, the closer you get to the sacred, essentially.
01:09:30.320 And I think the reason that it's sacred, by the way—so axioms constrain entropy.
01:09:37.000 That's a good way of thinking about it, right?
01:09:38.980 And so the reason that you have to hold some things as sacred is because the things you hold as sacred are the things that constrain the most entropy in your conceptual system.
01:09:48.340 So if you blow an axiom, you free the entropy.
01:09:51.900 That's what happened to you when you read that paper.
01:09:53.720 Yes.
01:09:53.900 That's why you're hiding in that room for three days.
01:09:55.760 Three days.
01:09:56.180 Right, right.
01:09:56.680 Because an axiomatic presumption had been challenged.
01:09:59.000 You thought not only was the scientific enterprise valuable, you thought it was valued.
01:10:04.180 Yes, that's right.
01:10:04.780 Right, right.
01:10:05.820 Turns out you were wrong.
01:10:07.080 I was wrong.
01:10:07.720 You bet, man.
01:10:08.620 I was wrong.
01:10:09.100 You bet, and the STEM people are certainly going to find out how true that is.
01:10:12.140 You know, I saw this five years ago.
01:10:13.700 I was warning people in STEM.
01:10:15.620 I said, you guys are apolitical.
01:10:18.420 You are sitting ducks.
01:10:19.720 You have no idea what's going to happen when the people who swarmed the humanities—and those people were partly political, the humanities professors, so they had some defense.
01:10:28.820 You wait till they land up on your shores.
01:10:31.020 You people have no idea what's coming your way.
01:10:33.780 They're going to go through you like a hot knife through butter.
01:10:36.060 And this is happening.
01:10:38.180 Oh, yeah.
01:10:38.520 Look at our medical journals.
01:10:39.580 Oh, yeah.
01:10:40.080 Well, they say 75 percent—this is so horrible—75 percent of new applicants to STEM positions in the University of California State Systems have their applications, their research dossiers, go unread because their DEI statements aren't sufficient.
01:10:56.760 Seventy-five percent.
01:10:57.980 That's an astonishing number.
01:10:59.280 It is something to behold, boy.
01:11:02.140 Yeah, Trofim Lysenko would smile down on this.
01:11:04.300 That's for sure.
01:11:05.240 That's for sure.
01:11:05.920 Talk about a coup.
01:11:07.240 Yeah.
01:11:07.680 You know, so that you can replace those decades of work that it takes to become, say, a PhD in something difficult like mathematics.
01:11:14.860 Yeah.
01:11:15.340 And you can reduce that to a DEI statement, and then you can let the dimwits who evaluate DEI statements decide which mathematicians get to practice math.
01:11:24.100 That's right.
01:11:24.400 It's like, oh, my God.
01:11:25.380 And now the DEI statement will be written by ChatGPT.
01:11:27.880 Yeah, right.
01:11:28.480 Trick everybody.
01:11:29.360 Right, right, right.
01:11:30.300 Well, maybe you technical types have come up with a solution to the DEI problem.
01:11:34.500 Yeah.
01:11:34.660 Just get ChatGPT to write the statements.
01:11:36.580 Yeah, or let me do it.
01:11:37.340 You've automated the compliance process.
01:11:40.560 Oh, yeah, that's pretty damn funny.
01:11:42.060 Yeah, yeah.
01:11:42.820 In a horrible, horrible way.
01:11:45.100 So, yeah, this is ultimately what it was.
01:11:46.880 So, why do I feel like I've gotten out of this resentment?
01:11:50.120 Well, I'm not resentful.
01:11:52.280 Okay.
01:11:52.740 I feel gratitude.
01:11:54.780 Like I began saying, I feel very privileged that I get to travel and talk about these issues.
01:11:59.060 Okay, so let me unfold what happened to you.
01:12:02.220 You published these papers, and that caused—let's walk through that a little bit.
01:12:05.840 That caused all sorts of trouble, right?
01:12:07.600 That was wonderful, yes.
01:12:09.040 Yeah, okay.
01:12:09.480 So, tell me the story, and tell me what's happened in your life since then, because now
01:12:13.800 I don't know—you were a massage therapist, then you wrote these preposterous papers, then
01:12:18.000 there was an explosion around that, but I have no idea, like, how you're keeping body
01:12:21.660 and soul together now.
01:12:22.880 And so, like, what is your professional life at the moment?
01:12:26.520 I mean, my professional life now is chaos.
01:12:29.240 It's been a learning process to deal with the amount of travel, the demands on my time,
01:12:33.260 the requests—
01:12:33.620 And are you being paid as a speaker constantly?
01:12:35.600 Yeah, yeah.
01:12:35.760 And is that how you're deriving most of your income?
01:12:37.600 Most of it, yeah.
01:12:38.600 So, I also created a company called New Discourses, where I publish my own materials and put out
01:12:43.300 my podcast, kind of my own platform, website, et cetera.
01:12:46.240 Right, right, right, right.
01:12:46.460 And that—
01:12:47.140 Is that a subscription platform, New Discourses?
01:12:49.120 It has—
01:12:49.560 How do you monetize that?
01:12:50.340 It's optional subscription, and it's only by the generosity of other people that it stays
01:12:55.420 going.
01:12:55.860 I don't have any big donors, contrary to what the internet believes in urban legends.
01:12:59.800 I have none.
01:13:00.340 Zero.
01:13:01.140 I don't think I have a—
01:13:01.880 Big oil, come on.
01:13:02.880 Big oils.
01:13:03.680 Yeah, right.
01:13:04.100 And Big Pharma, no doubt.
01:13:05.440 Yeah, Big Pharma loves me, I'm sure.
01:13:07.500 I'm sure they do.
01:13:08.140 Yeah.
01:13:08.440 I'm sure they do.
01:13:09.120 No, so—
01:13:09.820 But it's optional.
01:13:11.420 I give out virtually everything for free.
01:13:13.580 I offer one product that's behind a paywall, and it's a kind of more personal podcast where
01:13:18.880 I share kind of my more cutting-edge experimental ideas and stories from my trips that I think
01:13:25.000 are instructive in some way.
01:13:26.400 But other than that, it's all public, and it's the generosity of people who appreciate it.
01:13:30.140 Are you approximating something that would be the equivalent of a reasonable academic salary?
01:13:35.740 I'm exceeding that.
01:13:36.880 Oh, well, congratulations.
01:13:38.340 Rather well.
01:13:38.920 Right, right.
01:13:39.520 Well, so look at that.
01:13:40.600 You started making the right sacrifices, and everything turned around.
01:13:43.560 That's—I feel that way, yeah.
01:13:44.860 Isn't that something?
01:13:45.440 And again, how do I feel about it?
01:13:46.800 Great.
01:13:47.160 Grateful.
01:13:47.780 I don't feel like I should be bigger.
01:13:49.440 I think your answer's a good one, by the way, because I was curious about that, you know,
01:13:53.120 because I've watched people who were admired in bitterness say that they're no longer admired
01:13:58.840 in bitterness, and I remember—I think it's the Nietzschean dictum, which is something
01:14:02.180 like, you think you're done with the past, but that doesn't mean the past is done with
01:14:07.060 you, you know?
01:14:07.980 And if you've gone down a dark road and been in that for a long time, there are traces
01:14:13.400 of that that last for a long time, and they will come and get you if you think you've
01:14:17.580 escaped.
01:14:18.000 And so—but your response that you're grateful, that's a good response, because gratitude
01:14:24.160 is the opposite of bitterness.
01:14:25.720 I feel like I get to serve.
01:14:27.720 Yeah, okay, that's a good answer, too.
01:14:29.180 I just do.
01:14:30.180 I don't ask for—you know, you asked about the speaking fees.
01:14:33.280 Yeah.
01:14:33.560 I don't ask for very large ones.
01:14:35.320 I'm very modest in what I ask.
01:14:37.980 You know, I want my expenses covered, obviously, but other than that—
01:14:40.340 Do you have an agent?
01:14:41.300 I do.
01:14:41.900 Okay.
01:14:42.440 And it's all very, very modest.
01:14:44.140 I keep everything extremely modest, because I sat down with myself.
01:14:47.560 A couple of years ago, and I said, if it came to the fact that—let's say the right
01:14:51.680 person's in the audience, because you never know who—
01:14:53.440 Yeah, yeah.
01:14:54.360 And I said no over a matter of a few hundred or a few thousand dollars to this event, supposing
01:14:59.800 it would fit into my schedule, which is busy.
01:15:02.100 And I said no to this.
01:15:04.040 I've done something gravely wrong.
01:15:06.400 If I don't have time, you know, we have to—
01:15:08.220 So what do you—do you have more invitations than you can fulfill?
01:15:12.340 I kind of hit right at the line.
01:15:14.960 Okay, okay.
01:15:15.560 Well, because one of the arguments for raising your fees is because it helps you prioritize
01:15:21.420 if you have a plethora of invitations.
01:15:23.160 Correct.
01:15:23.480 But you're right on the—
01:15:24.380 I ride right on the market.
01:15:25.680 I see, I see.
01:15:26.020 And it's wonderful.
01:15:26.440 Well, that's a nice place to be.
01:15:27.260 I'm also grateful for that.
01:15:28.540 Yeah, yeah.
01:15:28.980 It's just—it's a—I hesitate to use loaded words glibly, but it's almost providential
01:15:36.620 that it's working this way.
01:15:38.460 And so I'm very excited about it.
01:15:40.520 So I don't feel like I fell, and I can look back on and reflect.
01:15:45.120 But I think that the moment where the decision was made was during the grievance studies papers.
01:15:51.800 We wrote one.
01:15:52.440 It was about education.
01:15:53.460 We called it the progressive stack.
01:15:54.860 We said we should progressive stack the classroom.
01:15:56.520 So if you have—we're going to do a privilege inventory, maybe we'll make them do that walk,
01:16:01.660 that privilege walk, where if, you know, you have this or that, you go forward three steps.
01:16:05.360 If you're white, walk backwards out of the building or whatever they make you do.
01:16:09.000 And we're going to rank all the kids, and we're going to put, you know, your roster for the class
01:16:14.180 will be ordered according to privilege.
01:16:16.640 Intersectional privilege, hopefully.
01:16:17.960 Well, yes, intersectional, of course.
01:16:19.240 That's very nice.
01:16:19.840 And so the more privilege that you have, the worse we're going to treat you.
01:16:24.680 We're going to ignore you.
01:16:26.080 We're going to—our phrasing was invite you to listen and learn in silence.
01:16:29.560 Right, right, right.
01:16:30.540 And then it progressively got worse to—
01:16:32.960 So to speak.
01:16:33.540 You know, we'll speak over you.
01:16:35.020 We'll interrupt you.
01:16:36.300 We'll, you know, report you for things to the dean or whatever.
01:16:40.280 We'll actually invite you to sit on the floor to experience reparations.
01:16:44.500 You should wear chains.
01:16:45.320 You should do humiliating things.
01:16:46.320 Right, right.
01:16:46.800 For the other kids staring at you to overcome your privilege.
01:16:48.820 But, of course, these are hoaxes.
01:16:50.160 So we said, but we'll do it with compassion.
01:16:53.180 Critically compassionate intellectualism.
01:16:55.420 Yeah, we'll do it with compassion.
01:16:56.760 And the peer reviewers wrote us back, and they said, don't use compassion.
01:17:01.820 They said it'll threaten to re-center the needs of the privileged if you're compassionate with them.
01:17:07.120 And instead, they recommended what they call.
01:17:08.900 So because they figured you were on their side, they could show their true colors.
01:17:12.780 I think so.
01:17:13.400 Right, because one of the things I have learned is that—so serpents camouflage themselves, right?
01:17:18.000 You know that we can detect serpent camouflage better in the bottom half of our visual field, by the way.
01:17:22.880 Oh, yeah.
01:17:23.380 Yeah, yeah.
01:17:24.040 You know what the best camouflage is for serpents?
01:17:26.940 What's that?
01:17:27.740 Compassion.
01:17:28.400 Oh, sure, sure.
01:17:29.220 You bet.
01:17:30.020 You bet.
01:17:30.500 Sure, sure.
01:17:30.620 And it really works on conservatives.
01:17:32.320 And the reason it works on conservatives is because you can really make conservatives feel guilty.
01:17:36.440 Because they're dutiful.
01:17:37.360 So they're guilt-prone.
01:17:38.340 Yeah.
01:17:38.540 So if you can come after conservatives with compassion and you can say, you're not doing your duty on the compassionate front, instead of the conservatives going, you're a serpent, they go, oh, no, you know, we could be a little better.
01:17:50.580 And, of course, they could because, like, who's perfect on that front?
01:17:53.240 Of course.
01:17:53.800 But, mm-hmm.
01:17:54.500 Yeah, of course.
01:17:55.120 Well, we were talking just before we started this podcast about some of the new psychological research on left-wing authoritarianism.
01:18:01.160 And so I read a paper here a week ago.
01:18:03.640 There's not a lot of papers on this front.
01:18:05.100 There's only about 10 because the social psychologists denied that left-wing authoritarianism existed for seven decades, right, till 2016, before I got, what would you say, disenfranchised from the university.
01:18:17.680 Yeah.
01:18:17.900 My lab did a study on left-wing authoritarianism.
01:18:20.300 And the first thing we did was to see if there was a clump of ideas that were statistically related that you could describe as both left-wing and authoritarian.
01:18:28.340 And there is.
01:18:29.440 And it's identifiable.
01:18:30.420 It's exactly the clump of beliefs you would have been studying and would suspect.
01:18:33.940 Yeah.
01:18:34.500 We looked at what predicted that.
01:18:36.420 We've predicted allegiance with that set of beliefs.
01:18:38.600 Low verbal intelligence.
01:18:40.020 Negative 0.4 with IQ, verbal IQ.
01:18:43.000 So you think, well, how can people be, you know, unwise enough to believe these ideas?
01:18:47.720 And one of the answers is, well, they're not that bright, as it turns out. That, plus being female and having a feminine temperament, right?
01:18:53.700 Those were the three big predictors.
01:18:55.660 Other predictors have emerged looking at the similar construct, left-wing authoritarianism.
01:19:01.260 The best predictor I've seen is malignant narcissism.
01:19:03.940 Yeah.
01:19:04.140 Correlation is 0.6.
01:19:06.040 0.6, right?
01:19:07.680 Which is about as good as the measurement accuracy of the questionnaires.
01:19:10.860 Yeah.
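[Aside: the remark about the correlation running up against the measurement accuracy of the questionnaires is the classical attenuation bound; the formula and the reliability figures below are standard textbook material and illustrative assumptions, not numbers from the studies being discussed.

    \( r_{\text{observed}} \;=\; r_{\text{true}} \,\sqrt{ r_{xx}\, r_{yy} } \)

where r_xx and r_yy are the reliabilities of the two scales. If each scale has a reliability of roughly 0.6 to 0.7, then even a true correlation of 1.0 can only appear as an observed correlation of about \( \sqrt{0.6 \times 0.6} = 0.6 \) to \( \sqrt{0.7 \times 0.7} = 0.7 \), which is why an observed 0.6 is consistent with the two constructs being essentially the same thing.]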
01:19:11.000 Right, so it actually opens up the question.
01:19:14.860 The question is, there may be no difference between left-wing authoritarianism and malignant narcissism.
01:19:20.600 And what that means is the serpents are using the language of compassion to mask their power striving.
01:19:26.300 That's right.
01:19:26.640 While simultaneously claiming, well, of course we can do this because every single social relationship in the world is predicated on nothing but power.
01:19:33.240 And if you don't accept that, that just means that you're a malignant liar.
01:19:36.800 That's right, exactly.
01:19:37.900 That's the whole structure.
01:19:38.860 Yeah, yeah, yeah, yeah.
01:19:40.040 And then that kind of discourse is disinhibited in these disciplines where no one subjects any of the ideas to critical evaluation, right?
01:19:48.780 And the malignant narcissists are also disinhibited online.
01:19:52.020 Yeah.
01:19:52.420 Right, which is a huge problem, right?
01:19:54.120 None of our evolved mechanisms for keeping malignant narcissists under control are operative on the social media front.
01:19:59.660 That's correct.
01:20:00.340 Yeah, it's really bad.
01:20:01.080 They can have 20 accounts.
01:20:01.860 They can do whatever.
01:20:02.380 They can do whatever they want.
01:20:03.400 They have 100% free rein.
01:20:05.700 30% of internet traffic is pornographic, right?
01:20:08.900 Like, criminality is absolutely rife on the internet, right?
01:20:11.480 You can't control it.
01:20:12.560 And then you have the subclinical criminality, which are the troll demon types.
01:20:16.240 Yeah, yeah, yeah.
01:20:16.440 And they're monetized by the social media platforms.
01:20:18.880 Of course.
01:20:19.500 I've been trying to convince—well, convince.
01:20:22.380 I've made a case on social media multiple times that platforms like Twitter, for example, should separate the anonymous people from the real people.
01:20:31.080 They should put them in different categories.
01:20:32.700 Right.
01:20:32.980 Because if you can't bear responsibility for your words, you shouldn't be allowed free rein in the realm of discourse.
01:20:40.340 And the reason for that, people say, well, anonymity protects freedom.
01:20:43.420 It's like, no, if you took 100 anonymous troll demons, one of them is a whistleblower, and the other 99 are malignant narcissists.
01:20:51.380 Right.
01:20:51.480 So—and I think they should be allowed to have their say, but they shouldn't be thrown.
01:20:55.080 The troll demons—and those are, what would you say, machine-human hybrids, right?
01:20:59.380 Because when you're online, you're a machine-human hybrid.
01:21:01.900 Sure.
01:21:02.900 Troll demons are not human, right?
01:21:06.260 Anonymous troll demons are not human.
01:21:08.020 Right.
01:21:08.280 You don't put them in with the people.
01:21:09.640 You put them in, like, anonymous troll demon hell, and if you want to go there and visit and see what they're up to, no problem.
01:21:15.580 But they shouldn't be confused with people who will take the consequences of their words onto themselves.
01:21:21.140 You haven't operated anonymously.
01:21:23.080 No, I have not.
01:21:23.860 Why not?
01:21:24.880 I have—you've just got to say these things.
01:21:28.100 You just have to say these things.
01:21:30.440 Why do you think that?
01:21:32.360 I think that telling the truth is the most important thing that we have to do.
01:21:36.680 So why don't—why not shield that with anonymity?
01:21:39.640 I—that's not as grave.
01:21:44.740 How on earth can someone come and challenge me or check me?
01:21:48.460 If I'm anonymous, I can just vanish.
01:21:50.720 Right, right, right.
01:21:51.640 So what I come and say—
01:21:52.360 So you don't have to subject any of your ideas to critical evaluation or to take any of the weight of what you say on yourself.
01:21:57.420 Right, or compare it against a pattern of established thought.
01:22:00.540 I like the idea that I—I do understand why some people have certain risks they're not willing to take, but I try to encourage them to take those risks.
01:22:11.620 You don't understand.
01:22:12.400 It's like, yeah, yeah, yeah.
01:22:14.020 If you have something to say, story of Jonah.
01:22:16.880 Let me—I'll give you a one-minute summary of the story of Jonah.
01:22:20.200 Okay, so Jonah's just minding his own business.
01:22:22.440 God comes along and says, you know that city Nineveh?
01:22:26.280 Yeah.
01:22:26.640 Well, those people have deviated from the straight and narrow, and I'm not very happy, and I'm going to wipe them out.
01:22:31.300 But I'd like you to go there and tell them what they're doing wrong and let them know that they're in danger.
01:22:37.760 And Jonah thinks, there's no bloody way I'm going to a city of 120,000 people to tell them that they're wrong.
01:22:43.440 No.
01:22:44.040 So he hops in a boat and goes the hell the other direction.
01:22:46.820 Well, then the waves rise and the winds blow and the ship is threatened, right?
01:22:51.520 Yeah.
01:22:51.720 Which means that if you don't say what you're called upon to say, then the ship is threatened.
01:22:55.500 Well, the sailors think, well, there's someone on board who's on outs with God because that's why the storms are rising.
01:23:02.240 So they go talk to everybody on board.
01:23:04.460 You've got a problem with God.
01:23:05.680 And Jonah says, well, as a matter of fact, yeah, I've disobeyed direct order.
01:23:10.560 And the sailors say, well, we've got to throw you overboard because, like, otherwise we're all going to die, right?
01:23:15.740 So that means if you hold your tongue when you're called upon to speak, then everyone dies.
01:23:20.140 And so off they throw up, right?
01:23:21.500 Now he's drowning.
01:23:22.300 And you think, well, that's pretty bad.
01:23:23.460 He's drowning.
01:23:24.060 And then that's not so bad because the next thing that happens is a horrible creature from the darkest part of the abyss comes up and swallows him and takes him down to the bottom.
01:23:33.300 And so what that means is that if you hold your tongue when you're called upon to speak, not only do you put the ship at risk and then likely drown, but then something will happen to you that will make you wish you drowned, right?
01:23:46.540 So Jonah's now in hell, right?
01:23:49.200 Which is where you go when you hold your tongue, when you have something to say.
01:23:52.760 And he's there for three days in hell.
01:23:55.220 And then he repents.
01:23:56.720 The whale spits him up on shore.
01:23:58.140 Then he goes to Nineveh and says, I know what I'm talking about.
01:24:02.660 You guys, you've gone somewhere dark.
01:24:05.380 You better get your act together.
01:24:07.420 And they put on sackcloth and ashes and repent.
01:24:09.820 And God decides not to destroy them.
01:24:11.800 And that isn't precisely where the story ends, but that's where that part ends.
01:24:15.460 Yeah, yeah.
01:24:16.160 Hold your tongue at your peril.
01:24:18.320 You know, and I knew that because people have talked to me.
01:24:20.660 Maybe they've said the same thing to you.
01:24:22.220 They said, well, you know, thank you for your bravery.
01:24:24.900 And I think, it's not bravery.
01:24:27.100 I know what to be afraid of.
01:24:28.880 And I'm nowhere near as afraid of the people who would want to compel my language as I am of the consequences of not saying what I have to say.
01:24:36.320 Yeah.
01:24:36.580 Right?
01:24:37.460 The ship sinks, you drown, and then you wish you would have drowned.
01:24:41.660 Yeah.
01:24:42.000 Right.
01:24:42.440 Yeah, yeah.
01:24:42.840 I feel that.
01:24:43.480 I feel that.
01:24:44.160 That's exactly.
01:24:45.720 I see what to be afraid of.
01:24:49.300 Well, you know.
01:24:50.220 Of course, in a totalitarian state, everybody holds their tongue.
01:24:53.280 And that's this turning point that I was telling you about because I'm getting this feedback from these peer reviewers that have now pulled this mask off of themselves.
01:25:02.100 No compassion.
01:25:03.460 We're going to abuse students out of privilege, which we had meant college students, but this could apply to children.
01:25:09.700 Of course.
01:25:10.260 Very quickly.
01:25:10.820 And faster because they have no defensive voice.
01:25:13.860 Look what we did with them with masks.
01:25:15.660 Yeah.
01:25:16.020 So we're going to abuse them, and there'll be no compassion.
01:25:18.640 We're going to use what's called the pedagogy of discomfort, they told us.
01:25:22.080 Horrible.
01:25:22.520 Wow.
01:25:23.120 And so—
01:25:24.120 So you really saw the narcissists unmasked in the peer review process.
01:25:27.760 I talked with Mike Naina, who was doing a documentary.
01:25:30.800 It recently came out.
01:25:31.560 The Reformers, it's called, documenting what we were doing.
01:25:35.360 And I called him, and I said, Mike, we got to talk about this.
01:25:38.240 And we get talking about that feedback.
01:25:39.560 I'd sent it to him.
01:25:40.280 And we decided that the phrase we used, and I've gotten so much trouble for this online, was that it represents the seed of a genocide.
01:25:48.120 I don't know if the seed's going in the ground.
01:25:49.640 I don't know if it's going to sprout.
01:25:50.680 I don't know if it's going to grow.
01:25:51.480 I don't know if the tree's going to bear fruit.
01:25:52.760 You're actually accurate about that.
01:25:54.160 You know, I wrote a paper with one of my students who had gone to visit the mass grave sites in Eastern Europe, by the way, before she became one of my students.
01:26:01.540 Brilliant girl.
01:26:02.860 Maya Cicic was her name.
01:26:06.280 We wrote a paper on the precursors to genocide and enhanced victimization.
01:26:12.700 The enhancement of the sense of victimization is one of the steps along the pathway to genocide.
01:26:18.060 Yeah.
01:26:18.560 Get them before they get us, for example.
01:26:20.660 So I stare this in the face.
01:26:22.200 Because this road, if followed to its apparent potential conclusion, is a genocide.
01:26:28.780 Or a totalitarian state.
01:26:30.820 And so I thought about this.
01:26:33.200 I sat on this for a couple of weeks.
01:26:34.520 And I took one of the braver moments of my life.
01:26:37.560 I went to my wife and I said, can I quit my job and dedicate my life to exposing—
01:26:42.620 This was the massage therapy job?
01:26:44.140 Yeah.
01:26:44.820 No more.
01:26:45.560 Can I dedicate all of my time to studying this and telling the world about it as fast as I can learn about it?
01:26:52.000 And she, being a woman of great practicality and wisdom, said, can you make money doing that?
01:26:58.080 Right.
01:26:58.340 And I said, I don't know.
01:27:00.160 It's actually a good question, right?
01:27:01.540 Because, I mean, it is one way of market testing the viability of your ideas.
01:27:05.180 Sure.
01:27:05.360 Because one of the things you might assume is that if there's no market, what makes you think you have anything to offer?
01:27:11.140 Yeah, exactly.
01:27:11.940 Right, right.
01:27:12.260 And so—
01:27:13.380 It's good discipline, that.
01:27:14.200 She gave me a runway.
01:27:15.480 She's like, you have 18 months to figure that out.
01:27:17.840 Huh.
01:27:18.120 And if we get to the end of 18 months—and she's a woman, so 18 months is 15 months in reality.
01:27:22.560 Huh.
01:27:22.820 So we got about to month number 15, and it got a little rocky.
01:27:25.800 And then I was actually—like I said, I'm completely crowd-supported, other than the speaking fees.
01:27:30.920 And so I—
01:27:32.200 So you're actually an autonomous intellectual.
01:27:34.660 Correct.
01:27:35.220 Right, right.
01:27:35.800 I built it that way very intentionally.
01:27:38.080 That's a very difficult thing to attain, so congratulations on that.
01:27:40.920 I did this very intentionally so that nobody can tell me I have to shut up.
01:27:43.760 Right, right, right, right.
01:27:45.120 And because things have to be said, and I don't know what has to be said, but I can't be told to shut up when I have to say it.
01:27:51.280 And I can't have anybody, some think tank guy, looking over my shoulder saying, just don't go there.
01:27:57.620 Some granting committee.
01:27:58.140 Don't insult so-and-so.
01:27:59.640 You know, we're not going to drag that into the light.
01:28:02.040 I can't have any of that.
01:28:03.300 And so at month number 16 I wrote myself a salary check, the whopping total for 16 months of effort trying to build the beginnings of this: $2,000.
01:28:14.680 This is my big oil money.
01:28:17.180 $2,000 is not zero.
01:28:19.020 It's not zero.
01:28:19.660 And getting from zero to one is really, really, really hard.
01:28:24.660 Once you get to $2,000, the next $2,000 is a lot easier.
01:28:27.920 That's 100%.
01:28:28.460 Zero is rough, man.
01:28:29.920 Zero is hard.
01:28:30.480 And people, yeah.
01:28:31.220 Zero is a black hole.
01:28:32.580 It's not like any other number.
01:28:34.020 It's a black hole.
01:28:34.900 It's really hard to escape from zero.
01:28:36.300 Well, it's true.
01:28:36.680 You multiply any number by zero, what do you get?
01:28:38.600 Zero?
01:28:38.740 Right, right.
01:28:39.220 Exactly.
01:28:39.740 That's right.
01:28:40.180 Zero devours everything.
01:28:41.480 And getting out of zero is really, really difficult.
01:28:43.900 But once you get out of zero, you can start moving forward exponentially.
01:28:47.460 That's the Pareto distribution issue.
01:28:49.640 Yeah.
01:28:50.040 So you made some money, right?
01:28:52.280 So you saw that there was a market.
01:28:53.700 There was a market.
01:28:54.460 I was doing something useful.
01:28:55.800 I was doing something right.
01:28:57.680 I very fortuitously chose among this, you know, we'd written Cynical Theories.
01:29:02.800 We hadn't published yet.
01:29:03.740 And so I have this pantheon of evils to choose from.
01:29:08.220 What do I focus?
01:29:08.900 You can't focus on all of it.
01:29:10.740 The ultimate buffet in hell.
01:29:13.340 Yeah, exactly.
01:29:14.320 And I chose just kind of finger on the ground or ear to the ground, I guess, is the metaphor.
01:29:19.120 I mixed my metaphors.
01:29:20.820 That critical race theory would be the most accessible and relevant to start exposing first.
01:29:25.180 So I dove into that full blast, full bore, and over the preceding eight months I fortunately created a library decoding critical race theory, in advance of George Floyd's death.
01:29:39.420 And was that video mostly?
01:29:41.120 No, it was mostly writing.
01:29:42.740 It was mostly writing blog?
01:29:44.560 No.
01:29:45.220 I created the website.
01:29:46.500 Some of those articles are blog-type articles.
01:29:48.620 Some of them are explanatory.
01:29:49.580 But what I found I thought would be most important, and this goes back to that mathematical comment I made earlier about the definitions, was I knew they were misusing words.
01:29:58.080 And so I started to create a lexicon.
01:29:59.900 I started to create an encyclopedia of their terminology.
01:30:03.480 And I just would focus on one term after another.
01:30:06.880 Let me get into their head and know what they mean and go read primary sources.
01:30:10.400 When they use the word democracy, where does this come from?
01:30:12.280 Oh, my gosh, we're all the way back to Lenin.
01:30:14.160 You know, Lenin-defined democracy.
01:30:15.820 Do you feel that you're dealing with a they, or do you, you know, there's this biblical idea that what we war against is principalities.
01:30:25.420 And I think of a principality, one variant of a principality is a system of ideas.
01:30:30.340 Yeah.
01:30:30.640 And I think, well, there's no, in a way, there's no they.
01:30:34.500 There's a system of ideas that's an animating, that's a set of animating principles.
01:30:39.060 Sure.
01:30:39.460 Right?
01:30:39.700 And it partially inhabits a multitude of people.
01:30:42.780 Well, I think there are two answers to this.
01:30:45.800 There's a very diffuse they.
01:30:47.320 If we say the woke, we generally know that we're speaking about people who think in certain ways, that they've adopted some of this power analysis.
01:30:54.500 But it's very diffuse.
01:30:55.700 And maybe it's only a small amount, and maybe it's a great amount, and maybe it's on this issue and not that.
01:30:59.860 But then there are the people who pay for it.
01:31:01.620 And I mean, with large sums of money.
01:31:03.360 They're a very distinct they.
01:31:05.800 Somebody has decided to pour the gasoline into this fire.
01:31:09.800 And they decided—
01:31:11.900 Do you think, but do you think that they, that they again, have any sense of what they're doing?
01:31:16.220 These people?
01:31:17.020 I think they know exactly what they're doing with it.
01:31:19.620 That they are disrupting Western civilization so that they can recolonize it with their own position.
01:31:25.200 And this is why I get called a conspiracy theorist online, despite the fact that they basically write this in their books themselves.
01:31:30.040 And who—okay, for you, who are the primary actors in that they?
01:31:35.360 Well, there's a front that I usually name, which is obviously the people who are the public face of this.
01:31:42.100 Okay.
01:31:42.380 And these are people at the World Economic Forum.
01:31:44.100 Right, right.
01:31:44.720 Overwhelmingly centrally.
01:31:45.460 Did you read Klaus Schwab's Great Reset?
01:31:47.560 I did.
01:31:48.180 It's quite the aggregation of cliches.
01:31:50.260 It is.
01:31:50.320 I got about a third of the way through it, and I thought, no, I just can't do this anymore.
01:31:54.360 The Great Narrative, his second book that he wrote—well, it's his fourth, really, but the second book he wrote in that series is much more poignant.
01:32:02.760 But the way that he writes is it's, you know, maybe 130 pages of a book or, I don't know, it's not that long.
01:32:08.320 And it's business cliché, business cliché, business cliché.
01:32:11.400 Terrifying pair of paragraphs, business cliché.
01:32:14.100 Oh, he's not right, eh?
01:32:15.460 Yeah, so if you read it with regard to the actual meat, it's a lot of bun and very little burger.
01:32:22.100 His camouflage is manager-speak.
01:32:26.140 So there is a compassion camouflage.
01:32:26.780 So you're asleep before you get anywhere.
01:32:28.500 Right, right.
01:32:29.100 Well, I did notice, too, that a lot of the things I saw at universities that were really deep falsehoods weren't compassion.
01:32:35.680 They were manager-speak.
01:32:36.920 Manager-speak, that's right.
01:32:38.060 Manager-speak.
01:32:38.680 If you read something and it's putting you to sleep—
01:32:40.500 Then you, like I said, you made those together.
01:32:41.460 It was probably designed to put you to sleep so you don't see—
01:32:44.640 Yeah, manager-speak is what people on the administrative front who have absolutely no ability cloak themselves in so they look competent.
01:32:50.900 That's right.
01:32:51.600 That's exactly right.
01:32:52.440 Oh, yeah.
01:32:52.880 It's something to see, man.
01:32:53.800 So here in the middle, almost squarely in the middle of The Great Narrative for a Better Future, which is the follow-up book to The Great Reset.
01:33:00.460 It's called Narrative for a Better Future.
01:33:02.140 The Great Narrative for a Better Future.
01:33:05.080 Oh, yeah.
01:33:05.420 You can't make this up.
01:33:06.220 I've got to read that.
01:33:06.620 I've got to read that.
01:33:07.180 And so right in the middle, he has a set of paragraphs.
01:33:09.480 It's maybe five paragraphs, but it covers three ideas.
01:33:12.380 And number one, we're going to force all of the corporations to adopt ESG standards.
01:33:18.160 And we're going to do that through top-down manipulations with a public-private partnership, governments, and big business working together.
01:33:23.320 Fascism, in other words.
01:33:24.440 Fascism.
01:33:25.480 With compulsion.
01:33:26.400 With the NGOs being the coordinating entities.
01:33:29.240 So the World Economic Forum is a hub that connects these things, which means there's something probably behind it that's a different they that's really organized.
01:33:37.880 That's the legion, by the way.
01:33:39.320 Yes.
01:33:39.940 And then so secondly, we're going to transform the youth to demand ESG.
01:33:44.800 They won't work in a company.
01:33:45.920 They won't buy from a company, et cetera, unless they're ESG compliant.
01:33:49.380 We're going to change the youth culture.
01:33:51.160 And then third, we're going to rewrite the social contract to accept this new need.
01:33:54.920 Only those three things.
01:33:56.400 Only those three things.
01:33:57.360 Very nefarious, though.
01:33:58.580 And he repeats this.
01:33:59.240 Well, he's got the uniform for it.
01:34:00.980 The space.
01:34:01.780 For man, the accent.
01:34:03.060 Yeah.
01:34:03.580 Central casting.
01:34:04.480 Central casting.
01:34:05.360 That's right.
01:34:06.040 He needs a crow on his shoulder.
01:34:07.860 Yeah.
01:34:08.260 Yeah.
01:34:08.660 Or a bald cat to pet.
01:34:10.060 Yeah, right.
01:34:10.620 Exactly.
01:34:11.260 Yeah.
01:34:11.340 So, yeah.
01:34:11.940 So he gave this speech earlier this year, in an interview.
01:34:14.780 He says that we're going to rewrite the social contract.
01:34:16.940 He says this again and again.
01:34:18.540 But he says specifically this time, we're going to rewrite the social contract so that society accepts
01:34:22.760 as we move from an economy of production and consumption to an economy, and I kid you not,
01:34:28.640 Jordan, of caring and sharing.
01:34:31.640 Oh, yeah.
01:34:32.400 And that's communism.
01:34:33.300 You mean because productive generosity on the free market front hasn't worked.
01:34:37.300 Correct.
01:34:37.720 Right, right, because it hasn't lifted more people out of poverty since the year 2000
01:34:41.600 than were lifted out of poverty in the entire history of humanity before that.
01:34:45.440 Yeah, but climate change.
01:34:46.560 Yeah, I know.
01:34:47.320 Exactly.
01:34:48.060 Yeah, yeah, yeah, yeah.
01:34:49.120 So this is it.
01:34:50.120 Yeah, it's so interesting, eh, that, like, I've got a couple, here's some, tell me what
01:34:54.600 you think about these ideas.
01:34:56.000 So, tyrants use fear to produce compulsion.
01:35:02.240 They're after compulsion.
01:35:03.280 They want to aggregate the power, and so they use fear.
01:35:05.480 And apocalyptic fear is the best sort of fear to use.
01:35:07.700 Of course.
01:35:08.240 Right, and so now you can tell the tyrants, because everybody wants to know who's listening.
01:35:13.100 How do you tell the tyrants from the real leaders?
01:35:15.100 Okay, the tyrants will frighten you into compulsion, and they'll use the crisis and the catastrophe
01:35:22.100 to justify the compulsion.
01:35:23.580 And you might say, well, there's a real crisis, and the right answer to that is there's always
01:35:27.560 an apocalyptic crisis.
01:35:28.920 That's right.
01:35:29.240 That's a universal, eternal truth.
01:35:32.360 And it's partly because all of us die.
01:35:34.200 It's not just because you die and I die, but also because every single person we know,
01:35:39.320 our whole culture, everything we know will die.
01:35:42.040 Yeah.
01:35:42.200 So the apocalypse is always there, and sometimes that happens dramatically, and sometimes it
01:35:46.000 happens incrementally.
01:35:47.340 But it happens.
01:35:48.360 And so we're always facing apocalyptic crises.
01:35:51.820 Because of that, and then you say, well, in spite of that, you can have a form of government
01:35:57.040 that's not a tyranny.
01:35:58.360 Well, not if you use the fear of the apocalyptic crisis to compel and to aggregate power.
01:36:03.380 And so any leader who tells you that the crisis is so intense that it necessitates emergency
01:36:09.560 compulsion, that's a tyrant.
01:36:11.980 That's exactly right.
01:36:12.840 And then there's another corollary to that, which is, imagine you're tyrannical, and you
01:36:18.540 are genuinely frightened by this crisis.
01:36:21.520 Then I would say, your nervous system has indicated by the paralysis of your fear that you're too
01:36:28.520 small a knight for that dragon.
01:36:31.260 Right, and so you shouldn't be parading yourself around as a leader.
01:36:34.620 It's like, no, you're a frightened tyrant.
01:36:36.480 You're not a leader.
01:36:37.440 And we can tell you're not a leader because you're a frightened tyrant.
01:36:42.360 Right?
01:36:42.840 If you were a leader, even in the face of a crisis, you would keep your head.
01:36:47.620 That's right.
01:36:48.040 And you wouldn't use compulsion.
01:36:49.900 That's right.
01:36:50.160 Right, so all these apocalyptic nightmare mongers who are saying, well, well, we're going
01:36:54.020 to burn up the planet.
01:36:54.940 It's like, yeah, that and 10 other apocalypses, by the way, that doesn't mean you get to centralize
01:37:00.760 all the power and take it for yourself and use compulsion.
01:37:03.960 And this, like, I really started to understand that as far as I was concerned, I was at war
01:37:08.260 when I saw that the leftist, radical, narcissistic, malignant types were willing to sacrifice the
01:37:14.860 poor to their climate scam.
01:37:17.080 That's right.
01:37:17.480 Right?
01:37:17.740 So, well, let's crank up energy prices.
01:37:20.140 Well, why?
01:37:20.940 Well, because renewables, because climate.
01:37:23.020 It's like, do you know who you're going to hurt with high energy costs?
01:37:26.560 You're going to destroy the marginal, right?
01:37:29.100 Because energy is food.
01:37:31.680 There's no bloody difference.
01:37:33.000 Those people are barely clinging on to the edge of reality.
01:37:36.020 You crank up energy prices 10%, you wipe out, like, 20 million people.
01:37:40.040 Yeah.
01:37:40.200 It's like, well, that's okay, sir, because, you know, there are too many people on the
01:37:44.100 planet anyways.
01:37:44.840 It's like, yeah, I know who's speaking in that voice.
01:37:47.340 Exactly.
01:37:47.980 You bet, right?
01:37:48.420 The great cosmic joker.
01:37:50.140 Yeah, yeah.
01:37:50.860 There's too many people on the planet.
01:37:52.340 Yeah.
01:37:52.480 When I watch people actually say that, I think, do you know who's speaking out of your mouth?
01:37:57.680 God, stunning.
01:37:59.560 So, you know what the great narrative is for our bitter future?
01:38:03.300 He says it explicitly.
01:38:04.760 This is closer to the end of the book.
01:38:06.720 He, in one paragraph, says what the great narrative is after he makes his case, and
01:38:11.020 you saw the punchline is centralization of power, transformation of the universe, or
01:38:14.860 whatever.
01:38:15.080 The great narrative is we face multiple existential crises, climate change, pandemics.
01:38:21.600 Four horsemen.
01:38:22.680 Exactly.
01:38:23.420 The four horsemen.
01:38:24.300 We face multiple existential crises, the polycrisis, they call it now.
01:38:28.060 Yeah, yeah.
01:38:28.620 Therefore, we need greater global cooperation.
01:38:32.760 Cooperation.
01:38:33.480 Which, of course, is going to have to be managed by a system of stakeholders.
01:38:36.400 Well, the Chinese are wandering down that road real fast.
01:38:38.560 Yeah.
01:38:38.840 700 million CCTVs, right?
01:38:41.660 And gait recognition.
01:38:43.260 Yeah.
01:38:43.400 Oh, such fun, right?
01:38:44.580 Smile for the government.
01:38:45.620 I saw a video, it showed a guy scanning his face to go through the gate, and it
01:38:50.040 didn't let him through, it didn't open, and it said in Chinese, I don't read Chinese,
01:38:53.360 but it was translated, so I assume it's true, smile for the government.
01:38:56.720 And he smiled, and it opened.
01:38:58.420 Could you imagine?
01:38:59.780 Yes.
01:39:00.400 Well, of course.
01:39:00.980 Yes, I've been in airports.
01:39:02.120 I can imagine.
01:39:02.760 Yeah, well, yes, of course.
01:39:03.880 I hate airports.
01:39:04.340 In my pocket now is my passport from the last time I went to China.
01:39:07.320 I keep a little slip of paper they gave me after they took all of my fingerprints, my handprints,
01:39:10.560 and scanned my eyes just to go through immigration.
01:39:12.720 The last time I went to Beijing in 2019.
01:39:15.840 So I carry it, it's completely faded, you know, it's a heat transfer.
01:39:18.420 Soon the drone is coming your way, buddy.
01:39:20.700 Exactly.
01:39:21.200 Yeah, the gait recognition drone, nice poison dart for you.
01:39:24.720 Yeah, exactly.
01:39:25.560 This is very scary, but that's it.
01:39:28.000 There's your tyrant, though.
01:39:29.620 We're building the good Skynet.
01:39:31.440 It's like, really?
01:39:33.060 Yeah.
01:39:33.280 We're building the good Skynet, are you?
01:39:34.780 Yeah, yeah.
01:39:35.700 Wow, it's something.
01:39:36.340 And so the ESG openly is there to serve.
01:39:39.900 BlackRock and Larry Fink, by the way.
01:39:40.820 BlackRock and the United Nations.
01:39:42.520 Yeah, Evil Central.
01:39:43.840 Because they're set up to establish the reign of the 17 Sustainable Development Goals of
01:39:48.600 what they call Agenda 2030.
01:39:49.740 I helped write those goddamn things, you know?
01:39:51.840 Oh, no.
01:39:52.300 And I gotta tell you, too, we worked on that document back in 2012.
01:39:56.880 Do you know why there are 17 of them?
01:39:58.820 I'm relentlessly curious about that.
01:40:00.700 No, no, I don't know.
01:40:02.000 When we wrote, when we helped write the document, there were like 170, and one of the criticisms
01:40:06.540 that I kept levying is like, hey, guys, guess what?
01:40:09.780 You can't have 170 priorities.
01:40:12.280 Right.
01:40:12.500 You can have one priority, because that's what makes it a priority.
01:40:15.360 That's it.
01:40:15.720 And I tried to find out why there were 170, and the answer was, well, there's 170 different
01:40:20.640 constituents to please, and we don't want to offend anyone.
01:40:23.320 It's like, oh, you mean you don't want to do anything?
01:40:25.680 It's like, well, yeah, that is, you know, wink, wink, nudge, nudge, that's what we mean, but
01:40:29.020 we don't usually say that.
01:40:30.060 Sure, of course.
01:40:30.680 Right, and that, I would say in my defense, such as it is, that if you think the document
01:40:36.680 that was produced in 2012 was bad, you should have seen what it was like before it got edited.
01:40:41.580 Wow.
01:40:42.260 Right.
01:40:42.900 Wow.
01:40:43.460 Yeah, right, right.
01:40:45.000 Well, one of the things I did realize after going through that process was that, apart
01:40:49.340 from the fact that it was preposterous to have 170 goals, no one had rank-ordered
01:40:54.500 the goals in any halfway intelligent way.
01:40:57.140 There was no cost-benefit analysis, and then Bjorn Lomborg's team started to do that.
01:41:01.380 Right, right.
01:41:01.980 Bjorn wrote a book a while back called How to Spend $75 Billion to Make the World a Better
01:41:06.300 Place, which is an intelligent approach to, I wouldn't say the polycrisis, because that
01:41:11.380 isn't how he frames it.
01:41:12.380 But, you know, if we were actually going to try to help lift the remaining
01:41:17.300 people in abject poverty out of poverty, Lomborg has shown that there are ways that
01:41:21.560 are far less expensive than the trillions of dollars that we will waste not fixing the
01:41:27.820 climate, just like they've not fixed the climate in Germany.
01:41:30.720 Right.
01:41:30.940 Right.
01:41:31.100 Germany, what a catastrophe, eh?
01:41:32.700 What, yeah.
01:41:33.120 The bloody energy costs are now five times as high.
01:41:36.060 Five times as high.
01:41:37.900 Unreliable, dependent on Putin and other dictators around the world, right?
01:41:41.480 And they're burning lignite, so per unit of energy, they actually produce more pollution
01:41:45.420 than they did before they started the Green Revolution.
01:41:48.000 Right.
01:41:48.220 And their response to that criticism is, we have to do stupid things.
01:41:53.000 We have to do stupid virtue-signaling, destructive things faster.
01:41:57.080 Yeah, exactly.
01:41:57.600 Right.
01:41:57.840 Like, God, stunning. Net zero.
01:42:01.140 Net zero.
01:42:01.660 Zero for you.
01:42:02.680 But have you seen Absolute Zero as well, the one that goes beyond net zero?
01:42:06.460 This is a project, a think tank project that came out in 2019 from UK Fires, F-I-R-E-S,
01:42:13.340 which is a conglomeration of the British government and Cambridge and Oxford and University College
01:42:19.160 London or whatever that's called.
01:42:20.820 And all of this lays out the idea that—and, of course, this is just the Overton window
01:42:27.560 stretching.
01:42:28.320 We should take it seriously, but it's probably not what will happen.
01:42:31.540 Yeah.
01:42:31.940 Right?
01:42:32.300 But they argue that net zero is not nearly enough.
01:42:34.900 We must have absolute zero.
01:42:36.380 An absolute zero—
01:42:37.040 Oh, they're saying what they mean.
01:42:38.340 Absolute zero emissions by 2050.
01:42:40.200 That's where everyone freezes.
01:42:40.620 An absolute zero.
01:42:41.760 Well, they openly in the documents say people should start buying warmer clothing now.
01:42:45.680 Especially old people, you know, but, you know, they're cluttering up the emergency
01:42:48.760 rooms anyways.
01:42:49.440 That's right. No air travel at all.
01:42:50.580 I know. I know.
01:42:51.420 You know, France banned short-haul flights, eh?
01:42:53.800 Yeah.
01:42:54.220 Two weeks ago between any two cities that were connected by rail.
01:43:00.100 I know, the Absolute Zero plan is, well, this is what I said.
01:43:00.100 Net zero means all you peasants who are watching and listening, you should pay attention to this.
01:43:04.460 Net zero means zero for you.
01:43:06.340 That's what it means.
01:43:07.220 And by plan, you little people, you don't need cars.
01:43:10.700 God, who needs a private automobile?
01:43:12.540 I knew 15 years ago that the bloody totalitarians would go after the cars.
01:43:16.280 Sure.
01:43:16.480 Because nothing screams freedom like a 350-horsepower Mustang in the hands of a 16-year-old boy.
01:43:23.260 It's like, no, we've got to clamp down on that.
01:43:25.400 That's right.
01:43:25.760 He can go wherever he wants and do whatever he wants, you know, cluttering up the planet
01:43:29.800 and producing carbon.
01:43:31.180 Yeah.
01:43:31.340 So it's like no cars, right?
01:43:32.880 The goal is 90% reduction in private automobile ownership.
01:43:37.320 You think, well, you get to have an electric car.
01:43:39.140 It's like, no, the grid can't sustain electric charging.
01:43:41.660 Well, how will we deal with that?
01:43:43.060 How about you peasants don't get to have cars?
01:43:45.460 That's right.
01:43:45.940 Right, right.
01:43:46.340 No cars, no flights, no meat, no heat, no air conditioning.
01:43:49.360 No container shipping.
01:43:50.600 Oh, yeah.
01:43:51.280 Well, who needs goods?
01:43:53.720 So we go back to our neo-Marxist, Herbert Marcuse, writing in '64 in One-Dimensional Man.
01:43:58.380 And what is he saying?
01:43:59.300 What's his argument?
01:43:59.980 He says, well, the problem with socialism is it can't produce.
01:44:02.920 The problem with capitalism is it's not sustainable.
01:44:05.540 It overproduces and ends up destroying itself.
01:44:08.000 So we need unsustainable non-production.
01:44:10.160 Correct.
01:44:10.440 Right, right, right.
01:44:10.960 That's the socialist solution.
01:44:12.600 And so what?
01:44:13.040 How about we have the best?
01:44:13.740 How about we have the worst of both worlds?
01:44:15.540 Yeah.
01:44:15.820 And so he says, what do we have to do?
01:44:17.480 We have to start getting used to less.
01:44:18.900 Lower standards of living.
01:44:19.960 Fewer gadgets.
01:44:20.620 We.
01:44:20.960 We.
01:44:21.360 We.
01:44:21.800 We.
01:44:21.920 Right, right, right, right.
01:44:23.600 Not him.
01:44:23.740 And all the people who will be getting us used to lower standards of living will be flying
01:44:27.460 around in their private jets deciding how to do that, eating steak.
01:44:30.760 That's right.
01:44:31.120 Mm-hmm.
01:44:31.780 Of course.
01:44:32.720 Mm-hmm.
01:44:33.000 Of course.
01:44:33.460 Oh, yeah.
01:44:34.040 Cute.
01:44:34.220 So you are a conspiracy theorist, fundamentally.
01:44:36.900 Well, I mean, I fundamentally reject the word theorist.
01:44:40.420 Uh-huh.
01:44:41.020 I don't think it's a theory.
01:44:42.300 Uh-huh.
01:44:42.780 I think that they're open about their collaboration.
01:44:47.100 And so.
01:44:47.940 Yeah, well, that's so interesting, eh?
01:44:49.540 Because we have Antifa, right?
01:44:50.940 And at the same time, we have actual fascism.
01:44:54.180 And everything that Antifa attacks has nothing to do with the actual fascism.
01:44:58.020 That's right.
01:44:58.400 Yeah, so there's another great cosmic joke for you.
01:45:01.660 Yeah, the anti-fascists are supporting a gigantic conglomeration of governments and large
01:45:06.140 banks and industry.
01:45:07.020 Yeah, yeah, yeah.
01:45:07.380 Is it all working together?
01:45:08.920 Pharmaceutical companies and legacy media.
01:45:10.700 Isn't that fantastic?
01:45:11.640 Yeah, it's really, it's really, it's really great.
01:45:13.280 The cosmic jokes are piling up.
01:45:15.480 Yeah, well, I've been posting pictures of evil clowns lately, and people are wondering
01:45:19.900 what the hell I'm up to.
01:45:20.940 You know, which form of insanity has now gripped me?
01:45:23.840 It's like, I realized that Satan is an evil clown, right?
01:45:27.440 I started to understand that, you know, when I encountered the sign that was over Auschwitz,
01:45:33.440 because the sign that was over Auschwitz was Arbeit macht frei, which means work will make
01:45:37.760 you free.
01:45:38.340 Right.
01:45:38.640 It's like, I thought, that's a joke.
01:45:40.500 Then I thought, who would tell a joke in Auschwitz?
01:45:44.580 Himmler.
01:45:44.960 Right.
01:45:45.540 Well, yeah, but who's the spirit behind Himmler, right?
01:45:49.340 Who's the great cosmic joker?
01:45:51.200 And then also Mutual Assured Destruction, right?
01:45:55.420 The acronym for years was MAD.
01:45:57.300 I thought, oh, that's a joke too.
01:45:59.520 Right.
01:45:59.800 It's all these jokes, you know?
01:46:01.060 And then I watched The Death of Stalin.
01:46:02.980 Have you seen that?
01:46:03.940 No.
01:46:04.340 Oh, it's great.
01:46:05.140 It's a movie about-
01:46:06.000 I've heard, yeah.
01:46:06.560 Oh, it's so great, because it portrays the brutal reality of the Soviet Union.
01:46:12.400 There are terrible, murderous, raping, catastrophic things going on in the background of the movie
01:46:18.420 nonstop.
01:46:19.120 And then there's these five jokers, one of whom is Stalin and the rest of his evil crew.
01:46:23.200 And they are like, they're bumbling parodies, right?
01:46:26.720 Everything's a parody.
01:46:27.580 And I realized, after really thinking about that and thinking about that motif of the joker
01:46:32.140 and the clown, which has become so prevalent in modern culture, I thought, I see, see,
01:46:37.480 when things become totalitarian, they turn into a parody, right?
01:46:41.500 Like Dylan Mulvaney's a parody.
01:46:43.400 And what's happening to women's sports is a parody.
01:46:46.500 And what North Face is doing with their advertising is a parody.
01:46:49.640 And it's like, oh, yes, that's right.
01:46:51.520 Satan is an evil clown.
01:46:53.240 Yeah.
01:46:53.640 Well, what did Marcuse write in '69?
01:46:55.740 '69, so he wrote in An Essay on Liberation that it's crucial that the resistance, meaning
01:47:01.140 them, the radicals, take on the form.
01:47:04.320 He said the clownish forms that so irritate the establishment.
01:47:07.680 It must become an antinomian revolution.
01:47:10.480 Right.
01:47:11.020 Everything upside down.
01:47:11.860 Everything upside down.
01:47:12.740 Everything upside down.
01:47:13.160 Everything becomes—and then so Judith Butler talks about the politics of parody.
01:47:16.460 You get this kind of despair.
01:47:17.580 Well, I didn't know any of that had been actually laid out in a strategy.
01:47:21.360 Yeah.
01:47:21.560 Politics of parody, the clownish forms that irritate the establishment.
01:47:26.200 I remember reading this because, of course, clown world is the meme on the internet.
01:47:29.540 They call it—
01:47:29.940 Yeah, yeah, yeah.
01:47:30.820 And I'm reading Marcuse, and I stumble on this, take on the clownish forms.
01:47:34.380 It's like, oh, my God, clown world was a plan.
01:47:36.480 Right.
01:47:36.640 The only thing about the evil clowns is they're not funny.
01:47:39.940 Right.
01:47:40.300 That's the—
01:47:40.840 They're not funny.
01:47:41.680 No, they're not funny.
01:47:42.480 They're not funny.
01:47:43.140 That's right.
01:47:43.160 They're not funny at all.
01:47:44.300 It's not funny at all.
01:47:46.220 Right.
01:47:46.520 And so—and that's quite interesting, too, because one of the things you see about the
01:47:50.460 totalitarian left is they really hate comedians.
01:47:52.900 They hate—
01:47:53.240 Right.
01:47:53.420 So they love parody, but they hate comedy.
01:47:55.980 That's right.
01:47:56.420 Right.
01:47:56.780 Right.
01:47:57.120 And it's got to be—it's got to be parody of the darkest form.
01:48:00.180 Yeah.
01:48:00.400 It's always dark or—
01:48:02.660 Destructive.
01:48:02.800 Almost intentionally stupid or destructive.
01:48:05.580 Yeah.
01:48:05.980 Yeah.
01:48:06.160 Every time.
01:48:07.240 Every time.
01:48:08.560 Grotesque.
01:48:09.120 Frequently grotesque.
01:48:09.900 Yeah, yeah, frequently.
01:48:11.180 Monstrous.
01:48:11.780 Yeah.
01:48:12.080 Yeah, yeah, yeah.
01:48:13.220 Yeah.
01:48:14.040 And it's never funny.
01:48:15.520 Okay, so how do you know you're not just crazy?
01:48:19.760 I ask myself that a lot, too, and I don't know that I have a really great answer for that.
01:48:23.780 Is your wife sane?
01:48:25.180 She seems to be pretty sane.
01:48:26.340 She's very grounded.
01:48:27.380 Okay, well, that's helpful, and she still likes you.
01:48:29.340 She very much likes me, and she's very convinced that I'm not crazy.
01:48:33.740 Okay.
01:48:34.180 And she has a pretty good reason.
01:48:35.120 So then you could have a folie à deux, right?
01:48:36.640 You could both be crazy.
01:48:37.620 Do you have friends?
01:48:39.240 It's a few, yeah.
01:48:41.840 Are they sane?
01:48:43.400 Some of them, some of them not, maybe.
01:48:44.960 I don't know.
01:48:46.180 It's a weird world we occupy, but yeah, most of them are.
01:48:49.700 Yeah, sure it is.
01:48:49.720 Most of them are.
01:48:51.160 Well, that's one of the ways you can check, right?
01:48:53.340 Is you want to have people around you, especially if you're playing in this abysmal realm, let's
01:48:59.460 say, you bloody well want to have people around you who will give you the straight story.
01:49:03.460 Give you the straight sense, right.
01:49:04.080 You'll get demented and bent out of shape.
01:49:06.320 You know, what did Nietzsche say about the abyss, right?
01:49:09.240 That's right.
01:49:09.400 You stare into it, and it stares back.
01:49:11.600 That's right.
01:49:12.100 That's right.
01:49:12.360 Right, right.
01:49:13.100 And if you fight with monstrous forms, you have to be very careful that you don't become
01:49:16.940 monstrous yourself, right?
01:49:18.840 And I don't think you can do that yourself.
01:49:20.340 I think you have to have people around who are going, tapping you back into shape.
01:49:24.300 Yeah, hey, hey, hey.
01:49:24.820 Yeah, yeah, yeah, yeah.
01:49:26.300 Keep the humility up and the gratitude.
01:49:28.680 Well, humility is absolutely central.
01:49:30.840 This has become kind of my main issue.
01:49:34.440 I speak often with, you know, Christian audiences, but also political audiences, about just what unites
01:49:39.620 this kind of broadly secular resistance to the woke with the very religious: the
01:49:48.240 truly religious are humble before God, and the rest of us are humble before God.
01:49:51.080 Right, and the truly religious are prideful, right?
01:49:53.320 True, that's right.
01:49:53.720 Right, you know the Canadian government announced Pride season.
01:49:56.980 Yeah.
01:49:57.580 Yeah, because, you know, the Pride parade isn't enough, Pride day isn't enough.
01:50:01.360 Yeah.
01:50:01.800 Pride week, well, that's not enough.
01:50:03.360 Pride month, well, that's not enough either.
01:50:04.780 Not enough either.
01:50:05.620 Pride season.
01:50:06.400 Pride all year.
01:50:07.480 Yeah, yeah, yeah.
01:50:08.540 It's so interesting to see the worship of pride.
01:50:11.960 It's like, well, that isn't what we mean.
01:50:13.320 It's like, yeah, I think that's what you mean.
01:50:15.380 It looks like what you mean.
01:50:16.140 Since that's what you say, or at least that's what the words you say mean.
01:50:20.840 God only knows what you mean.
01:50:22.540 You might not mean anything, but the words you say mean something, just like equity means
01:50:27.500 something.
01:50:28.400 And, hey, we're going over to the Daily Wire Plus side now, so if you're interested in
01:50:32.180 that, head over there, give some consideration to supporting them if you like what they're
01:50:36.840 doing, and, yeah, well, thanks again, James.
01:50:40.620 Very good talking with you.
01:50:42.800 Hello, everyone.
01:50:43.660 I would encourage you to continue listening to my conversation with my guest on dailywireplus.com.