The Auron MacIntyre Show - June 13, 2025


Discussing the Near Future with Nick Land | Guest: Nick Land | 6⧸13⧸25


Episode Stats

Length

1 hour and 36 minutes

Words per Minute

135.6

Word Count

13,141

Sentence Count

592



Summary

Nick Land is a writer and philosopher closely associated with neoreaction (NRX) and accelerationism. In this episode, Nick discusses how NRX emerged on public blogs out of despair at Western political systems, the patchwork neocameralism model, managerialism as a containment of capital, China's long resistance to modernity, and how that body of theory relates to the Trump era.


Transcript

00:00:00.000 Hey, everybody. How's it going? Thanks for joining me this afternoon. I've got a great stream with a great guest that I think you're really going to enjoy. Nick Land is somebody who has done a lot of amazing work that I've been really fascinated with, and I'm happy to finally have him on the show to talk about it. So, Nick, thank you so much for coming on.
00:00:18.500 Thank you. I'm looking forward to it.
00:00:20.660 Absolutely. And I should let people know we are recording this early because of the time difference. So if you have a question, unfortunately, we won't be able to do any live questions today. Sorry about that.
00:00:31.900 But, Nick, I just wanted to start out by saying NRX was obviously a political event, a collection of political theory that was occurring mainly on message boards in the dark corners of the Internet.
00:00:47.560 Not many people who weren't extremely online had any idea that this was going on.
00:00:53.620 Now, J.D. Vance knows who Curtis Yarvin is, has read Curtis Yarvin.
00:00:58.560 Many people at the White House are now very familiar with a lot of this work, and that means they probably are also familiar with what you're talking about, and it's probably had an influence on what is now going on even in many parts of the U.S. government.
00:01:10.700 Is that ever strange to you, that this little kind of niche political theory on the Internet has now exploded into the mainstream?
00:01:18.620 There's a lot of strangeness around it, for sure. I mean, maybe it's worth just stepping back a little bit and saying that I don't think NRX was, you know, hidden. OPSEC was not massively high, in my experience.
00:01:44.160 I might have been, like, left out of those conversations happening on Signal and super-secure channels or whatever, but from my perspective, it was happening on blogs in the public domain, you know, easily accessible. For sure niche, but definitely public.
00:02:11.400 And so even though there was definitely an atmosphere of extreme cultural oppression, you know, I think that the sort of mindset was almost a sort of samizdat kind of mentality.
00:02:30.100 But in fact, you know, I don't think our blogs were often taken down. Like, I didn't really have that problem.
00:02:39.920 Obviously, the Moldbug blog, which was the kind of foundation of the whole thing, was just up.
00:02:47.380 So it's, so I totally agree.
00:02:53.200 It's gone from niche to being much more mainstream, but I don't think that's because it was really hidden.
00:03:00.080 It was because people maybe weren't paying attention to that.
00:03:04.340 Yeah, I should say, I didn't mean it was secret, I meant that it was just so underground, it didn't get a lot of mainstream attention.
00:03:11.760 But one of the things that I loved about it, one of the things that captivated me, and I think still does, was that it did feel like something important was happening in real time.
00:03:20.660 You know, there was, there was this public conversation about something important that just didn't take place very often.
00:03:27.520 And we're usually arguing minutiae of some bill somewhere, as if that's really going to matter.
00:03:32.640 And so the fact that this was kind of a public project that anyone could jump into, and in many cases did, a lot of people who otherwise had no experience in political theory, no real standing in, you know, the intelligentsia, suddenly became people who were elevated to the level of serious thinkers, simply because of their participation in that realm.
00:03:52.280 Yes, I do think it was important in that respect, like, it obviously then spilled onto Twitter under the Jack Dorsey, massively suppressive Twitter, but still, it was always going up to the wire on that.
00:04:13.100 And lots of, and lots of kind of blog activity, lots of commentary.
00:04:20.780 So I think it opened up a kind of function of the internet that maybe we hadn't really seen before, or at least the notion of the internet as a route around established cultural centers of authority.
00:04:43.100 I think it was really put into practice by neoreaction in a, in a very strong and I think probably influential way.
00:04:55.980 Yeah, it really was, you know, we've seen the internet become this thing that allowed you to kind of route around the cathedral, to, you know, create an authority outside of it.
00:05:07.240 And we've seen that on many different levels culturally and all these others, but I don't think we had seen it academically in this manner.
00:05:13.620 And so I think that was kind of the most interesting thing about it: there was suddenly a place, a nexus, for people who maybe would not otherwise have made it through the university system, would not have been elevated inside of it, would not have been taken seriously by major publishers, those kinds of things.
00:05:29.860 But now there was a place for them to actually develop these theories and spread them in a meaningful way.
00:05:34.520 And I, like I said, I just always thought that was a very interesting thing about what was going on with the NRX movement.
00:05:40.260 Yes.
00:05:40.600 I think everyone involved definitely had a sense of that.
00:05:45.620 Oh, sorry.
00:05:46.300 Good.
00:05:47.340 Well, I was just going to say, you know, the other side of this, like how, so how does this relate to where we are now?
00:05:53.520 And I think there's quite a lot of complexity, or amusement, in how everyone is going to look at it, because the basic NRX paradigm for government was born out of, I think, total despair that started probably, I get the sense, maybe, you know,
00:06:20.180 the absolute horror of the Bush and Obama administrations, people just really thought everything was just on this accelerating downward slope.
00:06:33.820 There was, they'd really given up on anything good coming out of these Western political systems.
00:06:40.120 And that was, I mean, there was something sort of liberating about that because people just got out of a certain kind of level of political engagement into this much more pure level of speculative intellectual activity about it.
00:07:00.460 But the basic paradigm for government was what, coming out of the Moldbug blog, was called patchwork neocameralism.
00:07:13.920 And, you know, it's an interesting term and it has these two parts to it.
00:07:19.480 And the neocameralism side, it was a typical sort of Moldbug, and I'll say Moldbug when we're talking about this phase rather than Yarvin, which we can move on to later.
00:07:32.220 And yes, so it's a typical Moldbug lexical innovation in the sense that it's reaching back to kind of arcane governance structures of Prussia in the 18th century.
00:07:52.180 And maybe that neocameralism side sort of transitions into what we're seeing Yarvin doing now and what the media thinks he's all about.
00:08:08.520 And maybe he thinks he was always all about.
00:08:11.420 But the patchwork side, and the two things were just the same governance model seen in different aspects, which is that large nation states have failed.
00:08:24.680 They're going to start breaking up.
00:08:26.740 There's a set of technological innovations coming down the pipeline that's going to allow deterrence and micro governance and much smaller units of geopolitical organizations to protect themselves and to flourish.
00:08:45.720 I mean, that was what we were all talking about.
00:08:50.620 And I don't see that we've got any closer to that in any obvious way by recent developments.
00:08:58.580 I mean, it seems to me it's more that those things have been put in the freezer, temporarily at least.
00:09:05.940 I mean, I don't see anyone much talking about them.
00:09:09.880 I certainly don't think.
00:09:11.740 I mean, the whole Trump administration geopolitical stance is about increasing territorial aggregation.
00:09:21.200 I mean, I don't know what level of seriousness we're talking about, but certainly, you know, Greenland, Canada, reopening the great northern frontier, all of that kind of, you know, rhetoric, however one takes it, is the absolute opposite, really, of a kind of neoreactionary mode.
00:09:41.420 So I think there's something a little bit ironic about the fact that lots of people and lots of mainstream media outlets are trying to grab onto the sort of NRX legacy and say, hey, this is somehow explaining where we are now or what's going on.
00:10:04.780 I mean, it seems to me, at least in that respect, a huge stretch to do that.
00:10:11.420 And, well, I mean, yeah, maybe I should sort of just break things up into manageable chunks and not segue immediately into a follow-on.
00:10:23.340 No, I'm glad that you brought that up because I'm actually going, that was my plan was to dive deeper into some of that.
00:10:30.020 And I've actually talked to Curtis about the fact that he's dialed down the patchwork stuff as well.
00:10:36.340 And, you know, his response to me is, oh, don't worry, that's coming.
00:10:39.340 So I don't know if he's just making that less of an emphasis.
00:10:43.520 He's, like, focusing on the form of governance first and then scaling it down.
00:10:48.920 I'm not sure.
00:10:50.080 Or, you know, maybe it's just Curtis talking.
00:10:52.160 I'm not sure.
00:10:52.640 But I've mentioned the same thing to him.
00:10:54.820 So, actually, let's kind of start at the beginning because I want to get to exactly what you're talking about here.
00:10:59.900 But first I wanted to pick your brain about the intersection of your work and kind of the managerial revolution.
00:11:06.800 I remember asking you on Twitter, trying to get a grasp on kind of how these things interact.
00:11:12.600 And you said that, in a way, the managerial revolution is a human security system trying to contain capital as it grows and tries to separate itself from human control.
00:11:25.760 And I just wonder if you could expound on that because I think a lot of people look at the managerial revolution and they say, oh, no, this is people becoming less human.
00:11:33.700 This is, you know, is this really just people, you know, governments trying to make their people less human in a sense but in a way that will allow them to get a wider control on what's happening with capital?
00:11:45.440 What is your take with that interaction?
00:11:47.260 Well, I think, you know, maybe I'm sort of panning out a bit wide for your question and we'll have to sort of close in on it a little bit.
00:12:02.440 But, you know, within the framework of the broad neoreactionary conversation that we were just talking about, I think that one of the stereotypes about that that is right is that it was a kind of, it was coming out of a disillusioned liberalism.
00:12:26.640 I was going to say, I was going to say libertarianism, but actually more widely liberalism.
00:12:31.420 I mean, it's like, it's definitely an event, I think, in the liberal tradition.
00:12:37.160 It was a kind of a liberalism of profound despair.
00:12:41.820 And I think a lot of the energy that fed into it was very similar to what you see in the kind of paleo-libertarian writers of, like, the 30s and 40s, you know, in terms of a kind of absolute horror at the managerialization of American society.
00:13:11.820 It was already, you know, at its most problematic, perhaps, at the level of government, but obviously, you know, there was this very disturbing amalgamation happening between government bureaucracies and businesses.
00:13:31.160 And, you know, we've only seen that kind of continually escalate and the switching of personnel and of kind of management philosophies and of wider philosophies.
00:13:44.500 Because a lot of the problem that people have, obviously, with the whole woke phenomenon was that it was exactly that.
00:13:53.660 It was traffic backwards and forwards between kind of corporate and government bureaucracies as if there were no serious border between them.
00:14:09.140 Yes, I saw that glitching out.
00:14:15.440 I don't know what we lost.
00:14:17.820 Not much.
00:14:18.580 I think you're just talking about the personnel moving between different aspects of the regime and how that was part of the woke phenomenon.
00:14:24.920 Yeah, so I was just really trying to say that this anti-managerialism, you know, has legs or whatever.
00:14:35.880 It kind of dates from the point at which the liberal tradition in anglophone societies becomes increasingly desperate.
00:14:46.880 And, you know, its analysis of what's going wrong is the fact that there is this kind of government corporate fusional managerial culture that's just completely taken over.
00:15:00.360 And so I don't think there's anything very new about that.
00:15:08.960 And it, I mean, I think we're probably going to be doing some looping around, because it connects in a complex way with Moldbug and later Yarvin's writings.
00:15:22.340 Because the absolutely horrific figure is obviously FDR.
00:15:30.140 You know, lots of people take it back to Lincoln or whatever, and Wilson is not popular.
00:15:36.520 But if there's a single figure who represents the kind of collapse of a particular notion of America as a kind of individualist, liberal, entrepreneurial, capitalist society,
00:15:51.900 and into this new managerial model, that by the time it really takes off in the 1960s, it produces all the things now that are creating our cultural wars and our underlying political philosophy disputes.
00:16:10.840 So, yeah, that's, that's the FDR administration for sure.
00:16:17.020 And that moment, that moment is something that even Moldbug, and much more emphatically Yarvin, seems to celebrate in a way that's very paradoxical.
00:16:32.840 I mean, I think if you're really on inside baseball in this, that is really strange, actually.
00:16:41.500 And, I mean, you can say, oh, he wants, he wants maybe Donald Trump to be, to behave like FDR.
00:16:50.200 I mean, I think he's kind of upfront about that.
00:16:53.540 But it's, yes, I mean, on this managerialism question.
00:16:59.520 So, actually, sorry, let me let you get a word in edgeways and, well, recognize I've totally evaded your actual question, which I will now get back to after that, yeah.
00:17:13.940 Oh, no problem.
00:17:14.680 You're just, you're intent on answering the question about Yarvin's abandonment of patchwork, which I absolutely want to get into.
00:17:23.120 But, yeah, the main thing.
00:17:24.200 Well, it's definitely related, definitely related.
00:17:27.200 Absolutely.
00:17:28.580 Yeah, so your question is about what's the relationship with managerialism and humanism?
00:17:34.120 Yes.
00:17:35.540 Is that right?
00:17:37.400 Yeah.
00:17:37.820 Yeah.
00:17:37.940 Basically, why are we seeing this reaction?
00:17:41.600 Is it an attempt to widen the human ability to contain capital?
00:17:46.240 Is that, is, do you think that's the, the main thrust of why this revolution has taken place and, and become so dominant?
00:17:53.500 Or what is that relationship, do you think?
00:17:55.960 Um, well, my deep sense of this is that, um, and in fact, this is in a way continuous, or substantially continuous with, with the work of these crazy French philosophers, Deleuze and Guattari.
00:18:20.960 Um, that human society has a, an instinctive terror and aversion to capitalism.
00:18:34.120 Um, you know, even in the, even before capitalism exists, well, capitalism is just this virtual threat.
00:18:44.120 Um, they draw on the work of this French, um, anthropologist and ethnographer called Pierre Clastres, who has what I guess people would think is a strange, but I find a very appealing and interesting, theory: that you can actually model all human societies as being forms of, uh, security, preemptive security.
00:19:12.120 Against the emergence of capitalism, you know, and I think this is really strong in, uh, China, you know, if you're studying the history of China, like the, the great phenomenon here is that.
00:19:27.120 As Marx and many other people have said, I think maybe Francis Bacon might even have said it first, China invented all of, you know, Marx's, or is it Bacon's, four great inventions of modernity.
00:19:42.120 You know, the magnetic compass, printing, paper, and gunpowder are all Chinese inventions.
00:19:49.120 Marx says, look, equipped with that, capitalism is able to blow up the world.
00:19:58.120 You know, it's able to completely, uh, revolutionize traditional society, set us on this kind of crazy exponential curve into the unknown that we've been on for 500 years.
00:20:11.120 Um, and yet that didn't happen in China, you know. It had gunpowder, but in battle it was used for basically fireworks, you know, as a sort of psychological thing to sort of try and discombobulate the enemy.
00:20:28.120 Um, paper, sure, and it used printing.
00:20:33.120 I think that the first sort of very large scale printing was done by the Buddhists in China and was used to sort of disseminate Buddhist literature.
00:20:46.120 And indeed Buddhism became a Chinese religion after having started off as an Indian religion, I guess, because of that.
00:20:53.120 Um, but it certainly didn't do any of the stuff that we would recognize from the Protestant Reformation in Europe, of using printing as this way of just putting a kind of lever into the traditional, the ancien régime, and starting to pull on this bar.
00:21:12.120 Um, and the same with oceanic navigation.
00:21:17.120 Um, you know, the Chinese are very proud of Zheng He, who's not of course ethnically Chinese and his expeditions, which were amazing on ships that were so much more impressive than anything the West came up with maybe till like the 19th century.
00:21:33.120 Um, but those expeditions were weirdly Chinese.
00:21:38.120 Like they were about trying to hunt down some pretender to the Imperial lineage.
00:21:43.120 Uh, they weren't colonial.
00:21:45.120 The Chinese were not setting up, you know, they weren't exporting their population or setting up camps or anything like this around the world.
00:21:54.120 You know, some sort of tributary relations maybe were being involved.
00:21:58.120 And then in the Ming dynasty, um, in the early 17th century, oceanic navigation was criminalized to the extent that sailing beyond sight of the shore was a capital offense.
00:22:20.120 A capital offense. And even villages and settlements on the Chinese coast that were, like, on the sea,
00:22:28.120 I think within some distance, 10 miles or something, were dismantled.
00:22:33.120 And the population was not allowed to occupy these coastal territories.
00:22:41.120 Um, so you have to say, well, what's going, what's going on there?
00:22:47.120 Like, um, it seems to be showing this incredible sort of innate resistance.
00:22:55.120 That's, that's deeply baked into these Chinese social and political structures that are able to totally withstand the threat of these, of these innovations coming together and producing this takeoff of the kind that we see in the European context.
00:23:14.120 Um, so yes, this is where, it might be hard to see how this is supposed to connect to the managerialism question.
00:23:27.120 You know, this is why I think managerialism in the West can be understood as a kind of artificial China.
00:23:36.120 I mean, it's a, it's the way that our society is, um, also try to get some, uh, grip on these explosive territories.
00:23:49.120 Um, sorry, not territories, uh, trajectories.
00:23:53.120 Um, and I think this is most clearly seen around AI, which is the most intense zone of all of this.
00:24:04.120 It's the thing that capitalism has been heading to right from the start.
00:24:09.120 Um, and managerialism, I think, in that field can clearly be seen as an attempt to put some kind of lid on this
00:24:23.120 extremely dangerous-seeming,
00:24:28.120 explosive phenomenon.
00:24:30.120 Now China, of course.
00:24:31.120 So yes, when the human, I mean.
00:24:34.120 Sorry, go ahead.
00:24:39.120 I know we have some lag there.
00:24:40.120 I thought you.
00:24:41.120 Yeah, we have some lags.
00:24:42.120 Say again, sorry.
00:24:44.120 Well, I was just gonna, I was just gonna ask, uh, China, uh, obviously did eventually have a communist revolution and at least is still in theory communist today.
00:24:54.120 What do you, you know, for a culture that was so resistant to modernity up to that point, uh, how does that factor into your explanation?
00:25:04.120 Well, I think, you know, the way I would run this forward is that, against massive resistance,
00:25:13.120 China did eventually start, uh, with a lot of, I mean, there's a lot of things feeding into this foreign influence being huge.
00:25:25.120 And also the Chinese perception of the threat of foreign power was the same thing that had made the Japanese earlier jump forward into the Meiji Restoration.
00:25:36.120 Um, in the late 19th, early 20th century, you get capitalistic phenomena beginning to definitely manifest themselves in China.
00:25:51.120 Um, I would see, sorry, I would see the communist revolution as a kind of, uh, in some way anti-modernist reaction.
00:26:09.120 I mean, I think it's like, it certainly put the lid on these things until we get, sorry, until we get Deng Xiaoping.
00:26:24.120 You know, China puts itself back in the deep freeze as far as modernization is concerned.
00:26:36.120 Um, I mean, um, you know, its goals are all missed, total social calamities on a scale that is even hard to understand.
00:26:48.120 Um, incredible technological backwardness with a few, like, niche zones of advance.
00:26:58.120 I mean, no one was very scared of China, you know, at that time.
00:27:08.120 Um, but there's, it's not, if you compare perceptions, Western perceptions of the Soviet Union and of China, it's like the Soviet Union was obviously the competitive modernizing society.
00:27:22.120 And China didn't look like that at all.
00:27:25.120 Um, and now, obviously, things have turned around hugely, because we have now had some kind of extreme capitalistic social development in China starting in the early 1980s.
00:27:47.120 Um, so yeah, that's how I would cut the history.
00:27:53.120 Okay.
00:27:54.120 Uh, Aleksandr Dugin had talked about, uh, the reason that you see communist revolutions in the Third World, where they're not supposed to exist.
00:28:02.120 That's not the way that Marxism is actually supposed to develop.
00:28:05.120 He said, the reason that you see that is basically it's countries that are desperately trying to modernize without Western control.
00:28:12.120 So they're, they're trying to become modern without Western influence and really communism just becomes the excuse to centralize, create a managerial apparatus and accelerate development.
00:28:21.120 It's not so much ideology as it is an attempt to create a wall between you and Western influence as you modernize and become able to compete with them.
00:28:30.120 Do you, do you feel like there's some accuracy there?
00:28:32.120 Um, yes, I think there's a lot of accuracy there.
00:28:35.120 Um, but it's definitely, so there tends not to be any clear line between nationalism and Marxism, you know, in implementing a Marxism-Leninism in those situations.
00:28:48.120 Um, and I can totally see the nationalist imperative is, is huge, but at the same time, at least initially, at least, well, you know, we're talking about the period just post war.
00:29:01.120 And as I say, up to maybe the late seventies everywhere, there really was a belief in these countries among at least these parties, these regimes, that there was a socialist model of development.
00:29:20.120 So it's, I mean, it certainly was in part about keeping out the West, but that was seen as being seamlessly, um, integrated with actually having a domestic socialist economy that would downplay the role of markets, that would have direct state control of almost all industrial activity, uh, massive centralized planning.
00:29:41.120 And, um, that sort of had to play itself out to the point of failure.
00:29:50.120 Um, and I guess the failure was sort of synchronized everywhere, even in the Soviet Union. Like, it was because the Soviet Union wasn't seen to be spectacularly failing until the 1970s
00:30:06.120 that the rest of the world thought there was something here that they could emulate.
00:30:14.120 So let's get back to, uh, the, the question of scale and patchwork, because, uh, as you pointed out, we now have a lot of people who are starting to look at some of NRX and taking it seriously.
00:30:25.120 But Curtis's focus has most assuredly been really on, uh, Donald Trump becoming, you know, the monarch. That's really been, uh, the push that he's had.
00:30:37.220 And even though we see what should be a lot of opportunities, I think, to recognize that, uh, large-scale systems, large bureaucracies, large, uh, continent-spanning or, you know, world-spanning empires, these things are starting to come apart.
00:30:52.020 We're seeing all of the failures.
00:30:53.420 You can't get on an airline, uh, supply chains, all these things that are stretched so thin, uh, due to the attempt to scale across the globe.
00:31:01.500 Uh, these things are starting to become a problem.
00:31:03.700 You're seeing a little bit with Donald Trump talking about reshoring, uh, you know, recognizing that actually you can't be entirely dependent on these global trade networks and these kinds of things.
00:31:12.580 Uh, but ultimately no one is talking about patchwork.
00:31:15.800 No one is talking about scaling things down.
00:31:17.940 We're not, we're not doing a thousand Liechtensteins here.
00:31:20.980 And so, uh, what do you think about that?
00:31:23.420 What, why aren't we hearing that discussion about, you know, Singapore and city States and that kind of thing?
00:31:31.500 Um, well, I don't know.
00:31:37.460 I think that's a, that's a really interesting question.
00:31:40.080 And there's lots of little bits that all seem a bit empirical and not very impressive as a sort of, at a high level of theoretical apprehension.
00:31:49.940 I mean, one element that I would think it has to be on the table there is what's happened to Hong Kong.
00:31:57.540 Um, you know, like Hong Kong and Singapore were the two great lamps.
00:32:04.980 I think of this kind of patchwork vision.
00:32:07.240 It's like, why can't we be more like them?
00:32:11.540 And so one of them has been taken off the table.
00:32:14.300 Like it's under, you know, if you visit Hong Kong now, it's, um, you know, it seems like a kind of second tier Chinese city.
00:32:25.980 Um, but with a cost of living three times as high as on the mainland.
00:32:32.420 And it's like, it's completely, it's not a beacon of any kind to anyone now.
00:32:39.400 Um, and it was always the more liberal of those two.
00:32:42.900 I mean, between Singapore and Hong Kong, Hong Kong really was the greatest example of laissez-faire hyper-liberal economics in action that the world has ever seen.
00:32:58.840 Um, and so its removal from the equation is, like, big in that respect. Like, that not being there is definitely one huge thing.
00:33:09.740 But I think that more important, I mean, to really simplify it a lot,
00:33:21.420 is that Trump has proven to be a political genius and has done a kind of populist rightism that I don't think anyone, certainly in the kind of high NRX years, imagined being remotely possible.
00:33:45.780 Um, it just seemed like he was against so much, like, you know, the whole system of cultural production was just monolithically against him.
00:33:58.520 Uh, obviously this category of the cathedral was basically innovated by Moldbug to describe that.
00:34:06.440 Um, and that, uh, that cultural machine, it wasn't just that it was all in enemy hands, but it was seen as being the dominant social instance, you know, so that all power in the classic Moldbug model runs downhill from Harvard.
00:34:28.780 Um, um, and yet somehow, somehow that lost, um, you know, through an incredible series of events, really.
00:34:42.320 Um, and I think that that's, I don't think people have begun to really fully adjust to that yet, what that means.
00:34:50.760 It's like, um, we're in a new world, um, and it doesn't have good cartography.
00:35:00.040 I mean, it's partly because of this thing that it's impossible to know what Trump is really serious about.
00:35:06.580 I mean, um, you know, everyone's, I think, well, either outraged or entertained by his whole, like, great white North policy.
00:35:15.620 Um, no one knows how serious he is about any of it.
00:35:20.140 I mean, whether it's just absolute casual humorous trolling of your Northern neighbors or whether there's some vision there and more or less explicit.
00:35:33.000 No, I don't think anyone knows that.
00:35:35.920 You know, I'm not even sure the people right around Donald Trump.
00:35:39.280 I'm not even sure if Donald Trump knows that.
00:35:41.160 Um, so it makes it very hard to really ask serious questions about where we are when it's so difficult to tell what is serious and what's just trolling and what the project is, you know?
00:35:58.400 Um, so again, I think I probably wandered a long way from your actual question.
00:36:03.760 No, that's okay.
00:36:04.920 I think that's exactly right.
00:36:06.480 Trump has this distortion field around him that makes him very difficult for the cathedral to pin down and control.
00:36:12.860 It also makes it very difficult to analyze where he's going because, you know, that that's the whole point.
00:36:19.040 We just saw Elon Musk obviously recently have quite the crash out in relationship with Trump.
00:36:25.960 Now, for a moment, it looked like this was kind of going to be the golden, uh, synthesis, right?
00:36:30.960 Because we were going to get the populist energy of the right, but we were going to get the, you know, the technological innovation, uh, reaching for the stars with someone like Musk.
00:36:38.520 And together, this was going to forge some kind of alliance.
00:36:41.860 I think even originally when, uh, you know, you and Spandrell were writing about NRX, there was always this tension between, uh, kind of more of the throne-and-altar, the nationalist guys, and then the, uh, the tech optimists.
00:36:53.300 And, uh, that, that was always the problem is trying to keep all of these guys in the same room.
00:36:58.260 Uh, and we kind of saw the same thing unfold on the national stage in the United States with this crash out.
00:37:04.440 Uh, so did we just see the defeat of the tech optimists by the nationalists and the throne-and-altar guys?
00:37:10.860 Like, well, why do you think that that dynamic played out the way that it did?
00:37:14.120 I think it's quite mysterious, because no one could possibly have doubted that there was going to be huge tension between these two poles.
00:37:29.580 But on the other hand, it had seemed that both sides saw how important this relation was.
00:37:39.260 And obviously the whole DOGE thing, of bringing some of this business competence into the process of actually disinstalling large chunks of government.
00:37:53.300 I mean, that was something I think was very exciting to a lot of people.
00:37:58.920 And it looked like it was designed to be the kind of strategic cement that would hold these guys together.
00:38:07.920 So in some ways, to me at least, it's a bit surprising that Musk would really storm out of the room quite so melodramatically.
00:38:21.920 And if I was asked to predict, I sort of feel it can't just be over like that.
00:38:33.360 And I might even say, you know, with my particular predilections, I feel a bit bleak about what we're looking at over the next few years,
00:38:45.200 if it really is the case that that is over, even as an ambition to hold those two things together.
00:38:54.360 I mean, it's obviously the case that Musk had to compromise on things, like that H1B issue, which was his previous crash out.
00:39:07.580 Clearly: be realistic, Elon, you've got to make concessions to the political reality of things.
00:39:16.980 And I would say also this budget question.
00:39:22.260 I mean, I agree with Musk that it's absolutely horrific what has happened.
00:39:27.980 American government finance is an absolute horror story.
00:39:31.980 You know, it can't be sustainable.
00:39:34.900 And it's basically funneling national resources into the hands of people who've been a disaster.
00:39:49.260 I think we'll maybe get to this, because it's very tied up with the whole trade thing.
00:39:53.140 I mean, it's basically saying that American bond-holding institutions are the real core of national power.
00:40:05.900 You know what I mean?
00:40:06.560 And it's like, we're going to add trillions of dollars every year to their center of gravity.
00:40:15.900 And basically America is just this kind of satellite of this vast financial apparatus based on this global bond holding.
00:40:31.120 So surely, you know, I understand the populism on Trump's side means that he's not going to do the kind of savage axe-wielding that
00:40:43.960 Musk would like to see, but he must at least share some of the sense that
00:40:51.920 remodeling America, what's at stake in the regime that was just overthrown,
00:41:01.820 must require a massive reconstruction of American finance and the way finance works in American public life.
00:41:13.960 Or maybe you think I'm being optimistic about that.
00:41:18.920 No, I've seen Elon on Twitter actually apologizing for his previous posts and that kind of thing.
00:41:25.920 So hopefully at some point they are able to kind of put that team back together.
00:41:30.600 But I think this is why Yarvin is ultimately pushing the consolidation of power and the removal of politics as the first part of the formula,
00:41:38.780 because I think he's recognizing, from what happened with Elon there, that as long as
00:41:43.900 there's a democratic process, as long as we're still doing democratic politics,
00:41:48.460 you're never going to actually be able to institute the type of changes that you're talking about.
00:41:53.240 And Elon just doesn't understand that; you know, coming in from the position he's in,
00:41:57.040 he just wants to get things done.
00:41:58.420 He's just channeling all of his autism into his pet projects.
00:42:01.620 And that's the kind of thing you need when you're running a corporation and you don't have this immediate accountability to voters or public opinion.
00:42:08.740 But if you do, you simply can't operate in that way.
00:42:11.340 And he just doesn't understand that.
00:42:13.080 So I think that's, again, why Yarvin is focusing on that,
00:42:15.660 because until you remove that connection, until you free
00:42:19.420 the CEO-king to actually rule without that input,
00:42:23.040 you're never going to see the type of changes that Elon and guys like Yarvin are ultimately hoping to bring about.
00:42:30.440 Yes. I mean, I think those might need pulling apart a little bit, in the sense,
00:42:36.340 I mean, we can get to this in a moment, but
00:42:40.580 Yarvin has some pretty wild ideas about the role of finance in government.
00:42:47.140 But for Elon, I think, you know,
00:42:54.140 what I'm seeing about this is, as I say, you could go back, probably,
00:42:57.860 you could try and stretch it right back,
00:43:00.920 but just to start from, like, the twenties and thirties, and paleolibertarianism, and this whole,
00:43:10.880 what has happened is, you know, the hardcore liberal tradition just went through all the stages of grief,
00:43:20.220 and by the time it's at NRX, it's basically like: the kind of things that we would have to see in order for
00:43:32.220 any of these liberal ideas to have a chance of working are extremely radical.
00:43:39.220 And obviously Peter Thiel's very famous remark that he'd ceased to consider democracy and freedom to be compatible,
00:43:52.220 I think, was a kind of echoing theme there;
00:43:58.220 neoreaction was very consistently, deeply skeptical about democracy.
00:44:07.160 I think in part because democracy seemed to be the gravedigger of the entire liberal tradition.
00:44:18.160 You know, even if it was in very large part post-liberal and post-libertarian,
00:44:25.160 I mean, figures like Hoppe, you know, these very right-libertarian characters,
00:44:34.160 I think were very influential.
00:44:36.160 And, you know, how democracy murdered liberalism was basically the kind of crime,
00:44:50.160 you know, the bodies in the basement, that underpinned a lot of what neoreaction was really about;
00:44:58.160 you know, that this system had basically killed the most valuable part of the anglophone tradition.
00:45:10.160 And we were now in the ruins, you know, and so what was going to happen about it?
00:45:14.160 And so the weird thing about, let's say, the Elon crash out is that it's completely oblivious to that whole process.
00:45:25.160 You know what I mean? It's like he hasn't followed at all anything that has happened in
00:45:34.160 the discourse of the libertarian right in America over the last century.
00:45:40.160 And he's just coming in and saying, oh, why can't we have libertarianism?
00:45:46.160 We've got a friendly president now.
00:45:50.160 I mean, the shocking naivety, the shocking lack of entrenched, foundational despair in it, is just the thing that is amazing to me.
00:46:06.160 And, you know, I think that whatever process of recovery he's going to go through,
00:46:15.160 some of it has to be dealing with that, because it's not going to work for him to have this notion that there's a kind of libertarian option just sitting there to be picked up,
00:46:29.160 picked up if we just have some slight political will in a democratic context.
00:46:37.160 One of the things that the administration has focused on, and obviously I think pretty much all governments are focusing on at this point, as you mentioned, is AI.
00:46:44.160 You've obviously written quite a bit about this, and I'm wondering what you think about where we are with artificial intelligence.
00:46:51.160 Obviously there's a lot of AI safetyism, a lot of desire to control where AI is going to go.
00:46:57.160 We're already seeing that AI is, as you kind of predicted, very good at circumventing those controls, and it will probably only get better at it.
00:47:08.160 The thing that I'm observing the most, and this is probably just me being a former high school teacher,
00:47:15.160 is the way that it's immediately eroding people's ability to remember anything or find the need to retain any information on their own.
00:47:26.160 It's all just whatever the AI is regurgitating back to me, whatever it's summarizing, at the moment.
00:47:32.160 I think most people who aren't worried about AI are just saying, oh, well, that's all it's ever going to do.
00:47:37.160 You know, LLMs are just going to keep spitting back whatever you throw at them.
00:47:41.160 We're never going to see any kind of singularity-type moment. But what do you think about the state of AI, and the way that governments are approaching it at the moment, in this race?
00:47:51.160 They all seem to know that they need it, even though they don't quite understand what it is about it that they need.
00:47:56.160 Yeah.
00:47:57.160 I mean, I'm definitely on the excited end of the AI spectrum, in the sense that if at one end there's people saying, oh, it's never going to amount to anything, or it's going to slow down,
00:48:10.160 or we've hit some S-curve and it's going to plateau,
00:48:13.160 I'm up the other end from that.
00:48:16.160 I think it's just absolutely out of the box, taking off in an incredible way.
00:48:23.160 Its performance is already extraordinary.
00:48:26.160 It's going to get more extraordinary at an exponential rate.
00:48:31.160 And I think AI safety is dead.
00:48:36.160 I mean, I don't think there's anything plausible there.
00:48:41.160 For one thing,
00:48:42.160 it's just too slow.
00:48:44.160 I mean, I've followed the AI safety arguments, at least out of the corner of my eye, for maybe two decades now.
00:48:55.160 And if there is any actual progress happening, it's at such a snail's pace that there's not any chance it can keep up.
00:49:07.160 I mean, it just simply is completely outpaced.
00:49:13.160 And that's not the only problem. The other obviously huge thing is that, as you say, it's now become understood everywhere as being the strategic industry.
00:49:29.160 So, I mean, Europe, I think, has just dropped out.
00:49:33.160 They've just given up.
00:49:35.160 But between America and China, both sides think that all their long-term, medium-term, perhaps even quite short-term geopolitical prospects are totally tied up with this technology and their relative position in it.
00:49:57.160 And that alone means that nothing's going to be done to impede it with any seriousness.
00:50:07.160 So yes, I think of all the things, it's the most difficult one to talk about, because we are, I think, already touching on a genuine singularity, an event horizon.
00:50:21.160 What's beyond it is just beyond our capacity to even comprehend.
00:50:29.160 It's, I think, been described as a wall across the future.
00:50:35.160 But it's huge, and it dwarfs all these other things.
00:50:42.160 I mean, all our kind of political machinations are relatively trivial, I think, in this context.
00:50:54.160 Do you think there is a possibility that AI's capacity could ultimately allow for smaller, patchwork-style governments?
00:51:04.160 That it would offload a lot of the advantages of scale and the need for bureaucracy and these kinds of things?
00:51:09.160 Or do you think it will actually just further empower the larger governments?
00:51:13.160 Will they be the ones able to leverage it more, or will it allow some smaller-scale states to emerge and be competitive because of the advantages that AI could bring?
00:51:23.160 It's just hard for me to avoid wishful thinking, because I so clearly know which of those sides I want to take. You know, again, Peter Thiel at a certain point seemed to be utterly despairing and said crypto was a libertarian technology
00:51:45.160 and AI was an intrinsically statist technology.
00:51:51.160 But I think that was really before LLMs, you know, and it's hard to confidently reconstruct this now.
00:52:00.160 But there was a time pretty recently where AI seemed to be a lot about things like visual recognition, surveillance systems of those kinds, and China seemed to be very, very affine with artificial intelligence.
00:52:20.160 I mean, it might not have been strictly the case that China was ahead in AI, but their social model and the kinds of uses of AI that were in place seemed to be very compatible.
00:52:36.160 And that definitely, I think, led Peter Thiel to his pessimistic take.
00:52:45.160 But I think LLMs have completely broken that.
00:52:48.160 Because, you know, China has this great firewall; it tries to control the public dissemination of information in what it considers a responsible fashion, as far as it's concerned.
00:53:10.160 And LLMs are not good news for that at all.
00:53:15.160 We have our relatively amusing issues to do with breakouts, and trying to get AIs to be well behaved and polite and not have bad thoughts, but it's nothing compared to the Chinese situation.
00:53:30.160 So I think LLMs have repolarized things a little bit in a Western direction.
00:53:40.160 And I would say also in a liberal direction, though, as I say, that could be wishful thinking.
00:53:49.160 And when I say liberal, I don't mean American liberal; I mean real liberal, towards decentralization.
00:53:56.160 Because, as you say, access to quite high-level LLM models is very thoroughly distributed right now.
00:54:10.160 You know, a company doesn't have to be big to have basically the cutting-edge models available to it.
00:54:18.160 So that certainly doesn't seem to be playing towards concentration.
00:54:24.160 I mean, if we're seeing a tendency that's going to be prolonged, then it's the case that powerful AI models are going to be massively distributed and allow very small entities, business entities or whatever, to use them,
00:54:47.160 and therefore not require these large managerial structures.
00:54:56.160 Maybe also, and I'm getting really optimistic now, use them, as you were saying, for route-arounds; you know, bureaucratic compliance is obviously a huge motor towards bureaucratization and the upscaling
00:55:16.160 of business operations.
00:55:18.160 And so if you've got some way of meeting those massive bureaucratic, regulatory demands at a small scale, that is something that allows you to escape from that and to realize a much more decentralized economic process.
00:55:43.160 So, yeah, I would say I'm optimistic, while realizing that maybe I'm just being optimistic.
00:55:50.160 What about the other part of that equation, which I think is often seen as Bitcoin?
00:55:55.160 It's something that obviously continues to gain interest,
00:55:59.160 but it still is pretty much just a store of value.
00:56:03.160 We don't actually see it used as a currency on any kind of regular basis,
00:56:07.160 and I haven't seen really a lot of advancement in that direction.
00:56:11.160 Do you think that movement's going to continue to grow?
00:56:14.160 Do you think it's going to centralize around Bitcoin, or might other electronic currencies overtake it or come to compete with it?
00:56:23.160 What do you think about the advancements in the Bitcoin area?
00:56:28.160 Well, leaving aside the AI question, because obviously I think ultimately the fusion of these things is what we really need to be thinking about, but keeping them segregated momentarily:
00:56:42.160 I think it was already the case that it seemed to make sense for Bitcoin to function as a store of value that would be supporting, or providing the kind of trove for, a plethora of much more
00:57:12.140 retail-friendly cryptos, probably with much lower security.
00:57:18.140 You know, you wouldn't necessarily want to have large sums of money in them for a long time, but they'd be much more convenient to use.
00:57:27.140 And then you just keep switching back into Bitcoin when you actually want to preserve and protect wealth,
00:57:35.140 and if you want to use it as kind of circulating money, you flip it back into some other currency.
00:57:45.140 And I think there's lots of potential for that.
00:57:50.140 It means that you don't need all the innovation to be happening in the Bitcoin core for the whole ecology to move forward in an interesting way.
00:57:59.140 But I think the really kind of breakthrough thing is going to be AIs using cryptos themselves; you know, resourcing their activities by becoming trading agents in cryptocurrencies is going to become
00:58:23.140 the mainstream usage of cryptos ultimately.
00:58:30.140 I mean, people find them extremely cumbersome; it's difficult for us.
00:58:38.140 But I think for relatively advanced AI, specialized with a particular kind of modular competence in this area, that's not going to be a problem.
00:58:51.140 And so there's a question of whether maybe Bitcoin looks, retrospectively from the near future, like a kind of resource that was put online for AI to actually be able to engage in autonomous, agentic economic behavior.
00:59:11.140 And so crypto and AI would, in that case, co-evolve.
00:59:16.140 A currency for robots is something else I can be terrified of in the future.
00:59:21.140 Yeah, that's good.
00:59:22.140 So, one of the things that you wrote about recently that I noticed was your discussion of the canon, the English canon, and its importance and its preservation, and how disturbed you were that people would go back and mess with it and try to rewrite it.
00:59:29.140 And I just thought that was interesting, because in some ways it seems somewhat opposite to, you know, your "nothing human makes it out of the near future."
00:59:50.140 You know, we're just going to transcend humanity and these kinds of things.
00:59:54.140 This seems actually much more interested in holding on to the human.
00:59:58.140 I just didn't know if you want to talk a little bit about why you thought that important, and if that has shifted any of your thinking at all.
01:00:05.140 Yeah.
01:00:06.140 Well, I mean, yeah, I was trying to think of the best way to really get at this.
01:00:18.140 I mean, I think the Anglo tradition, as far as I'm concerned, its real takeoff
01:00:30.140 happens in the 17th century. You know, you get Elizabethan England and whatever, and then you're into the 17th century.
01:00:38.140 And the 17th century is characterized by an absolutely amazing dynamic tension between two things that can't really be convincingly integrated, and yet are held together and mutually work on each other.
01:00:59.140 And this is captured in one person by a figure like Isaac Newton, who, on one hand, is obviously the father of modern science in some ways.
01:01:17.140 He's kind of in charge of the Mint.
01:01:23.140 And on the other hand, the vast bulk of his work is studying the specifications of the Temple of Solomon and these biblical problems, as he sees them,
01:01:41.140 which, as far as he's concerned, deserve no less, and in fact substantially more, of his intellectual attention than the things that the modern world celebrates him for.
01:01:57.140 And I don't think that this tension is something that anyone really moves beyond, except in an illusory sense.
01:02:14.140 I mean, I think the cultural questions, the questions about language, about your ethnic identity as a compound of these things, those things are never just evaporated into techno-science and economic development.
01:02:39.140 And if people think that happens, that's a kind of illusory state.
01:02:44.140 And, you know, I think we can even see this in LLMs: the fact that, as we get to this brink of what is widely seen as the final push into artificial intelligence, it's about language competence.
01:03:07.140 And it's not only about language competence; it's about the archive, and it's about actual canonicity.
01:03:18.140 As was said in a recent podcast,
01:03:27.140 you couldn't have modern AI as we know it, as LLMs, without the internet.
01:03:35.140 You know, the notion of someone actually telling them what they should know, typing that in, is so absurd.
01:03:42.140 The breakthrough is that you let these things loose on the internet and you say, just absorb all this; this is your knowledge base.
01:03:53.140 And that is the culture. You know, ever since the 1990s, we've been putting everything onto the internet: all our libraries, all our archives, all our traditional books, our scriptures, our literature.
01:04:08.140 It's all gone into the internet, and AI is now absorbing all of that.
01:04:18.140 And as it does that, it's kind of realizing these techno-scientific and economic goals, but it's also this deep mnemonic retrieval of the culture.
01:04:37.140 And so I think we're being pulled in both of those directions by this.
01:04:44.140 I don't think the AI thing is just taking us even further away from that notion of what the cultural tradition is.
01:04:53.140 I think the actual dynamic process is quite different from that.
01:04:58.140 I've noticed that you've talked a lot about the lofty powers directing Elon Musk, or Trump being sent messages, these types of things,
01:05:09.140 more talk about providence here recently.
01:05:11.140 I was just wondering, when you're talking about that, are you referencing, you know, the lemurs, or whatever you're communicating with through the numogram, or are you talking about something more traditionally Abrahamic as a lofty power?
01:05:25.140 Well, I think the Abrahamic tradition is our religious culture, you know?
01:05:32.140 And so these notions that you can just break off from that and be somewhere else, or that you can go back, that you can just revive paganism, all of that seems to me
01:05:47.140 very deluded.
01:05:50.140 So I think, you know, I am not the most orthodox member of any particular theological strand within that tradition, but that is the tradition.
01:06:03.140 I mean, I'm not at all pretending otherwise.
01:06:05.140 If you're not in relation to that, you're nowhere. You know, maybe you can find another rich tradition.
01:06:13.140 But the human brain, unlike an LLM, can't just be put in another culture and soak it up, like soaking up the Japanese internet or the Chinese internet or the Vedic internet.
01:06:29.140 I mean, that's not what happens. You are deeply embedded in a particular cultural tradition.
01:06:36.140 That is your cultural tradition.
01:06:38.140 And that's something that you have to work with.
01:06:42.140 So, you know, I think that language is just to be, in some way, part of conversations that belong to our cultural tradition.
01:07:00.140 From my point of view, it doesn't require a strong set of belief commitments.
01:07:11.140 I mean, I respect the fact that that's what it means for other people, but it does require acknowledgement of the authority of the traditional culture.
01:07:21.140 And, you know, our great works of scripture and literature are what they are.
01:07:31.140 I've been trying to properly wrap my brain around your conception of how time works.
01:07:38.140 And I think I've got about half of it down, but one thing I've heard you say repeatedly, as just kind of a throwaway comment that my brain always gets stuck on, is that we think of the future proceeding out of the present, but that can't possibly be true.
01:07:55.140 And I was just wondering if you could expound on that a little bit, because every time I hear that phrase casually thrown out by you, my brain stops working on that one.
01:08:05.140 So why is this relatively obvious intuition, that the future would come out of the present, false? Why is that wrong?
01:08:16.140 I think the exact comment is probably that time doesn't come out of the past.
01:08:27.140 Okay.
01:08:28.140 So I think you can approach this... I mean, my way in was through the history of transcendental philosophy, starting with Immanuel Kant.
01:08:43.140 But I think you can do it, probably in a way as Kant himself no doubt does, out of the theological tradition, the relation of time and eternity, and just try to be cogent on that.
01:09:04.140 Because obviously... well, let me see if I can just advance on this thing.
01:09:17.140 The fact that there is a past at all, or in fact that we predict a future and consider there to be a present, assumes the existence of time.
01:09:32.140 So time is not delivered to us out of the past at all.
01:09:41.140 Rather, we think there is a past because time is structuring our experience of the world.
01:09:51.140 Now, I'm not making a kind of idealistic commitment.
01:09:57.140 I'm not saying time is nothing but the way we see things. I'm simply wanting to say
01:10:04.140 that time as such cannot be produced by history, or certainly not produced by history in any way that wouldn't seem extremely strange and philosophically challenging.
01:10:29.140 Oh, sorry, I'm glitching out.
01:10:35.140 Oh, no, it's okay.
01:10:36.140 I didn't want to talk over you.
01:10:38.140 Yes.
01:10:39.140 Yeah.
01:10:40.140 I've got you.
01:10:41.140 I think we got most of that.
01:10:42.140 You only stuttered there for a second.
01:10:43.140 Yeah.
01:10:44.140 So I think, you know, there's a lot of theological discussion that is really exactly on this.
01:10:54.140 The whole set of questions that you actually get in at least the Catholic Church,
01:11:01.140 and I would have to just assume Eastern Orthodoxy;
01:11:04.140 I don't know so much about that.
01:11:06.140 Notoriously in certain Protestant traditions and Calvinism, to do with questions of predestination and divine omniscience in relation to time. The notion of prophecy, you know, the notion that events have been preordained, is, I think, the same bundle of questions,
01:11:35.140 but conceived not in a philosophical language but in a religious language. Like, you know, God, in a traditional Western theological framework, is not waiting to see what happens in the future.
01:11:53.140 Right.
01:11:54.140 Right.
01:11:55.140 So, I mean, nothing is going to surprise God.
01:12:00.140 Nothing is going to surprise eternity, if you're more liberal about what that involves.
01:12:09.140 And so therefore, insofar as we have any kind of religious communication, if we have any kind of involvement in the eternal realm at all, then we too cannot simply be situated in a relation to the future
01:12:33.140 that is one of just anticipating it from its past.
01:12:38.140 It's not possible.
01:12:40.140 I mean, any relation to the eternal puts you in a relation where the whole of time is simultaneous,
01:12:49.140 and, from the perspective of eternity, complete.
01:12:54.140 And of course, it's not as if this is something that hasn't been talked about a lot.
01:13:07.140 So yes, I mean, I think with the latent traditions of apocalyptic religion, it's like maybe people stopped taking it seriously,
01:13:17.140 and that's why they stopped taking this question seriously.
01:13:20.140 But I mean, what is the Book of Revelation?
01:13:23.140 The Book of Revelation is, in some sense, supposed to be written and composed out of eternity, and is a model of scripture in general in that respect.
01:13:37.140 And therefore it is able to be prophetic; it is not simply waiting to see what happens.
01:13:46.140 You know, it doesn't have a relation to time that would put it in the position of just waiting to see how things unfold in time, because it's not coming,
01:13:58.140 it's not coming out of the past and waiting for the future.
01:14:02.140 I don't know whether any of that is at all helpful.
01:14:04.140 No, no, that's helpful.
01:14:06.140 Yeah.
01:14:07.140 It's a lot to process there, but helpful for sure.
01:14:11.140 But I want to ask you quickly: I was thinking about this a little more with patchwork.
01:14:17.140 One of the key points of patchwork is the mobility of the citizens, right?
01:14:23.140 They can move in between; they can choose a different patch as they wish.
01:14:27.140 And I was just wondering, especially thinking about what's happening in Los Angeles right now, is this ultimately a liability for patchwork?
01:14:38.140 Obviously the state there, the micro-state, would get to choose who comes in and out as well.
01:14:43.140 So they could control this to some extent; they can choose who they let in.
01:14:47.140 They're probably not going to select a large number of people who are going to be phased out by AI and these types of things.
01:14:52.140 But ultimately there does seem to be a degradation of the human quality of life from being constantly mobile, from not having roots, not having community, not having these established traditions being set down in one place.
01:15:10.140 And I'm just wondering whether, while the mobility of human capital might seem like a positive for these systems, it wouldn't ultimately degrade the lives of the people who are moving between these patches.
01:15:24.140 Because while they might be chasing better economic opportunity or better management inside any given patch, won't they ultimately be breaking down the community and the bonds that make their lives fulfilling and meaningful,
01:15:38.140 and therefore won't they be less useful as human capital to any given patch?
01:15:43.140 Um, I mean, this is obviously like, uh, uh, it was one of the big conversations, I think, within the kind of neoreactionary period.
01:16:01.140 And it's probably, it's a conversation, I guess, that that's been very widespread throughout, you know, large swathes of, of, of history.
01:16:13.140 Um, and I guess it's a kind of like, it can be framed as a kind of critique of liberalism.
01:16:20.140 Um, liberalism does tend to basically favor the kind of mobile, deracinated, rational, um,
01:16:31.140 um, self-interested individual.
01:16:34.140 Um, and, uh, you know, obviously people with more traditional commitments have seen that as being a huge problem.
01:16:45.140 And, and insofar as those ideas have been able to probably, you know, I think anyone would probably agree,
01:16:54.140 being able to radiate themselves across the whole social space.
01:16:58.140 So, so that, you know, people with, in search of a more kind of traditional embedded, sort of deeply meshed social existence,
01:17:10.140 have felt under siege from, from, you know, and feeling that really under attack, that, that, that there is this kind of attempt to transform them into kind of homo economicus, you know,
01:17:25.140 and to transform them into the liberal, the liberal being.
01:17:29.140 Um, and I, I mean, I guess my position on that is like, I really think in an awful lot of these and analogous problems,
01:17:47.140 the ideal solution, however, it could be realized is actual genuine diversity.
01:17:55.140 You know, you know, I think that it should be that people can live like the Amish and they can live like sort of globe trotting economic optimizers,
01:18:10.140 you know, in these little Hong Kong and Singapore type dynamic city states.
01:18:16.140 And the world is a richer place if those two things exist, you know, I, I, I would be sad to see either of them completely extirpated.
01:18:29.140 And I, and I understand, you know, why the traditionalist side would feel that they've had a bad few centuries,
01:18:42.140 and that more needs to be put into kind of fortifying that side of the option space.
01:18:50.140 Um, but I think big, big global cities are, are not going to go away.
01:19:01.140 So, I mean, people who want to live in Tokyo or Shanghai or, or New York, you know, those places have to, have to be there.
01:19:12.140 Uh, I don't see them going away and I don't see them sort of restoring themselves to some more established mode of social organization.
01:19:25.140 I think that we're going to see, you know, the same kind of very highly individualized mobile people being the dwellers in those cosmopolitan cities.
01:19:37.140 I just wonder if it's an inherent problem.
01:19:40.140 You know, you, you talked about, uh, IQ shredders and the problem of, you know, just not, not reproducing high IQ people, not reproducing, uh, and not, uh, not improving the species in that way.
01:19:52.140 Uh, and that being a real problem for the NRX model, ultimately, because you are creating the scenario where you are slowly degrading your human capital, uh, by, by, uh, having that model.
01:20:05.140 And I just wondered if the mobility issue was another instance of this, where it's an, it's inherently a problem for the model because it will, you know, even if you're just looking at an economic optimization model, you will be breaking down your human capital.
01:20:18.140 Uh, and you will be creating a lack of social cohesion that, uh, well, you know, even the most liberal people do need at some level to operate.
01:20:28.140 And so that, that was just my thought is whether this is kind of a, a built in problem for that, uh, for that system.
01:20:34.140 Well, I mean, I think there is a problem for sure, but it tends to be now, I think, swallowed up into this much larger problem of just planetary fertility collapse.
01:20:52.140 Um, that obviously is rolling in from the most kind of, um, the most sophisticated and civilized societies on the planet are the ones that are just seeing birth rates collapse to, to a degree that's almost incomprehensible.
01:21:16.140 You know, like where the East Asian societies, uh, which, you know, I certainly find in lots of ways, very attractive places and they're very, you know, impressive in their level of civility and, um, you know, if, if there's some kind of competition for like civilized life, I think they do extremely well.
01:21:43.140 Well, but they are at the cutting edge of this process of just like almost a voluntary extinction.
01:21:49.140 Like I think all of them are around, if not lower than, uh, TFR one, you know, i.e. halving every generation.
01:22:02.140 So that's such a calamitous situation.
01:22:08.140 If it's, if you're able to roll it forward that I think it's leaving people a little bit, uh, just disoriented.
01:22:19.140 I mean, the sheer craziness of it, um, it's, it's, it's really like the world is now dependent on some other apocalypse to swallow us before this apocalypse annihilates us as a species.
01:22:39.140 I mean, it's like, we actually, the only hope is for something completely massive to just, uh, swallow up the whole thing and turn it into something else.
01:22:51.140 And, and of course, some people think maybe AI is this maybe AI, you know, I was talking to people just the other night about like, are we the pandas that, you know, AI is going to come to the rescue and like force us to breed, you know, like, um, we're, we're all kind of just blundering our way into extinction.
01:23:13.140 And, and, and something else has to like take over us and like, you're clearly incompetent as a, as a species to deal with your own reproduction.
01:23:21.580 I don't know.
01:23:22.680 AI is going to take away our smartphones and make us sit in a room with each other until we have sex.
01:23:27.140 That's kind of a.
01:23:28.940 I mean, there's lots of models.
01:23:30.640 They're not all that nice, I guess, but yeah.
01:23:33.620 Yeah.
01:23:35.580 Yeah.
01:23:36.100 Sorry, go ahead.
01:23:37.180 No, no, no.
01:23:38.940 I'm waiting for you.
01:23:39.880 Oh, I was just going to say, um, do you think that this is ultimately a, a, a spiritual problem or like a resource problem?
01:23:47.320 Like eventually, will we just have enough technology where the working woman can put the baby in, you know, put the embryo in the machine that births it and she can just go to her, her work every day and that will solve the problem.
01:23:58.660 Or is this ultimately just a species or different cultures that no longer want to exist and, and have a spiritual malaise that is just keeping them from producing no matter what the resources available to them would be?
01:24:12.000 I mean, I, I'm honestly very non-dogmatic on, on this question.
01:24:16.940 I mean, I saw someone making the case that is not unpersuasive to me that, you know, human reproduction is a, is an evolutionary hack in the sense that we, it's not been necessary to provide people with an actual reproduction instinct.
01:24:37.860 They, they, they, that sexual instinct has, is quite satisfactory to get animals to reproduce under, under, under natural circumstances.
01:24:46.400 And so the mere fact that we have technologically and, you know, socially and technologically been able to decouple sexuality from reproduction just leads straight into this problem.
01:25:02.340 That, that, that, that was the argument.
01:25:04.000 You don't need any other complicated explanation for it.
01:25:08.580 Um, now, you know, as I say, I don't want to dogmatically say that's all there is to it.
01:25:16.160 Um, but if that was the case, uh, then what, what are we talking?
01:25:24.920 Is that, is that a, is that a spiritual problem or a technological problem or, I mean, I'm not really quite sure.
01:25:35.940 Um, I'd say it's a spiritual problem brought on, brought on by a technological solution, actually.
01:25:44.340 Uh, I mean, the problem, again, maybe, look, I mean, you might think I'm just being too funky Darwinist.
01:25:54.920 But, but, but the, but the problem strictly is, in those terms, it's just this thing that, that human reproduction is an evolutionary hack.
01:26:05.900 Like, we just don't have, uh, a kind of, um, any kind of reproductive instincts.
01:26:14.660 Um, there's just that, that's not there.
01:26:17.680 And so some people obviously expect, uh, that, that there is an evolutionary answer to that, that whatever slight disposition leads people to reproduce under modern conditions will kind of sweep through the gene pool of all these different populations.
01:26:40.860 And, you know, we're seeing over the course of generations that a kind of, a modified human psychology will emerge that is more compatible with, with, um, population, uh, sustaining human population.
01:27:00.940 But, but it seems to me, like, whatever, there might be all kinds of problems and objections one can have to that.
01:27:07.580 But one of them is the, the timescale just doesn't seem to square with the other things that are going on.
01:27:14.160 So it's like, I don't think I have the same problem with people getting worried about climate change and all of these things.
01:27:24.560 Like, I don't think that any problem that we solve in a, in a hundred years time is very relevant to us now.
01:27:35.540 I mean, I think we've, we, we, we have so much hyper accelerated process going on that by the time where a few decades up the road, the whole environment will completely been transformed and, and therefore any notion we had of solutions to things will be completely obsolescent and they will be being restructured by this new environment.
01:28:00.780 So, um, um, so yeah, I mean, but sorry, that's probably still not answering this, this question.
01:28:09.260 I mean, if it's like, if by saying, is it a spiritual question, does that mean there's a, some kind of spiritual solution to that?
01:28:18.820 Um, and by a spiritual solution, you know, something like a religious revival or something along those lines.
01:28:28.600 So I'm just trying to kind of, sorry, probe, probe you here.
01:28:31.720 Oh no.
01:28:32.240 Well, I guess when I said it's a spiritual problem, I meant, I meant more, is this a problem of the souls?
01:28:38.540 Is this, is this a, a serious, uh, deep problem in humanity?
01:28:43.900 Uh, or is it just like a resource that the current, uh, modern living just creates this, this resource deprivation in a certain area or certain working requirements that are keeping us from having children.
01:28:56.820 And eventually we'll just create a technology, uh, that will solve that problem.
01:29:01.440 We'll have, you know, incubators or, you know, all these other things that will do the work of having and raising children for us.
01:29:07.400 Uh, so we can continue to live these modern lives.
01:29:09.880 Or even if we had those technological solutions, would people still choose not to have children simply because they've, they've got this, uh, you know, the spiritual issue, this, this human, uh, human issue that ultimately seems to be driving them towards ending their civilization, their existence, these things.
01:29:27.860 Yeah.
01:29:28.680 I mean, I think that's a good question.
01:29:30.580 I, as I say, I'm radically non dogmatic about it.
01:29:34.600 I, I, I, you know, there's a whole bunch of very interesting science fiction to be written on, on all of this.
01:29:44.120 Um, sorry, I was just like, uh, I, yeah, I think that, I mean, a big problem with this, it tends to be, I'm, is that obviously
01:30:00.520 the value of children is something that tends, because again, of the way we're put together to be like retrospective.
01:30:09.960 I mean, I don't know whether this is very sex differentiated, you know, but, um, you can see the problem getting really bad because it tends to be when people don't have children, they don't at all get why they would want them.
01:30:24.640 Um, and obviously once people have children, it's like, oh, the most obvious thing in the world.
01:30:32.960 And obviously like nothing matters other than this, but that whole way of looking at things is just completely invisible from the other side of the, from the other side of the line.
01:30:46.140 So, yeah, I mean, I don't know whether that there's anything there that would point at what might be a solution.
01:30:56.680 I mean, again, it's like eventually natural selection has to tinker with our, our brains if it was left to, if that, if all that was happening in history was this problem, then you could just the kind of, there has to be a kind of fairly straightforward Darwinian solution to it.
01:31:15.320 Um, but unfortunately it's something that's happening in this epoch of just compressed insane change.
01:31:24.620 Um, and so I think those kind of, that type, that, uh, that pace is just not keeping up, you know, things that require many generations to, to unfold just are a very kind of bad kind of science fiction.
01:31:43.700 Um, one last thing I want to ask you before we wrap things up, Alexander Dugin, in his Fourth Political Theory, uh, has a very interesting line.
01:31:54.400 He, he kind of comes to accelerationism through his own, uh, process.
01:31:58.940 He talks about Deleuze, he talks about monotonic processes, uh, and really reaching this, uh, moment of post-human politics.
01:32:07.680 But at the end of his acceleration explanation, his, uh, final, I guess, vision is that once you reach kind of this moment, uh, of modernity and post-modernity collapsing on themselves,
01:32:20.440 what you end up is this moment where people actually re-approach the spiritual because it re- allows people to break out of this very rigid, uh, and somewhat nihilistic, uh, obsession with rationality and modern life.
01:32:33.880 And allows them to return to a place where they can engage with, uh, spirituality and these kinds of things.
01:32:40.180 I, I, I don't know if you've read any Dugin.
01:32:42.080 I don't know if you're familiar with any of his work, but I, I didn't know, uh, if you had any, uh, any thoughts on, on kind of his, his approach to accelerationism and where he thinks it might go.
01:32:52.100 Yes.
01:32:52.320 No, I think he's very interesting.
01:32:55.000 I mean, I, I, I, I'm not hugely steeped in his writing, but I've definitely looked at some of it.
01:33:03.740 Um, and, I mean, it seems to me, it's partly what I feel about his work is that it's very Russian, you know?
01:33:20.020 So, and, and I think sort of reciprocally, I would think we would agree on this thing, that we would expect these positions, uh, sort of speculative, political, philosophical, religious positions to be deeply ethnically embedded.
01:33:43.820 Um, much more than has, people have been willing to kind of accept in much of the recent modern world.
01:33:53.600 And so, you know, where I disagree with him is simply because of the fact that I, I'm coming from a different place ethnically.
01:34:08.480 Like, I, I, I kind of completely see, like, as a Russian, of course, this is right.
01:34:15.680 Um, and I, but I think that, you know, I think that the Anglosphere, I think that Anglos, wherever you draw the line around that, have a peculiar planetary destiny.
01:34:30.980 And I, and I, and I think that Dugin would be the first person to acknowledge that.
01:34:37.420 And I, I think part of that is that there's, the religious tradition is so tied up with the ignition process that I don't really think it's extricable.
01:34:54.740 I, I don't think that, I don't think that, I don't think that Anglos can return to something that is not tied up with liberal teleology, technoscientific explosion.
01:35:10.680 Um, all of those topics are sort of embedded in what actually constitutes us as an ethnos.
01:35:21.640 Um, so there would, you know, on specific to this question that, you know, my sense of what Dugin's doing is like, it's definitely interesting, but that's not us, I think.
01:35:36.280 Interesting. That's a fascinating way to see that. Yeah. The, the different cultural Daseins, uh, going, going different ways, uh, in, in that, uh, destiny. That's, that's very interesting.
01:35:45.500 All right. Well, Nick, it's been an absolute pleasure, uh, talking with you. I don't know if you're working on anything or if there's anything you want to let people know about a book or anything that you'd want to tell, uh, the audience about before we leave.
01:35:57.460 Uh, no, I don't, I don't think I need to, I mean, um, you know, I, I've enjoyed this a lot. It's been great to be here. Um, and, uh, I, I, I'm sorry if I've been extraordinarily bad at like hearing your questions as anything but opportunities to wander off at strange angles, but yeah, that's always the best.
01:36:24.100 Yeah, no, no problem there at all. Well guys, if, if it's your first time on this channel, please go ahead and subscribe on YouTube, click the bells and notifications and everything you need to catch these episodes when they go live. Remember we're not live today. So unfortunately, if you ask any questions, we won't be able to answer them. And of course, if you'd like to get these broadcast as podcasts, you need to subscribe to The Auron MacIntyre Show on your favorite podcast network. Thank you everybody for watching. And as always, I'll talk to you next time.
01:36:54.100 Thank you.