Mary Harrington is a writer and thinker, and the author of a book on reactionary feminism. In this episode, we talk about what an "egregore" is in the 21st century, and how it's changing the way we think about politics.
00:09:21.780 Yeah, I mean, this is wild speculation on my part, for the avoidance of doubt.
00:09:25.780 This is the biggest "if" in the piece.
00:09:27.880 Everything else is fairly robustly argued from things which are already there and visibly happening.
00:09:32.420 But for me, the moment of wild, hopefully sci-fi speculation is, yeah,
00:09:39.960 it's that story of Elon Musk connecting his new AI experiment
00:09:44.680 to Twitter, to live Twitter.
00:09:48.880 And the reason I think that's so interesting is because it's a qualitative step change
00:09:53.740 in the data set which is being used to inform a large language model.
00:09:59.080 And it struck me, when I first read about this, that if it does
00:10:03.820 what I think it could do, then the significance of that is very difficult to overstate.
00:10:09.640 Because, I mean, there's lots of noise about, oh, you know,
00:10:15.800 is the paperclip machine going to come alive and eat us all?
00:10:19.360 You know, there's just so much noise about AI risk and so on.
00:10:23.400 And I look at AI and I think, well, that's an incredibly fancy autocorrect.
00:10:29.520 This thing doesn't have agency, and it's not going to have agency unless
00:10:34.080 something very, very different happens, because a glorified autocorrect is not an intelligence.
00:10:39.200 It's an autocorrect. It's just a different thing.
00:10:41.540 It's a different order of thing, you know: an inductive,
00:10:45.920 pattern-recognizing machine, versus a human, or some other, who knows if there are other forms of complex intelligence.
00:10:55.600 But, I mean, the one we know about is human intelligence.
00:10:58.600 And that's just a different order of thing. Yes, we notice patterns, but we do a whole lot of other things as well.
00:11:04.460 And the idea that somehow, if we make a pattern-recognizing machine complicated enough,
00:11:08.440 if we make our autocorrect fancy enough, it's going to somehow develop the capacity to think,
00:11:14.480 just seems to me like you haven't really thought very hard about thinking, or you very,
00:11:20.060 very seriously misunderstand what's going on here.
00:11:23.280 However, then I read about it, and I was thinking, well, what would happen?
00:11:29.860 I mean, the egregoric consciousnesses of Twitter are very much alive, in a kind of collectively human,
00:11:36.940 but not exactly human, way. And sometimes their moral impulses don't feel
00:11:44.280 like the kind of moral impulses that a human would have on an individual level.
00:11:48.540 The aesthetic is often different. The disposition and the tone of the collective intelligences on Twitter are often not exactly inhuman,
00:11:57.780 you know, but not very human either. It's a kind of funhouse mirror of our collective cultural self,
00:12:05.200 if you like, and it really brings some things to the fore: vengefulness and mercilessness and combativeness,
00:12:13.000 and a bunch of other qualities. Also an extraordinary collective capacity for pattern recognition,
00:12:19.120 and a bunch of other things, all of which are human in and of themselves.
00:12:22.700 But you put them together and they don't feel like the image that we like to have of ourselves collectively.
00:12:27.760 But they are, in a sense, kind of alive in aggregate, as much as
00:12:35.660 the individual humans that make them up are alive individually.
00:12:38.260 And I was thinking, what happens when a large language model is drawing not from a static set of data?
00:12:47.140 Like, it might be millions of data points, trillions of data points,
00:12:52.060 but you've poured it all into a bucket and that's it, you know, it's all just in the bucket.
00:12:55.700 There's nothing going in and out of the bucket, except when you sort of adjust how you're training the data.
00:13:01.140 What happens if that data set is being updated in real time?
00:13:06.760 What happens if that data set comprises these egregoric intelligences,
00:13:10.360 which are fundamentally created from, drawn from, humans on Twitter?
00:13:15.420 An AI of that kind is potentially alive, at least to a far more significant degree than any glorified autocorrect is going to be.
00:13:28.660 And I don't know how well that's going to work, because I'm not a technologist.
00:13:33.480 But I think it has the potential to be, figuratively speaking, if you like, the divine spark that's hitherto always seemed like it was obviously missing from every glorified autocorrect.
00:13:46.600 That, as it turns out, the divine spark which makes the difference between actual artificial intelligence and a fancy autocorrect turns out to be, collectively, us.
00:13:58.120 You know, I'm thinking, whoa, what happens if that works? And then I thought, holy shit, because this answers another question, which I've been turning over in my mind for a little while,
00:14:12.940 which is: what do you do when 60 percent of your young people want an authoritarian government, but they're also pathologically averse to the idea of being governed by any single human individual?
00:14:29.500 A very, very interesting way to solve that problem. Yeah, I like the notion, or I shouldn't say I like the notion, but I find the notion interesting, that Twitter is selecting for human traits that collectively aren't human,
00:15:16.800 but a very cruel kind of facsimile of human nature.
00:15:21.800 And if that's the thing your artificial intelligence is then using to inform what it's going to be driving, that becomes very scary.
00:15:31.660 I want to get to that aspect you were talking about, how this connects to authoritarianism, and possibly what a Caesarism driven by this would look like.
00:15:42.860 But before we do that, guys, let me tell you about The Blind.
00:15:46.180 For years, Hollywood's been lacking when it comes to stories of redemption.
00:15:49.480 Movies and TV shows have trended towards the antihero, a flawed person who makes no effort to change and just becomes worse and worse as the story goes on.
00:16:07.320 If you or someone you know feels beyond redemption, you need to watch this movie, and you'll see there's always hope.
00:16:13.080 The Blind takes you on an incredible journey through the life of Phil Robertson, giving you an intimate look into the man behind the legend and the trials, triumphs, and values that shaped him through the years.
00:16:22.740 While The Blind wasn't a Blaze Media production, since Phil is such a big part of our Blaze TV family, we wanted to make sure that you had the opportunity to stream it here.
00:16:30.640 Because it isn't ours, we can't include it as part of the subscription.
00:16:33.940 But if you'd rather purchase it and stream it here rather than on Apple or Amazon, we wanted to make sure that you had the opportunity to do that.
00:16:53.660 All right, Mary, so you're talking about this tension that exists right now.
00:16:59.540 We have a lot of people, some of us who might host a show included, who find democracy to be wearing thin, to not have the persuasive power it used to; people feeling a little burnt out on this idea, feeling it's not going to be the thing that moves us into the next epoch.
00:17:18.140 But at the same time, you also have a younger generation especially, I think everybody really, but particularly the younger generation, that is very anti-authoritarian, that does not like the idea of any individual human having power over them.
00:17:32.040 And so we have this dichotomy between these two forces, where you have this movement of young people who are not so big on democracy, don't value it that highly, and might feel like they could move towards a different form of government.
00:17:44.260 But at the same time, while that would open up the opportunity for authoritarian rule to enter the frame in some way, you also have this drive for high degrees of individual liberty, or at least the perception of high degrees of individual liberty, people bucking at the idea that they would ever have some kind of natural hierarchy over them.
00:18:05.260 And you kind of pointed out how this might end up getting resolved, in perhaps the worst possible way, by an egregoric Caesar.
00:18:14.620 Right. I mean, credit where credit's due, I've been thinking about the idea of an AI Caesar ever since
00:18:24.140 it was dropped in passing, as a kind of casual "this could happen."
00:18:30.920 I think it was Zero H.P. Lovecraft talking to Alex Kashuta ages back.
00:18:35.460 And then they just carried on and talked about something completely different.
00:18:39.280 And it's been stuck in my mind ever since: the idea of enthroning the AI as a solution to the perennial problem of what you enthrone when you have a system that doesn't work unless you enthrone something.
00:18:54.960 I mean, the British, you know, we've had constitutional monarchy for some time.
00:18:59.800 That's a fairly good compromise, but a lot of people just don't seem very comfortable with it.
00:19:08.560 So let's take the authoritarian young people first, and then we'll take the anti-authoritarian young people second.
00:19:21.900 So, I mean, I think last time we spoke on this show, we were talking about the totalitarian safetyism of daycare, which is something I've written about and reflected on a great deal, in terms of my experience and observation around little kids.
00:19:38.740 And again, the original observation comes from an old friend, who observed it to me some years ago, when we were in London watching a little crocodile of kids from some local daycare being ushered down the street somewhere near the River Thames.
00:19:54.000 My friend said to me in passing that there was something incredibly totalitarian about that scenario.
00:20:03.060 And I've come to think there's something true about that.
00:20:06.540 And I've come further to think about what it means that we raise so many children in those kinds of all-encompassing totalitarian nurture scenarios, typified by the daycare: that all-enveloping kind of cyborg pseudo-mummy
00:20:24.720 scenario you have in daycare, which is delivered, to be clear, with the best of intentions, and by kind, nice, caring workers most of the time.
00:20:33.620 But its perverse incentives are such that you don't let kids take risks.
00:20:38.540 They are micromanaged in terms of how they resolve interpersonal conflicts.
00:20:42.100 They are prevented from ever injuring themselves.
00:20:44.860 There is a whole bureaucracy that surrounds minor injuries.
00:20:48.120 The whole thing is very, very unnatural relative to just being around your siblings and your mum, or playing outside, or whatever.
00:20:56.840 And the older I've got, the more I've come to think that actually the crisis of civil discourse and the crisis of individual agency, which is what ultimately underpins the longing for authoritarianism, is rooted in that experience of being over-managed and
00:21:15.000 over-scheduled, pretty much all the way from infancy, sometimes from two weeks old, in some cases.
00:21:21.640 And this explains in turn why the longing for authoritarianism is now palpable across young people on both the left and the right.
00:21:29.160 You know, this isn't just your fascistic blue-hairs who want to burn women like me at the stake for knowing that biological sex exists.
00:21:40.080 This is also the kids on the right who want an actual Caesar.
00:21:47.380 And I mean, I don't have data on it, but it ought to be empirically researchable whether or not there's any kind of a link between... actually, no, there is research on this.
00:21:55.880 There is a strong correlation in the research that people have done between helicopter parenting and a desire for authoritarian governance, which is kind of a no-brainer when you think about it.
00:22:05.880 Like, duh, you were raised under a stifling totalitarian system.
00:22:11.400 Of course you think that was normal.
00:22:15.740 But yeah, so that's the authoritarian streak in young people.
00:22:20.560 The anti-authoritarianism, I also think, is in evidence across both the left and the right.
00:22:24.700 I mean, I'm cautious about poking hornets' nests on the weird right.
00:22:31.960 But in my observation, there are a lot of guys there who are very keen on the idea of natural aristocracy.
00:22:39.020 But that idea of natural aristocracy doesn't seem to have a correspondingly well-developed theory of honourable hierarchy, or at least I'm not aware of one.
00:22:51.940 As in, it would be great to be a Hellenistic lord, but nobody's really thinking very much about how you could actually have quite a nice life as a helot.
00:23:06.600 So in a sense, everybody's still imagining themselves as the big I Am. And if you're imagining yourself as the authority, you're still not really considering what it would be like to take orders from an authority, if you see what I mean.
00:23:23.040 And on the other side, on the left, there's just the desire to dissolve everything and vanish back into the oceanic mother-swarm, which, in a way, is much more direct.
00:23:38.940 But I kind of feel like it's the same picture in some respects, because in both cases there's a deep reluctance to consider the possibility that hierarchical structures might actually be benign and in your best interests, and that sometimes shutting up and taking orders is better.
00:23:59.340 That piece is somehow missing on both sides of the discourse, even if the two interpretations of what's wrong with that, or what we should be doing instead, are very different.
00:24:09.740 So these are the two pieces: the authoritarian yearning in young people particularly, or really in the culture as a whole, but most pronounced in young people; and then the anti-authoritarian yearning.
00:24:21.820 And meditating on all of that, I've come to wonder whether the solution that ends up being most palatable to a lot of people mightn't be this: if it's so intolerable to set one person over any other, whether you're arguing about who gets to be the natural aristocrat or you're just violently allergic to the idea of hierarchy full stop, whatever the reasons are, the only solution that people might be able to settle on
00:24:51.820 could potentially be an AI monarch. And if we now have a way to find a route towards that monarch actually having a divine spark, i.e. us, then, yeah. It's speculative, but I can see how we get there.
00:25:13.520 Yeah, the really interesting part of that is that basically it would be swapping constitutional rule for this, right? The beauty of it, for a lot of people in the Enlightenment, the reason it was so attractive to so many people, is the idea of self-governance, even though that's not exactly how it works out.
00:25:34.260 So the idea that you didn't have anybody above you, that there's this consensus, this committee, this popular sovereignty ruling over you, replacing the aristocracy or the direct monarchy, that was very inviting for a lot of people, because nobody wants to be ruled.
00:25:51.300 However, I think people are losing faith that that in and of itself is going to continue to function, but they still don't want to replace it with a direct authority like you're talking about.
00:26:01.700 And so the way to kind of shift the program, without actually putting any one human in charge, is to replace it with this kind of AI monarch.
00:26:11.400 And in that way, you still have no human over you, right? You still get to sell the idea of scientism and consensus as the thing that actually drives your social organization.
00:26:24.280 You know, it's always objective. You can still pretend to remove the political, in the Schmittian sense, from any of these conflicts, but you do put something that is objectively more authoritarian into that position.
00:26:38.740 And so no one has to actually deal with the fact that there's going to be a decision-making apparatus that will be motivated.
00:26:48.160 They don't have to look at it and see one faction winning, or see themselves being ruled.
00:26:53.240 And instead they can substitute, just like they do with constitutional governments,
00:26:59.580 the idea that they are governing in some way through the AI, rather than actually being ruled over by an individual with motivations that might not be their own.
00:27:09.740 Yeah. And I can honestly see, especially given how narrow the Overton window has palpably become, and how constrained the voting choices have palpably become, to a degree that's really very observable to most people now,
00:27:29.520 I can honestly see how it might come to feel as though adding your voice to the egregore is a more true and direct form of democratic participation than casting a vote once every five years.
00:27:42.480 I mean, that to me feels totally plausible. In a sense, it's kind of Tay on steroids, isn't it? But it has the nuclear codes.
00:27:55.600 Well, and it will have to be properly lobotomized. There's no way they're going to let the actual thing out into the wild.
00:28:04.300 No, no, no. But I mean, in a sense, this is already happening. Not Tay on steroids, but the introduction of AI decision-making, AI proceduralism, if you like, into the soft tissue of everyday human governance.
00:28:25.600 That's already happening. It creeps in here and there. I believe it's sometimes used in processing court cases. I'd have to go rummaging for examples; I didn't want to blow up the piece by getting into all of that.
00:28:43.300 But in growing proportions of administration, for example, across deciding pricing or adjudicating insurance claims,
00:28:58.920 I believe there are now AI algorithms involved. And it seems plausible to me that an increasing proportion of the decisions which might seem trivial, but which are materially very important to individual people's lives,
00:29:15.720 and which can be just incredibly exasperating when you find yourself going around and around the bureaucratic circles, will come to be governed by those sorts of mechanisms, by the paperclip machine,
00:29:25.580 with who knows what safeguards, or lack of safeguards, for actually making decisions which don't have six fingers, as it were.
00:29:34.560 And, you know, I don't think there's any immediate prospect of ending up with an actual egregoric Caesar, but I can see an entity of that kind symbolically being enthroned, and then hundreds of thousands of shittier versions of it just propagating themselves throughout
00:30:04.560 every blood vessel of the bureaucracy, such that
00:30:11.180 all your insurance decisions and healthcare decisions and so on end up having six fingers.
00:30:19.720 And yeah, I think this is going to happen.
00:30:23.060 In fact, I already know this is happening in certain sectors. And people, I think, are going to feel... you know, again, you used the term "swarm governance,"
00:30:33.700 and I think that's exactly right.
00:30:35.800 We are so enamored with this idea of inhuman governance.
00:30:39.540 At first it was through legislatures or committees, and then it became bureaucracies and experts.
00:30:46.780 And I think the next step in taking these decisions out of human hands will be artificial intelligences employed in every one of these interactions that you're talking about.
00:30:56.800 Right, because we're already seeing, actually more from the right than from the left, but really from both sides,
00:31:02.560 we're increasingly seeing the objection to the epistocracy being that it's politically biased. Whether the accusation is that rule by experts is just a pork-barrel politics exercise,
00:31:19.580 shoveling resources into your own pockets under the guise of ruling people reasonably and justly and rationally, according to The Science,
00:31:29.540 or whether it's the other lot claiming that the infrastructure has been captured by people who hate us, and therefore we're going to burn it all down,
00:31:41.460 rule by experts is reaching the end of the road in terms of political legitimacy.
00:31:50.640 It seems to me that we're going to end up with authoritarian governance of one sort or another, and it's either going to be a human Caesar,
00:31:59.540 or it's going to be an egregoric Caesar.
00:32:02.220 And yeah, I'm giggling maniacally, but I'm not very happy about either of those prospects.
00:32:08.480 I'm just watching it happen, like the guy riding down on the bomb at the end of Dr. Strangelove.
00:32:14.340 It's that kind of giggle, I think.
00:32:17.720 Well, it does end up putting us in... I mean, if we really do hand over all of these micro-decisions to artificial intelligence in that way, it really does close that self-exciting feedback loop, right?
00:32:28.600 Like, we're kind of done having any limiter on digital acceleration, because humans have become completely unable to mediate their own needs.
00:32:39.960 They're no longer able to make any decisions.
00:32:42.320 Everything is reliant on AI, and while AI starts in the service of human need, it doesn't really need to stay there,
00:32:51.340 once basically all human input has been left out of this.
00:32:55.300 And I think also, interestingly, you put something in there that really got my gears turning, which was the idea that you would basically, like, comment for the algorithm for democracy.
00:33:07.920 Because a number of thinkers, from Bertrand de Jouvenel to Hans-Hermann Hoppe, have pointed out that basically democracy, the idea of popular sovereignty, actually grew government significantly.
00:33:23.280 There was an explosion of government under the idea that you would be doing it in the name of the people.
00:33:28.060 There's only so much you can do in the name of the aristocracy, or as the monarch.
00:33:32.180 But when you're acting in the name of the people, you can start the levée en masse.
00:33:36.640 You can mobilize the entire nation to war, and you can have them own the means of production, because you're doing it for the good of the people, and whatnot.
00:33:45.980 And this seems like the next logical step in securing the total state, basically: the idea that you would feed into the algorithm, and the algorithm would be the most organic reflection of the will of the people,
00:34:01.120 because it's literally skimming the will of the people off the top of the internet.
00:34:05.480 And there's just no denying this force anything, because it truly is channeling the zeitgeist in the most open-vein way possible.
00:34:13.960 Which is clearly not actually true.
00:34:17.200 I mean, some people are considerably more online than others.
00:34:20.020 And you don't have to spend very long thinking about it to realize that it would not be remotely representative.
00:34:26.400 You know, there are a lot of old people who have no idea how any of this works.
00:34:32.200 A very nice older woman, a historian who I did a debate with recently, asked me at the dinner afterwards... she said, "I send a lot of emails."
00:35:34.840 I mean, and yet, you know, I'm not completely without hope.
00:35:39.140 Because, yes, if you follow that thought through and you think, well, if that works the way it's supposed to, then yes, we're heading to really frighteningly dystopian places.
00:35:48.900 However, in as much as I have any optimism, and I always end up saying this to you, it's just that things aren't going to be that horrible, because they'll go wrong before they get that horrible.
00:36:01.000 Yeah, well, you crushed my hope of that in this article, because that's always my hope: that before we completely start sacrificing to the AI gods, civilization will simply stop scaling.
00:36:14.360 Like, there'll be a competency collapse; the networks won't work.
00:36:18.620 Yeah, honestly, I think the thing which will save us will be competency collapse.
00:36:33.760 But yeah, it seems fairly clear to me that if we volunteer to become the sort of obese, floating, screen-consuming guzzlers out of WALL-E, then we're not going to be able to maintain the complex systems which feed the AI for more than a generation.
00:36:55.300 I mean, this is already happening to air traffic control. I struggle to see how we'll maintain the kind of complex systems which the AI is intended to serve.
00:37:08.980 And you can imagine some sort of dark science fiction future where all of the AI algorithms are still working perfectly, but none of the material realities to which they were supposed to correspond really exist anymore, because we just forgot how to maintain them all,
00:37:22.220 or just went back to subsistence farming while the AI does its thing in perpetuity in the cloud.
00:37:29.300 Yeah, we're sort of back into kind of bonkers science fiction again.
00:37:33.460 But yeah, I think we'll be saved by competence collapse, and people will just go back to doing things in slightly less complicated ways,
00:37:38.800 and on a smaller scale, which might be a better version of actually existing post-liberalism than the actually existing post-liberalism we have,
00:37:50.460 even though it's arguably going to be a bumpy ride getting there.
00:37:54.860 Well, so I had hoped that that was going to be the case, but now that you've given me the vision of this kind of rolling, consuming blob of an AI Caesar, I wonder if they can stretch things much longer than I previously thought, because I thought the political organization had to fail first.
00:38:16.280 But this seems like a way to extend that kind of impersonal political organization well beyond its current shelf life, while maintaining at least some semblance of functionality.
00:38:27.380 I guess the real question is: can the AI, can the capital, whatever you want to call it, escape territoriality entirely while just leaving the destitution behind?
00:38:39.240 Like, can you get this Elysium version, or cyberpunk version, of humanity, where you just have the best of the best skimming off the top and living this amazing life, while everyone else is just maintained and governed by this kind of AI Caesar that holds things in stasis?
00:38:58.300 I guess you'd say no, the systems that would maintain it would fall apart.
00:39:02.980 But if they can farm the average chud enough, will they be able to hold it up for an extended period of time? I guess that's the question you'd be asking.
00:39:15.260 I mean, the other thing that might save us, along with competence collapse, is fertility collapse.
00:39:21.340 You know, if we're being collapse-pilled here.
00:39:27.180 I don't know. I just filed an article about pandas and panda diplomacy.
00:39:32.420 Because, I mean, pandas are a principal vector for conversations between China and the US, and have been for a long time.
00:39:40.300 But they're also the byword for "does not mate in captivity."
00:39:45.560 And it struck me, thinking about it, that there's an analogy there
00:39:51.340 between the sort of technologization of efforts to perpetuate the panda and the reasons why pandas are just not doing it.
00:40:06.540 You just have to let them do it their way.
00:40:08.200 And then, you know, pandas make more pandas.
00:40:10.180 But under the sort of highly artificial living conditions, the pod and the bugs, they just don't do it.
00:40:18.100 But, you know, beyond a certain level of pod-and-bugs, people just don't want to make more people.
00:40:25.520 And that's increasingly a problem.
00:40:28.700 And then there's the sort of growing, I think, collective loss of interest in how things work, which to me is now very obvious.
00:40:41.160 It's not just that we've forgotten, or that we've become more stupid, or something like that.
00:40:46.160 Even from the elites down, people just do not care how things work,
00:40:50.280 with a relatively small minority of exceptions.
00:40:54.440 People just don't care how things work.
00:40:57.940 People just don't think like that.
00:41:00.060 They're much more interested in moral, if you like, spiritual realities.
00:41:02.860 You know, we're becoming, in a weird way, a much more spiritual people, but the consequence of that is that nobody can be bothered to mend the potholes anymore, because they just don't care.
00:41:11.480 So we're going back to the sort of roads that we had before people started caring about potholes.
00:41:15.440 And at the same time, while we're all sort of busy spiritualizing and becoming more interested in theology than in material things, and yeah, it's not a theology I'm a fan of, but it is theology.
00:41:29.240 We're becoming more interested in moral issues than material ones.
00:41:35.260 We're also being herded into kind of panda enclosures and then told to have sex and make more babies, because that's the only way the merry-go-round can keep turning.
00:41:45.840 And I sort of feel like, whatever the slightly stickier version of voting with your feet is, people are doing it.
00:41:52.100 And I don't completely blame them. Yeah, it's just not happening.
00:41:59.840 And the medium-term consequence of that will just be... I mean, who knows what it'll be?
00:42:06.100 You know, there are various kinds of nightmare scenario, but none of them look like the sort of steady state that could be governed by an AI,
00:42:12.640 even a very nimble and sort of quasi-intelligent egregoric one powered by Twitter.
00:42:19.820 I just struggle to see a Twitter-powered egregore really knowing what the hell to do with the various kinds of collapse scenario that are
00:42:29.240 downstream of major population implosions.
00:42:31.600 Yeah, it's really interesting, because Oswald Spengler actually predicted this in The Decline of the West.
00:42:36.120 He said that we were going to walk away from science, and people were like, no, that's not a thing.
00:42:41.980 He's like, no, I mean, just think about it.
00:42:43.500 People don't want to be constrained in this way.
00:42:46.420 Like, the human mind does not want to have these eventualities, these absolutes, even if they are true; it can't properly exist with them.
00:42:55.640 Like, the metaphysical side of it is simply unequipped to deal with certainty.
00:43:03.560 You're just going to have your top minds, like, walk away and study something else.
00:43:06.980 And, you know, you won't lose the knowledge per se.
00:43:09.880 Like, it's not going to get burned down; you don't have a Library of Alexandria situation. But you're just going to have people who are incapable of maintaining it.
00:43:18.080 You're going to end up... what's the quote on Twitter? "The arc of history is long, but it bends towards Warhammer 40K."
00:43:25.840 Like, you're just going to have a bunch of people who have these ancient technologies.
00:43:29.920 They're amazing, but, like, nobody knows how to maintain them.
00:43:32.980 Only a few people can actually deploy them.
00:43:34.840 And I've made this argument myself: a lot of this technological infrastructure that we think has been projected across the world is maintained only because first-world nations pour just insane amounts of time and energy and intellect and money into maintaining it.
00:43:50.980 You're not going to keep that infrastructure grid up globally without that constant, amazing amount of surplus.
00:44:08.060 And just bringing this back: are we going to end up with all of the radically terraformed chuds, "chud" was the word you used, who are just sort of kept docile and pacified by total AI management?
00:44:22.640 That seems implausible to me, for the simple reason that total AI management becomes less and less technically possible the further out you go towards the periphery.
00:44:30.880 I mean, returning to the COVID example again, lockdowns just didn't happen in most African countries, because it's obviously a dumb idea when most people have a subsistence life.
00:44:43.420 You can't lock people down, because they'll just die.
00:44:46.840 Like, you know, you do not save people's lives by making them stay in their huts.
00:44:53.000 And if you extend that principle, you might be able to terraform some of the chuds in some of the cities near the big data center, figuratively speaking.
00:45:03.640 But the further you get out into the Mad Max Badlands, the less workable that's going to be, because the systems start to break down: because you don't have enough people to maintain them, or because the power is not reliable enough, or because nobody's mended the roads, because they don't care about the roads.
00:45:19.400 So yeah, I don't know what kind of a future we're looking at.
00:45:24.600 But if it's a best-of-the-best, creepy techno-futurist elite, the rest is going to look more sort of Guillaume Faye than WALL-E, if you see what I mean.
00:45:37.220 Okay. So now that we're both doomer-pilled, and really banking on the optimism of total systemic collapse,
00:45:45.300 I did want to ask you one more thing before we go. We kind of blew past it a little bit because we were getting to other topics, but you mentioned that the left is obviously anti-hierarchy.
00:45:56.720 Like, that's just what it does. But the right seems to have a problem searching for and understanding what would be a right order, a sustainable hierarchy that would benefit the whole.
00:46:10.240 And I wanted to talk about that for a second, because I think you're right that that is a problem. I think you have the struggle between, say, trad Caths, and then these kind of Nietzscheans who are just like, well, will-to-power it, and whoever wins it, wins it, and who cares if it's good for the rest.
00:46:25.380 But I think the reason the right is searching for the proper understanding and orientation for hierarchy is that there are so few actual organic communities in which you could actually set this hierarchy of virtues.
00:46:39.680 You can't develop a kind of virtue ethic without a community in which you could actually practice those virtues. And therefore you can't really understand, I think, in that instance, a beneficial aristocracy, because there is no shared concept of the good for a community in which that aristocracy would rule.
00:47:00.440 Yeah, I think that's absolutely right. You know, fatherlessness, arguably... I mean, I don't claim that everything comes down to parenting, but a lot actually comes down to parenting. And if kids, especially boys, don't grow up with fathers... you know, all other things being equal, assuming you have a sort of averagely functional relationship with your dad, that is a template for what benign authority looks like, or should be.
00:47:30.440 If you grow up without that, I mean, we know what that looks like. It looks like urban squalor. And it's not benign hierarchy, and it's certainly not very organized or very pro-social. Again, if you raise kids in daycare, that's a completely different model for how authority works, or even what authority is.
00:47:59.140 It's a weird fusion of authority and nurturing, you know, with kind of therapeutic overtones, none of which has any conceptual space at all for the idea of benign authority: for the possibility that somebody could be set in authority over you and actually care about your interests.
00:48:20.320 And so I don't think it's surprising that that idea has bled out of work. It's bled out of the culture. It's bled out of education,
00:48:30.320 pretty much every cultural institution you care to think of. It's just pretty much unthinkable and pretty much unsayable, which is very strange, because it feels like it's happened pretty much over my lifetime. I mean, when I went to school, we had desks in a row, and information was imparted to us by our teachers. And now my daughter goes to school, and she sits around a table and they talk to one another.
00:48:54.240 Let me tell you about Kagan teaching. Let me explain it to you; I have sat through far too many trainings on exactly what you're describing. Sorry. Go ahead.
00:49:04.440 Oh, that's... oh God. Yeah, that's right. This is what you used to do.
00:49:07.640 Yes. I don't just know the phenomenon you're talking about; I have been trained in its deployment.
00:49:15.380 Yeah. And to me, it's one of the most evil and malign interventions in the culture over the last century.
00:49:25.420 Because I can't think of a more insidious and totalizing way of conveying the idea that truth is arrived at by consensus,
00:49:34.640 and has no external validity, and can't ever just be delivered by an authority.
00:49:40.800 Not to mention the fact that it's just kryptonite for nerds and bookish people and people who just like learning stuff.
00:49:48.140 I mean, the whole thing is just evil on so many levels, putting kids around tables in the classroom.
00:49:53.180 Just how is anybody expected to learn? I mean, this isn't even something I went through, but I'm traumatized on behalf of every kid who's been subjected to that
00:50:00.900 and has actually managed to learn something just in spite of it.
00:50:03.760 But yeah, so then it percolates out into the whole culture, and no wonder we find it so difficult to imagine that somebody could just say, you know, shut up and do as you're told.
00:50:17.400 And the possibility that somebody could say that and actually be right... I mean, it's easy to imagine people doing that, but the other part is almost unthinkable.
00:50:25.540 It's become a sort of moral... I feel like I've just committed blasphemy by voicing the possibility that somebody could say "shut up and do as you're told" and actually be right.
00:50:35.660 And not only be right, but say it in a way which is in your best interests.
00:50:42.720 I mean, it's the hard problem.
00:50:45.020 One of the hard problems in political thinking and writing at the moment, I think, is finding a way back to being able to occupy that space,
00:50:55.780 and to reclaim that cultural and political territory, and to do so in a way which isn't just: we're all going to have a big Nietzschean dogfight,
00:51:06.780 and then whoever wins just gets to be the boss by definition.
00:51:10.200 Because there's somehow no space in there, underneath, for how people organize themselves constructively.
00:51:17.940 You know, and what that looks like... maybe we just have to go through this in order to find our way back.
00:51:27.140 Maybe we actually just have to go through several generations of mafia.
00:51:32.540 And I mean, I guess the British monarchy did that for a hundred years, give or take. Maybe sometimes that just has to happen for a bit, until people finally come up with a settlement that they can live with.
00:51:52.840 I think that society does cycle from the bureaucratic back to the feudal.
00:51:57.160 And, you know, there's a great passage in de Jouvenel where he says, basically, of course every king begins as a bandit, but eventually he figures out that caring for the people he's fleecing is actually a better long-term investment than fleecing them completely.
00:52:15.740 And that's actually the moment where the idea of the rex is born: the king is born out of this understanding that the good of the people over time is better than simply the Nietzschean overpowering of them.
00:52:30.540 But I think that cycle only turns once that collapse of complexity we've been talking about occurs, and we go back through that cycle where people rebuild into tighter communities that understand the good of, and the need to work for the good of, the collective,
00:52:52.640 but without the requirement of this underpinning of technology that abstracts it so much that there's no way in which they can actually determine the virtue of the hierarchy in which they live.
00:53:03.580 So you think we have to go from the tropical longhouse back via the Anglo-Saxon longhouse.
00:53:08.860 The only way out is through the longhouse?
00:53:12.720 But no, it's the other longhouse.
00:53:16.240 The other longhouse, where your lord indulges in displays of boasting about his military prowess, and hands out gold in exchange for killing your enemies.
00:53:41.480 Now that we've ended that discussion on an extremely hopeful note, Mary, is there anything that people should be looking forward to from you, any place they should be looking to find your work?
00:53:54.060 I've started work on a new book, which I'm very excited about.
00:53:57.700 I mean, at the moment, it's an equally extremely online set of challenges.
00:54:03.580 I don't know if you're up to headings, but I have so much to tell you about that, because I think you'll be interested.
00:54:08.480 It's about the end of print culture. The working title is The New Reformation, which, you know, we're in, baby.
00:54:41.080 All right, guys, make sure that you're checking out Mary's work.
00:54:43.060 And of course, if it's your first time on the channel, please make sure that you go ahead and subscribe.
00:54:47.400 And of course, if you want to hear these broadcasts as podcasts, you can go ahead and subscribe to The Auron MacIntyre Show on your favorite podcast platform.