The Auron MacIntyre Show - November 22, 2023


Egregoric Caesar | Guest: Mary Harrington | 11/22/23


Episode Stats

Length

54 minutes

Words per Minute

191

Word Count

10,506

Sentence Count

477

Misogynist Sentences

4

Hate Speech Sentences

8


Summary

Mary Harrington is a writer and thinker, and the author of a great book about reactionary feminism. In this episode, we talk about what an "egregore" is, how AI might be wired into it, and how that could change the way we think about politics and authority.


Transcript

00:00:30.320 Hey everybody, how's it going?
00:00:31.960 Thanks for joining me this afternoon.
00:00:33.280 I've got a great stream with a great guest.
00:00:35.120 I think you're really going to enjoy.
00:00:37.700 So, Mary Harrington is a really interesting author.
00:00:40.460 She's got a great book out there about reactionary feminism.
00:00:43.980 Don't let that title scare you.
00:00:46.180 Mary, I'm telling you, I like the phrase,
00:00:48.140 but every time you're on, someone's always like,
00:00:50.220 feminism, it's got to be the devil.
00:00:52.720 She's subversive, and so I've always got to explain
00:00:55.220 there's more to it, but thanks for joining me.
00:00:58.240 Thank you for having me.
00:01:00.000 So you wrote this great piece about egregoric
00:01:04.220 Caesarism, which is the most online title a human being
00:01:10.180 could ever craft.
00:01:12.660 No, no, no.
00:01:13.140 I think I actually came up with a more online one once
00:01:15.780 about hyperpalatability and male-to-male transsexuals,
00:01:19.860 but it's definitely in the top five.
00:01:22.080 Yeah, I appreciate that that's a category that's really stuffed for you,
00:01:26.160 but yeah, it's a very online title.
00:01:31.620 However, it had some really interesting twists and turns that got me thinking
00:01:35.940 about a lot of things that I've been exploring when it comes to what you call
00:01:39.860 governance by swarm and the possibility of how this is going to transform the way that
00:01:45.080 people look at what seems to be a trend towards possibly authoritarian government.
00:01:51.860 But before we dive into all that, guys, let's hear really quickly from our sponsor.
00:01:56.260 These days, it's impossible to thrive with just one job.
00:01:58.600 Between increasing living costs, paying off debts, and planning for the future,
00:02:02.300 things like buying a home, building savings, and even going on vacation can seem like fantasies.
00:02:07.060 If your goal is financial freedom, you could start taking on more hours at your current job,
00:02:10.920 work towards a promotion, or try putting your money into something risky like stocks,
00:02:15.220 cryptocurrencies, or even a side hustle.
00:02:17.260 But at the end of the day, do you really want to sacrifice time and energy
00:02:20.260 that could otherwise be spent with your loved ones or on your hobbies just to make a living?
00:02:24.440 Luckily, you don't have to hustle to reliably make more money.
00:02:27.260 All you have to do is job stacking.
00:02:28.980 Job stacking is the best way for regular people, regular employees,
00:02:32.420 to unleash their earning potential and increase job and financial security.
00:02:36.000 How?
00:02:36.500 By working multiple jobs, but without burning out, or more importantly,
00:02:39.880 getting caught by corporate overlords.
00:02:41.940 Job stacking allows you to reliably receive paychecks from multiple employers each month
00:02:45.840 without having to work more than eight hours a day.
00:02:48.380 You don't have to be in tech or any particular field or industry to do it as long as you can work
00:02:52.820 remotely.
00:02:53.580 If you've thought about working multiple jobs, but you're not sure how to start or are afraid of
00:02:56.860 getting caught, get the fundamental job stacking course today and learn all of the secrets
00:03:00.920 on how to sustainably work multiple full-time jobs from the foremost expert on the matter,
00:03:05.880 Rolf Halza, author of Job Stacking.
00:03:08.020 Rolf has worked multiple full-time jobs since 2018, including hybrid jobs, and has condensed
00:03:13.040 all of his experiences and wisdom into a single four-module online course so you can start
00:03:18.480 proficiently job stacking without having to make mistakes, figuring things out on your
00:03:22.440 own, or reinventing the wheel in the process.
00:03:25.320 Go to www.jobstacking.com and enter the promo code ORIN to get a special discount.
00:03:30.760 All right, so before we dive too far into this extremely online topic, let's define one
00:03:37.500 of our terms here because an egregore, it's something I've used in my writing.
00:03:42.000 It's something that gets knocked around online, especially in kind of dissident-right or in
00:03:47.500 our X circles.
00:03:48.820 But I think a lot of people won't know what we're talking about here.
00:03:51.880 So can you explain what an egregore is?
00:03:54.300 So an egregore is a term that comes, I guess, out of occultism, which is used to describe
00:04:01.140 an entity which is composed of a swarm.
00:04:04.460 And there are people who will tell you that egregores have agency in their own right, as
00:04:08.940 in they're channels for some kind of consciousnesses which are not human.
00:04:12.700 And there are plenty of other people who just use it as a handy shorthand for the emergent
00:04:20.760 phenomenon, which I think anybody who's ridiculously online will notice, of collective intelligence,
00:04:27.980 which is unmistakably evident.
00:04:32.000 I think it's particularly evident on Twitter where the discourse just moves quickly enough
00:04:37.980 for mood shifts to be really obvious and for consensus to form — you can watch
00:04:44.340 consensus formation happening in real time.
00:04:46.680 I remember it was extraordinary watching it happen during the early months of
00:04:50.920 COVID, for example — the early weeks, I suppose — when people really weren't sure for a little
00:04:55.400 while, you know, which way to jump on what the right opinion was to have.
00:05:00.000 And there was some genuine back and forth, even amongst sort of quite respectable
00:05:06.200 opinion people, on which way we should go — you know, should we be masking, should we
00:05:09.520 do this, should we do that.
00:05:10.520 And then all of a sudden the consensus just set in, and it ripples.
00:05:15.440 It's like the sort of shockwave that travels throughout the whole
00:05:19.600 networked intelligence.
00:05:21.340 And suddenly everyone's just saying the same thing.
00:05:23.440 I mean, there's, again, another very online meme about, you know, the NPC who gets the new
00:05:28.340 software chip implanted in his head and then he's just like, I support the current thing.
00:05:32.200 But this is, I mean, people make the meme and people recognise the meme and laugh about the meme
00:05:35.760 because you can watch it happening if you spend enough time scrolling.
00:05:40.740 And it's a phenomenon, you know, at a micro and a macro scale, it's very obvious the kind
00:05:46.440 of very online network phenomenon of consensus formation that happens within distinct in-groups
00:05:53.240 on pretty much any topic you care to name.
00:05:55.260 And then once the consensus is there, it'll be viciously policed.
00:05:59.220 And God help you if you dissent from it in any obvious ways, unless you can form an alternate
00:06:04.580 consensus of your own and create another little reality bubble somewhere else.
00:06:08.340 So when I use egregore or egregoric, I'm just swerving the question of whether these composite
00:06:14.740 intelligences have agency in their own right or are channels for something else.
00:06:19.580 I just have no idea.
00:06:20.800 I'm not qualified to comment on that.
00:06:22.360 But for me, it's a really useful shorthand for something which I just see happening, which
00:06:27.320 is collective intelligences and which really do move things in the outside world.
00:06:32.880 I mean, the consensus which formed on COVID being a case in point, that changed things
00:06:38.660 for a lot of people and ruined lots of people's lives and impelled major political shifts and
00:06:44.360 all sorts of stuff.
00:06:45.280 I mean, we're all living with the consequences now.
00:06:47.100 People's kids in particular are living with the consequences now.
00:06:49.780 And that sort of egregoric intelligence, which has pretty much faded away now, isn't
00:06:59.500 just some abstract thing.
00:07:01.780 It isn't just an abstract phenomenon which stays confined to the internet — it ends up
00:07:06.860 percolating out to the material world as well.
00:07:09.700 So, for me, it's a useful way of understanding a real phenomenon without getting
00:07:15.600 too obviously theological and without discounting the possibility that there's something there, because
00:07:21.580 it's genuinely spooky to watch.
00:07:23.880 Yeah, no, I think, like you said, a lot of people can observe this.
00:07:27.060 And it's something that we certainly saw well before the information age.
00:07:31.200 You know, this originally comes from the idea that you would have kind of this shared collective
00:07:35.580 consciousness inside different tribes or nations and things.
00:07:38.840 Richard Dawkins hit on pieces of this when he looked at memetic
00:07:43.240 information, the meme and these kinds of things, having lives of their own.
00:07:47.220 And of course, the first time I interacted with this concept was Curtis Yarvin and his
00:07:52.760 idea of the cathedral.
00:07:54.300 You know, this being the kind of the global American empire's egregore and the way that
00:07:59.940 it kind of manifests itself and drives things and is self-interested.
00:08:05.220 But I think the most interesting intersection I've had with this is with Nick Land. The thing
00:08:10.080 that made Nick Land so interesting to me is he took kind of that understanding that Curtis
00:08:14.920 Yarvin had of the cathedral and he gives it this supernatural quality, where these things
00:08:19.980 are no longer simply, you know, an amalgamation of different ideas that have kind of gained
00:08:25.340 a little bit of momentum on their own.
00:08:27.340 But these are truly different consciousnesses.
00:08:29.600 They're separating themselves from the interests of the things that have incubated them.
00:08:33.540 And they've got their own kind of metaphysical drive.
00:08:36.820 They are conduits for things outside, greater than ourselves.
00:08:42.120 And so we might get into that deeper at the end there once we've explained this topic.
00:08:46.760 But but I do find that part of it particularly fascinating.
00:08:49.720 So one thing that you really hit on in the piece was looking at the way that AI, and
00:08:56.780 particularly Elon Musk's development of marrying AI with Twitter, might change kind of
00:09:03.360 the game when it comes to artificial intelligence, because it will actually attach it directly
00:09:08.840 to this zeitgeist or kind of this egregore, this online consensus.
00:09:14.240 And it would somehow change the way that artificial intelligence interacts with
00:09:19.580 the average person.
00:09:21.780 Yeah, I mean, this is wild speculation on my part, for the avoidance of doubt.
00:09:25.780 This is the biggest if in the piece.
00:09:27.880 Everything else is fairly robustly argued from things which are already there and visibly
00:09:32.420 happening. But for me, the moment of wild, hopefully sci-fi speculation is, yeah,
00:09:39.960 it's that story of Elon Musk connecting his new AI experiment
00:09:44.680 to Twitter, to live Twitter.
00:09:48.880 And the reason I think that's so interesting is because it's a qualitative step change
00:09:53.740 in the data set which is being used to inform a large language model.
00:09:59.080 And it struck me, when I first read about this, that if it does
00:10:03.820 what I think it could do, then the significance of that is very difficult to overstate.
00:10:09.640 Because, I mean, there's lots of noise about, oh, you know,
00:10:15.800 is the paperclip machine going to come alive and eat us all?
00:10:19.360 You know, there's just so much noise about AI risk and so on.
00:10:23.400 And I look at AI and I think, well, it's an incredibly fancy autocorrect.
00:10:29.520 Like, this thing doesn't have agency, and it's not going to have agency unless
00:10:34.080 something very, very different happens, because a glorified autocorrect is not an intelligence.
00:10:39.200 It's an autocorrect. It's just a different thing.
00:10:41.540 There's a different order of thing, you know — an inductive,
00:10:45.920 pattern-recognizing machine versus a human, or some other — who knows if there are other forms of complex intelligence.
00:10:55.600 But, I mean, you know, the one we know about is human intelligence.
00:10:58.600 And that's just a different order of thing. Yes, we notice patterns, but we do a whole lot of other things as well.
00:11:04.460 And the idea that somehow, if we make a pattern-recognizing machine complicated enough,
00:11:08.440 if we make our autocorrect fancy enough, it's going to somehow develop the capacity to think —
00:11:14.480 it just seems to me like you haven't really thought very hard about thinking, or you
00:11:20.060 very, very seriously misunderstand what's going on here.
00:11:23.280 However, then I read about this and I was thinking, well, what would happen?
00:11:29.860 I mean, the egregoric consciousnesses of Twitter are very much alive in a kind of collectively human,
00:11:36.940 but not exactly human, kind of a way. And sometimes their impulses, their moral impulses, don't feel
00:11:44.280 like the kind of moral impulses that a human would have on an individual level.
00:11:48.540 The aesthetic is often different. The disposition and the tone of the collective intelligences on Twitter are often not exactly inhuman,
00:11:57.780 you know, but not far off. It's a kind of funhouse mirror of our collective cultural self,
00:12:05.200 if you like, and, you know, it really brings some things to the fore — vengefulness and mercilessness and combativeness,
00:12:13.000 and, you know, a bunch of other qualities. Also an extraordinary collective capacity for pattern recognition,
00:12:19.120 and a bunch of other things, all of which are, you know, human in and of themselves.
00:12:22.700 But you put them together and they don't feel like the image that we like to have of ourselves collectively.
00:12:27.760 But they are, in a sense, kind of alive in aggregate, as much as the humans,
00:12:35.660 the individual humans that make it up, are alive individually.
00:12:38.260 And I was thinking, what happens when a large language model is drawing not from a static set of data?
00:12:47.140 Like, where you've poured, you know, millions of data points, trillions of data points —
00:12:52.060 but you've poured it all into a bucket and that's it, you know, it's all just in the bucket.
00:12:55.700 There's nothing going in and out of the bucket, except when you sort of adjust how you're training the data.
00:13:01.140 What happens if that data set is being updated in real time?
00:13:06.760 What happens if that data set comprises these egregoric intelligences,
00:13:10.360 which are fundamentally, you know, being drawn from humans on Twitter?
00:13:15.420 You know, an AI of that kind is potentially alive, at least to a far more significant degree than any glorified autocorrect is going to be.
00:13:28.660 And I don't know. You know, I don't know how well that's going to work, because I'm not a technologist.
00:13:33.480 But I think it has the potential to be, you know, figuratively speaking, if you like, the divine spark that's hitherto always seemed like it was obviously missing from every glorified autocorrect.
00:13:46.600 That, as it turns out, the divine spark which makes the difference between actual artificial intelligence and a fancy autocorrect turns out to be, collectively, us.
00:13:58.120 You know, I'm thinking, whoa, you know, what happens if that — and then I thought, holy shit, because this answers another question, which I've been turning over in my mind for a little while,
00:14:12.940 which is, what do you do when 60 percent of your young people want an authoritarian government, but they're also pathologically averse to the idea of being governed by any single human individual?
00:14:22.500 And then I thought, oh, crap.
00:14:29.500 A very, very interesting way to solve that problem. Yeah, I like the notion of — or I shouldn't say I like the notion of, but I find the notion interesting — that Twitter is selecting for human traits that collectively aren't human.
00:14:46.820 Yeah, yeah, yeah, yeah.
00:15:16.800 But a very cruel kind of facsimile of human nature.
00:15:21.800 And if that's the thing that your artificial intelligence is then using to inform what it's going to be driving, that becomes very scary.
00:15:31.660 I want to get to that aspect you were talking about, about how this connects to kind of authoritarianism and possibly what a Caesarism driven by this would look like.
00:15:42.860 But before we do that, guys, let me tell you about The Blind.
00:15:46.180 For years, Hollywood's been lacking when it comes to stories of redemption.
00:15:49.480 Movies and TV shows have trended towards the antihero, a flawed person who makes no effort to change and just becomes worse and worse as the story goes on.
00:15:56.580 Well, here's some great news.
00:15:57.980 The Blind, the true story of the Robertson family is now available for purchase on Blaze TV.
00:16:02.580 Maybe you've made a mess of your life.
00:16:04.080 Maybe someone you love is in a dark place.
00:16:06.140 Maybe all of the above.
00:16:07.320 If you or someone you know feels beyond redemption, you need to watch this movie and you'll see there's always hope.
00:16:13.080 The Blind takes you on an incredible journey through the life of Phil Robertson, giving you an intimate look into the man behind the legend and the trials, triumphs, and values that shaped him through the years.
00:16:22.740 While The Blind wasn't a Blaze Media production, since Phil is such a big part of our Blaze TV family, we wanted to make sure that you had the opportunity to stream it here.
00:16:30.640 Because it isn't ours, we can't include it as part of the subscription.
00:16:33.940 But if you'd rather purchase it and stream it here rather than Apple or Amazon, we wanted to make sure that you had the opportunity to do that.
00:16:40.180 Make sure to act now.
00:16:41.400 Don't miss this opportunity to own The Blind, a Phil Robertson story on Blaze TV.
00:16:45.880 You can buy it today at blazetv.com, The Blind, for $19.99.
00:16:50.700 That's blazetv.com slash The Blind.
00:16:53.660 All right, Mary, so you're talking about this tension that exists right now.
00:16:59.540 We have a lot of people, some of us who might host a show included, who find democracy to be wearing thin, to not have the persuasive power it used to — people feeling a little burnt out, on the idea that it's not going to be the thing that moves us into kind of the next epoch.
00:17:18.140 But at the same time, you also have a younger generation, especially, I think everybody really, but particularly younger generation that is very anti-authoritarian, that does not like the idea of any individual human having power over them.
00:17:32.040 And so we have this dichotomy between these two forces where you have this movement of young people who are not so big on democracy, don't value it that highly, might feel like they could move towards a different form of government.
00:17:44.260 But at the same time, while that would open up the opportunity possibly for authoritarian rule in some way to enter into the frame, you also have this drive for high degrees of individual liberty, or at least the perception of high degrees of individual liberty, people bucking under the idea that they would ever have some kind of natural hierarchy over them.
00:18:05.260 And you kind of pointed out how this might end up getting resolved in perhaps the worst possible way by an egregoric Caesar.
00:18:14.620 Right. I mean, credit where credit's due, I've been thinking about the idea of an AI Caesar ever since
00:18:24.140 it was dropped as a casual kind of, you know, "this could happen" in passing.
00:18:30.920 I think it was Zero H.P. Lovecraft talking to Alex Kaschuta ages back.
00:18:35.460 And then they sort of carried on and they talked about something that was completely different.
00:18:39.280 And it's just been stuck in my mind — the idea of enthroning the AI as a solution to the perennial problem of what you enthrone when you have a system that doesn't work unless you enthrone something.
00:18:54.960 I mean, the British, you know, we've had constitutional monarchy for some time.
00:18:59.800 It's a fairly good compromise, but a lot of people just don't seem very comfortable with that.
00:19:05.100 And so it's a problem.
00:19:08.560 So let's take the authoritarian young people first, and then we'll take the anti-authoritarian young people second.
00:19:21.900 So, I mean, I think last time we spoke on this show, we were talking about the totalitarian safetyism of daycare, which is something I've written about and reflected on a great deal, just in terms of my experience and observation around little kids.
00:19:38.740 And again, the original observation comes from someone who's now an old friend, who observed to me some years ago when we were in London, watching a little crocodile of kids, you know, from some local daycare, being ushered down the street somewhere on the river Thames.
00:19:54.000 And my friend said to me in passing that there was something incredibly totalitarian about that scenario.
00:20:00.780 And it just stuck in my mind.
00:20:03.060 And I've come to think there's something true about that.
00:20:06.540 And I've come further to think that — the more children we raise in those kinds of all-encompassing totalitarian nurture scenarios, you know, typified by the daycare... That all-enveloping kind of cyborg pseudo-mummy
00:20:24.720 scenario you have in daycare is delivered, to be clear, with the best of intentions and by kind, nice, caring workers most of the time.
00:20:33.620 But its perverse incentives are such that you don't let kids take risks.
00:20:38.540 They are micromanaged in terms of how they resolve interpersonal conflicts.
00:20:42.100 They are prevented from ever injuring themselves.
00:20:44.860 There is a whole bureaucracy that surrounds minor injuries.
00:20:48.120 The whole thing is very, very unnatural relative to just being around your siblings and your mum or, you know, playing outside or whatever.
00:20:56.840 And the older I've got, the more I've come to think that actually the crisis of civil discourse and the crisis of individual agency, which is what ultimately underpins the sort of longing for authoritarianism, is rooted in that experience of being over-managed,
00:21:15.000 over-scheduled, you know, pretty much all the way from infancy — sometimes from two weeks old, in some cases.
00:21:21.640 And this explains in turn why the longing for authoritarianism is now palpable across young people on both the left and the right.
00:21:29.160 You know, this isn't just sort of your fascistic blue-hairs who want to, you know, burn women like me at the stake for knowing that biological sex exists.
00:21:40.080 You know, this is also the kids on the right who want an actual Caesar.
00:21:45.480 It's kind of everybody.
00:21:47.380 And I mean, I don't have data on it, but it ought to be empirically researchable whether or not there's any kind of a link between — actually, no, there is research on this.
00:21:55.880 There is a strong correlation in the research that people have done between helicopter parenting and a desire for authoritarian governance, which is kind of a no-brainer when you think about it.
00:22:05.880 Like, duh, you know, you were raised under a stifling totalitarian system.
00:22:11.400 Of course you want that; of course you think that was normal.
00:22:14.000 You know, it kind of makes sense.
00:22:15.740 But yeah, so that's the authoritarian streak in young people.
00:22:20.560 The anti-authoritarianism I also think is in evidence across both the left and the right.
00:22:24.700 I mean, I'm cautious about poking hornets' nests on the weird right.
00:22:31.960 But in my observation, like, there's a lot of guys there who are very keen on the idea of natural aristocracy.
00:22:39.020 But the idea of natural aristocracy doesn't seem to have — well, at least I'm not aware of — a sort of correspondingly well-developed theory of honourable hierarchy.
00:22:51.940 As in, you know, it's like, it would be great to be a Hellenistic lord, but nobody's really thinking very much about how you could actually have quite a nice life as a helot — or there isn't nearly as much thought about that.
00:23:06.600 So in a sense, everybody's still imagining themselves as the big I Am, which is — I mean, if you're imagining yourself as the authority, you're still not really considering what it would be like to take orders from an authority.
00:23:23.040 If you see what I mean. And on the other side, on the left, there's just the desire to dissolve everything and vanish back into the oceanic mother-swarm, which in a way is much more direct.
00:23:38.940 But, you know, I kind of feel like it's the same picture, in some respects, because in both cases there's a deep reluctance to consider the possibility that hierarchical structures might actually be benign and in your best interests, and that sometimes shutting up and taking orders is better.
00:23:59.340 That piece is somehow missing on both sides of the discourse, even if the interpretations of what's wrong with that, or what we should be doing instead, are very different.
00:24:09.740 So these are the two pieces, you know: the authoritarian yearning in young people particularly — or really in the culture as a whole, but it's most pronounced in young people — and then the anti-authoritarian yearning.
00:24:21.820 And yeah, meditating on all of that, I've come to wonder whether the solution that ends up being most palatable to a lot of people mightn't be — you know, if it's so intolerable to set one person over any other, whether you're arguing about who gets to be the natural aristocrat, or whether you're just, you know, violently allergic to the idea of hierarchy full stop, whatever the reasons are — the only solution that people might be able to settle on
00:24:51.820 could potentially be, you know, an AI monarch. And then, you know, if we now have a way to find a route towards that monarch actually having a divine spark, i.e. us, then — yeah. It's speculative, but I can see how we get there.
00:25:13.520 Yeah, the really interesting part of that is that basically it would be swapping the constitutional rule for this, right? The reason that this was so attractive to so many people, you know, in the Enlightenment, is the idea of self-governance, even though that's not exactly how that works out.
00:25:34.260 So the idea that you didn't have anybody above you, that there is this consensus, this committee, this popular sovereignty that's ruling over you, and that was replacing, you know, the aristocracy or the direct monarchy — that was very inviting for a lot of people, because nobody wants to be ruled.
00:25:51.300 However, I think people are getting to the idea that that in and of itself isn't going to continue to function, but they still don't want to replace it with a direct authority like you're talking about.
00:26:01.700 And so the way to kind of shift the program, but without actually putting any one human in charge, is to replace it with kind of this AI monarch.
00:26:11.400 And in that way, you do still have no human, right? You still get to kind of sell the idea of scientism and consensus as the thing that actually drives your social organization.
00:26:24.280 You know, it's always objective. You can still pretend to remove the political, in the Schmittian sense, from any of these conflicts, but you do put something that is objectively more authoritarian into that position.
00:26:38.740 And so you don't have to have anyone actually deal with the fact that there's going to be a decision-making apparatus that will be motivated.
00:26:48.160 They don't have to actually look at it and see it as one faction winning or being ruled over.
00:26:53.240 And instead they can still substitute — just like they do with constitutional governments —
00:26:59.580 the idea that they are governing in some way through the AI, rather than actually being ruled over by an individual with motivations that might not be their own.
00:27:09.740 Yeah. And I can honestly see — I mean, especially given how narrow the Overton window has palpably become, and how constrained the voting choices have palpably become, you know, to a degree that's really very observable to most people now —
00:27:29.520 I can honestly see how it might come to feel as though adding your voice to the egregore is a more true and direct form of democratic participation than casting a vote once every five years.
00:27:42.480 I mean, that to me feels totally plausible. Yeah, in a sense, it's kind of Tay on steroids, isn't it? But it has the nuclear codes.
00:27:55.600 Well, and it will have to be properly lobotomized. There's no way they're going to let the actual thing out into the wild.
00:28:04.300 No, no, no. But I mean, also in a sense, this is kind of already happening. I mean, not Tay on steroids, but the introduction of AI decision-making — AI proceduralism, if you like — into the soft tissue of everyday human governance.
00:28:25.600 That's already happening. I mean, it creeps in here and there. I believe it's sometimes used in processing court cases. I'd have to go rummaging for examples. I didn't want to blow up the piece by getting into all of that.
00:28:43.300 But yeah. In growing proportions of administration — for example, across deciding, you know, pricing or adjudicating insurance claims —
00:28:58.920 I believe there are now AI algorithms involved. And it seems plausible to me that an increasing proportion of, you know, the decisions which might seem trivial, but which, you know, are materially very important to individual people's lives,
00:29:15.720 and which can be just incredibly exasperating when you find yourself going around and around the bureaucratic circles, will come to be governed by those sorts of mechanisms — by the paperclip machine —
00:29:25.580 with who knows what safeguards, or lack of safeguards, for actually making decisions which don't have six fingers, as it were.
00:29:34.560 And, you know, I don't think there's any immediate prospect of ending up with an actual egregoric Caesar, but I can see an entity of that kind symbolically being enthroned, and then several, you know, hundreds of thousands of shittier sub-versions of it just propagating themselves throughout
00:30:04.560 every, you know, every blood vessel of the bureaucracy, such that
00:30:11.180 all your insurance decisions and healthcare decisions and so on all end up having six fingers.
00:30:19.720 And yeah, I think this is going to happen.
00:30:23.060 In fact, I already know this is happening in certain sectors, and people, I think, are going to feel — you know, again, you used the term governance by swarm.
00:30:33.700 And I think that's exactly right.
00:30:35.800 We are so enamored with this idea of inhuman governance.
00:30:39.540 You know, at first it was through, you know, legislatures or committees, and then it became bureaucracies and experts.
00:30:46.780 And I think the next step of taking these decisions out of human hands will be artificial intelligences that are employed in every one of these interactions that you're talking about.
00:30:56.800 Right, because, you know, we're already seeing — actually more from the right than from the left, but really from both sides —
00:31:02.560 we're increasingly seeing the objection to the epistocracy being that it's politically biased. Whether the accusation is that rule by experts is just a pork-barrel politics exercise —
00:31:19.580 you know, shoveling resources into your own pockets under the guise of ruling people reasonably and justly and rationally, according to The Science™ —
00:31:29.540 or whether it's the other lot claiming that, you know, the infrastructure has been captured by people who hate us and therefore we're going to burn it all down —
00:31:41.460 you know, rule by experts is reaching the end of the road in terms of political legitimacy.
00:31:48.660 What comes next?
00:31:50.640 It seems to me that, you know, we're going to end up with authoritarian governance of one sort or another, and it's either going to be a human Caesar
00:31:59.540 or it's going to be an egregoric Caesar.
00:32:02.220 Yeah, I'm giggling maniacally, but I'm not very happy about either of those prospects.
00:32:08.480 I'm just watching it happen, you know, like the guy going down on the bomb at the end of Dr. Strangelove.
00:32:14.340 It's that kind of giggle, I think.
00:32:17.720 Well, it does end up putting us in — I mean, if we really do hand over all of these micro-decisions to artificial intelligence in that way, it really does close that self-exciting feedback loop, right?
00:32:28.600 Like, we're kind of done having any limiter on digital acceleration, because humans have become completely unable to mediate their own needs.
00:32:39.960 They're no longer able to make any decisions.
00:32:42.320 Everything is reliant on AI, and while AI starts in the service of human need, it doesn't really need to stay there
00:32:51.340 once basically all human input has been left out of this.
00:32:55.300 And I think also, interestingly, you put something in there that really got my gears turning, which was the idea that you would basically, like, comment for the algorithm for democracy.
00:33:07.920 Because a number of thinkers, from Bertrand de Jouvenel to Hans-Hermann Hoppe, have pointed out that basically democracy, the idea of popular sovereignty, actually grew government significantly.
00:33:23.280 The explosion of government came under the idea that you would be doing it in the name of the people.
00:33:28.060 There's only so much you can do in the name of the aristocracy or as the monarch.
00:33:32.180 But when you're acting in the name of the people, you know, you can start the levée en masse.
00:33:36.640 You can, you know, mobilize the entire nation to war, and you can have them, you know, own the means of production, because you're doing it for the good of the people and whatnot.
00:33:45.980 And this seems like the next logical step in securing the total state, basically: the idea that you would feed into the algorithm, and the algorithm would be the most organic reflection of the will of the people.
00:34:00.560 Yes, exactly.
00:34:01.120 Because it's literally skimming the will of the people off the top of the internet.
00:34:05.480 And there's just no way you could deny this force, because it truly is channeling the zeitgeist in the most, like, open-vein way possible.
00:34:13.960 Which is clearly not actually true.
00:34:17.200 I mean, some people are considerably more online than others.
00:34:20.020 And you don't have to spend very long thinking about it to realize that it would not be remotely representative.
00:34:26.400 You know, there are a lot of old people who have no idea how any of this works.
00:34:32.200 A very nice older woman, an historian who I did a debate with recently, asked me at the dinner afterwards — she said, I send a lot of emails.
00:34:41.820 Does that mean I'm very online?
00:34:43.960 When does fast grocery delivery through Instacart matter most?
00:34:47.720 When your famous grainy mustard potato salad isn't so famous without the grainy mustard.
00:34:52.560 When the barbecue's lit, but there's nothing to grill.
00:34:55.220 When the in-laws decide that, actually, they will stay for dinner.
00:34:58.960 Instacart has all your groceries covered this summer.
00:35:01.560 So download the app and get delivery in as fast as 60 minutes.
00:35:05.160 Plus, enjoy $0 delivery fees on your first three orders.
00:35:08.860 Service fees, exclusions, and terms apply.
00:35:11.160 Instacart. Groceries that over-deliver.
00:35:13.960 No, no, no.
00:35:18.900 The fact that you asked that question tells me that you're — no, no.
00:35:23.700 That's not how it works.
00:35:24.480 So, of course, it's not going to be equal or representative or democratic in any meaningful egalitarian sense.
00:35:33.240 I mean, what it would be, who knows?
00:35:34.840 I mean, and yet, you know, I'm not completely without hope.
00:35:39.140 Because, I mean, yes, you know, if you follow that thought through and you think, well, if that works the way it's supposed to, then, yes, you know, we're heading to really frighteningly dystopian places.
00:35:48.900 However — I always end up saying this to you — in as much as I have any optimism, it's just that things aren't going to be that horrible, because they'll go wrong before they get that horrible.
00:36:01.000 Yeah, well, you crushed my hope of that here in this article, because, you know, that's always my hope: like, well, no, before we completely start sacrificing to the AI gods, civilization will simply stop scaling.
00:36:14.360 Like, there'll be a competency collapse, you know, the networks won't work.
00:36:18.620 Yeah, I think, honestly, I think the thing which will save us will be competency collapse.
00:36:22.480 Yeah.
00:36:22.980 Which is, yeah, not a very cheery prospect.
00:36:26.040 Saved by collapsing infrastructure and the decimation of complex systems.
00:36:32.680 Yes.
00:36:33.760 But yeah, to me it seems fairly clear that if we volunteer to become the sort of obese, floating, screen-consuming guzzlers out of WALL-E, then we're not going to be able to maintain the complex systems which feed the AI for more than a generation.
00:36:55.300 I mean, this is already happening to air traffic control. You know, I struggle to see how we'll maintain the kind of complex systems which the AI is intended to serve.
00:37:08.980 And I mean, you know, you can imagine some sort of dark science fiction future where all of the AI algorithms are still working perfectly, but none of the material realities to which they were supposed to correspond really exist anymore, because we just forgot how to maintain them all.
00:37:22.220 Or just went back to subsistence farming, while the AI does its thing sort of in perpetuity in the cloud.
00:37:29.300 Yeah, you know, we're sort of back into kind of bonkers science fiction again.
00:37:33.460 But yeah, I think we'll be saved by competence collapse, and people will just go back to doing things in slightly less complicated ways
00:37:38.800 and on a smaller scale, which might, you know, be a better version of actually existing post-liberalism than the actually existing post-liberalism we have,
00:37:50.460 even though it's arguably going to be a bumpy ride getting there.
00:37:54.860 Well, so I had hoped that that was going to be the case, but now that you've given me the vision of, like, this kind of rolling, blob-consuming AI Caesar here, I wonder if that is a way that they can stretch things much longer than I previously thought, because I thought, like, the political organization has to fail first.
00:38:16.280 But this seems like a way to extend that kind of impersonal political organization well beyond its current shelf life while maintaining at least some semblance of functionality.
00:38:27.380 I guess the real question is, can the AI — the capital, whatever you want to call it — can it escape territoriality entirely while just leaving the destitution behind?
00:38:39.240 Like, can you get this Elysium version, or cyberpunk version, of humanity, where, like, you just have the best of the best skimming off the top and living this amazing life, while everyone else is just maintained and governed by this kind of AI Caesar that kind of holds things in stasis?
00:38:58.300 I guess you'd say, like, no, the systems that would maintain it would fall apart.
00:39:02.980 But if they can farm, you know, the average chud enough, will they be able to hold it up for an extended period of time — I guess that's the question you'd be asking.
00:39:15.260 I mean, the other thing that might save us, along with competence collapse, is fertility collapse.
00:39:21.340 You know, if we're being collapse-pilled here.
00:39:27.180 I don't know — I just filed an article about pandas and panda diplomacy.
00:39:32.420 Because, I mean, you know, pandas are a principal vector for conversations between China and the US and have been for a long time.
00:39:40.300 But they're also the byword for "does not mate in captivity."
00:39:45.560 And it struck me, thinking about it, that there's an analogy there
00:39:51.340 between the sort of technologization of efforts to kind of, you know, perpetuate the panda, and the reasons why pandas are just not getting it.
00:40:03.320 Because they do fine in the wild.
00:40:04.780 You just have to leave them alone.
00:40:06.540 You just have to let them do it their way.
00:40:08.200 And then, you know, pandas make more pandas.
00:40:10.180 But, you know, under that sort of highly artificial living — the pod and the bugs — they just don't do it.
00:40:16.660 And the same is true for people.
00:40:18.100 You know, beyond a certain sort of level of pod and bugs, people just don't want to make more people.
00:40:25.520 And that's increasingly a problem.
00:40:28.700 And between the attrition of our ability — between the sort of growing, I think, collective loss of interest in how things work, which to me is now very obvious —
00:40:41.160 it's not just that we've forgotten or we've become more stupid or something like that.
00:40:46.160 Even from the elites down, people just do not care how things work.
00:40:50.280 You know, with a relatively small minority of exceptions.
00:40:54.440 People just don't care how things work.
00:40:56.100 They're not interested.
00:40:57.940 People just don't think like that.
00:41:00.060 They're much more interested in moral — if you like, spiritual — realities.
00:41:02.860 You know, we're becoming, in a weird way, a much more spiritual people, but the consequence of that is that nobody can be bothered to mend the potholes anymore, because they just don't care.
00:41:11.480 So we're going back to the sort of roads that we had before people started caring about potholes.
00:41:15.440 And at the same time, you know, while we're all sort of busy spiritualizing and becoming, you know, more interested in theology than we are in — well, yeah, it's not a theology I'm a fan of, but it's theology.
00:41:29.240 Well, we're becoming more interested in moral issues than material ones.
00:41:35.260 We're also being herded into kind of, you know, panda enclosures and then told to have sex and make more babies, because that's the only way the merry-go-round can keep turning.
00:41:45.840 And I sort of feel like, you know, whatever the slightly stickier version of voting with your feet is, people are doing it.
00:41:52.100 And I don't completely blame them — yeah, it's just not happening.
00:41:59.840 And the medium-term consequence of that will just be — I mean, who knows what it'll be?
00:42:06.100 You know, there are various kinds of nightmare scenario, but none of them look like the sort of steady state that could be governed by an AI.
00:42:12.640 Even a very nimble and sort of quasi-intelligent egregoric one powered by Twitter.
00:42:19.820 I just struggle to see a Twitter-powered egregore really knowing what the hell to do with, you know, the various kinds of collapse scenario that are
00:42:29.240 downstream of major population implosions.
00:42:31.600 Yeah, it's really interesting, because Oswald Spengler actually predicted this in The Decline of the West.
00:42:36.120 He said that we were going to walk away from science, and people are like, no, that's not a thing.
00:42:41.440 You can't do that.
00:42:41.980 He's like, no, I mean, just think about it.
00:42:43.500 People don't want to be constrained in this way.
00:42:46.420 Like, the human mind does not want to have these eventualities, these absolutes, even if they are true — it can't properly exist with them.
00:42:55.640 Like, the metaphysical part inside of it is simply unequipped to deal with certainty.
00:43:02.160 And so, you know,
00:43:03.560 you're just going to have your top minds, like, walk away and study something else.
00:43:06.980 And, like, you know, you won't lose the knowledge per se.
00:43:09.880 Like, it's not going to get burned down and you don't have a Library of Alexandria situation, but you're just going to have people who are incapable of maintaining it.
00:43:18.080 You're going to end up — you know, what's the quote on Twitter? — the arc of history is long, but it bends towards Warhammer 40K.
00:43:25.840 Like, you're just going to have a bunch of people who have these ancient technologies.
00:43:29.920 They're amazing, but, like, nobody knows how to maintain them.
00:43:32.980 Only a few people can actually deploy them.
00:43:34.840 And I've made this argument myself: like, you know, all of this technological infrastructure that we think has been projected across the world is maintained only because first-world nations pour just insane amounts of time and energy and intellect and money into maintaining it.
00:43:50.980 You're not going to keep that infrastructure grid up globally without that constant, amazing amount of surplus.
00:43:59.580 And that's going to fall apart.
00:44:00.960 Once that does, then you're going to have pockets of this technology.
00:44:03.700 It's still going to exist, but it's not going to perpetuate itself the way it does now.
00:44:07.920 Right.
00:44:08.060 And just bringing this back to, you know — are we going to end up with all of the, you know, radically terraformed chuds, was the word you used, who are just sort of kept docile and pacified by total AI management?
00:44:22.640 That seems implausible to me, for the simple reason that total AI management becomes less and less technically possible the closer you are to the periphery.
00:44:30.880 I mean, returning to the COVID example again, lockdowns just didn't happen in most African countries, because it's obviously a dumb idea when most people have a subsistence life.
00:44:43.420 You can't lock people down, because they'll just die.
00:44:46.840 Like, you know, you do not save people's lives by making them stay in their huts.
00:44:50.480 That doesn't work.
00:44:53.000 And if you extend that principle, you know, you might be able to terraform some of the chuds in some of the cities, you know, near the big data center, figuratively speaking.
00:45:03.640 But the further you get out into the Mad Max Badlands, the less workable that's going to be, because the systems start to break down — because you don't have enough people to maintain them, or because the power is not reliable enough, or because nobody's mended the roads, because they don't care about the roads.
00:45:19.400 So yeah, I don't know what kind of a future we're looking at.
00:45:24.600 But I think, you know, if it's a best-of-the-best, creepy techno-futurist elite, yeah, the rest is going to look more sort of Guillaume Faye than it is WALL-E, if you see what I mean.
00:45:37.220 Okay. So now that we're both doomer-pilled and we're really banking on the optimism of total systemic collapse —
00:45:45.300 I did want to ask you one more thing before we go. We kind of blew past it a little bit because you were getting to other topics, but you mentioned that a lot of people — the left is obviously anti-hierarchy.
00:45:56.720 Like, that's just what it does. But the right, you know, seems to have a problem searching for and understanding what would be a right order and a sustainable hierarchy that would kind of benefit the whole.
00:46:10.240 And I wanted to talk about that a second, because I think you're right that that is a problem. I think you have the struggle between, you know, say, trad Caths, and then you have kind of Nietzscheans who are just like, well, will-to-power it, and whoever, you know, wins it, wins it. And who cares if it's good for the rest.
00:46:25.380 But I think the reason that the right is searching for the proper understanding and orientation for hierarchy is because there are so few actual organic communities in which you could actually set this hierarchy of virtues.
00:46:39.680 You can't develop a kind of virtue ethic without a community in which you could actually practice those virtues. And therefore you can't really understand, I think, in that instance, a beneficial aristocracy, because there is no shared concept of the good for a community in which that aristocracy would rule.
00:47:00.440 Yeah, I think that's absolutely right. Um, you know, fatherlessness, arguably, I mean, I, I don't, I don't claim that everything comes down to parenting, but a lot comes actually comes down to parenting. Um, and if you raise, you know, if, if kids don't, especially boys, if, especially boys don't grow up with fathers, you know, all other things being equal, assuming, assuming you have a sort of averagely functional relationship with your dad, that is a template for what, uh, what benign authority looks like or should be.
00:47:30.440 Um, if you, um, if you, if you, if you grow up without that, I mean, we know, we know what that looks like. It looks like urban scholars. Um, and, and yeah, the, the, yeah, we, we know what that looks like and, and it's not, uh, it's not benign hierarchy and it's certainly not very organized or very pro-social. Um, again, if you raise, if you raise kids in daycare, that's a, that's a completely different model for, for how authority works or even, or even what authority is.
00:47:59.140 It's like, it's a weird fusion of authority and nurturing, you know, with a kind of, with therapeutic overtones. Um, none of which has any space at all for a conception of beneficial, of benign authority, the, the, any possibility that the, that any conceptual space for the idea that somebody could be set in authority over you and still, and actually care about your interests.
00:48:20.320 And so I don't think it's surprising that that idea has bled out of work. It's bled out of the culture. It's bled out of education.
00:48:30.320 It's bled out of pretty much every cultural institution you care to think of. That idea is just pretty much unthinkable and pretty much unsayable, which is very strange, because it feels like it's happened pretty much over my lifetime. I mean, when I went to school, we had desks in a row and information was imparted to us by our teachers. And now, you know, my daughter goes to school and she sits around a table and they talk to one another.
00:48:54.240 Let me tell you about Kagan teaching. Let me explain. I have sat through far too many trainings on exactly what you're describing. Sorry. Go ahead.
00:49:04.440 Oh, that's... oh God. Yeah, that's right. This is what you used to do.
00:49:07.640 Yes. I don't just know the phenomenon you're talking about. I have been trained in its deployment.
00:49:15.380 Yeah. And to me, it's one of the most evil and malign things that's happened to the culture, one of the most malign interventions in the culture over the last century.
00:49:25.420 Because, I mean, I can't think of a more insidious and totalizing way of conveying the idea that truth is arrived at by consensus.
00:49:34.640 That it has no external validity and can't ever just be delivered by an authority.
00:49:40.800 And not to mention the fact that it's just kryptonite for nerds and bookish people and people who just like learning stuff.
00:49:48.140 I mean, the whole thing is just evil on so many levels, putting kids around tables in the classroom.
00:49:53.180 Just how is anybody expected to learn? I mean, this isn't even something I went through, but I'm traumatized on behalf of every kid who's been subjected to that.
00:50:00.900 And has actually managed to learn something just in spite of it.
00:50:03.760 But yeah, well, so then it percolates out into the whole culture, and no wonder we find it so difficult to imagine that somebody could just say, you know, shut up and do as you're told.
00:50:17.400 And the possibility that somebody could say that and actually be right... or at least, I mean, it's easy to imagine people doing that, but that possibility is almost unthinkable.
00:50:25.540 It's become a sort of moral... I feel like I've just committed blasphemy by voicing the possibility that somebody could say, shut up and do as you're told, and actually be right.
00:50:35.660 And not only be right, but do that in a way which is in your best interests.
00:50:40.560 Um, you know, this is where we are.
00:50:42.720 I mean, this is, it's the hard problem.
00:50:45.020 One of the hard problems in political thinking and writing at the moment, I think, is finding a way back to being able to occupy that space.
00:50:55.780 And to reclaim that cultural and political territory, and to do so in a way which isn't just, you know, we're all going to have a big Nietzschean dogfight.
00:51:06.780 And then whoever wins just gets to be the boss by definition.
00:51:10.200 Because there's somehow no space in there underneath for how people organize themselves constructively.
00:51:17.940 You know, and what that is, what that looks like, it feels to me like... maybe we just have to go through this in order to find our way back.
00:51:27.140 You know, maybe we actually just have to go through, like, several generations of mafia.
00:51:32.540 And I mean, I guess, you know, the British monarchy did that for a hundred years, give or take. Maybe sometimes that just has to happen for a bit, until people finally come up with a settlement that they can live with.
00:51:46.320 I don't know.
00:51:47.280 I mean, I'm blackpilled again, man.
00:51:50.040 Sorry.
00:51:51.400 I think that's right.
00:51:52.840 I think that society does cycle from the bureaucratic back to the feudal.
00:51:57.160 And, you know, there's a great passage in de Jouvenel where he says, basically, of course every king begins as a bandit, but eventually he figures out that caring for the people he's fleecing is actually a better long-term investment than fleecing them completely.
00:52:15.740 And that's actually the moment where the idea of the rex is born, you know, the king is born out of this understanding that the good of the people over time is better than simply the Nietzschean kind of overpowering of them.
00:52:30.540 But I think that cycle only turns once that collapse of complexity we've been talking about occurs, and we go back through that cycle where people rebuild into tighter communities that understand the good, and the need to work for the good of the collective.
00:52:52.640 But without the requirement of this underpinning of technology that abstracts it so much that there's no way they can actually determine the virtue of the hierarchy in which they live.
00:53:03.580 So you think we have to go from the tropical longhouse back via the Anglo-Saxon longhouse?
00:53:08.860 The only way out is through the longhouse?
00:53:11.780 I don't... yes.
00:53:12.720 But no, it's the other longhouse.
00:53:16.240 The other longhouse, where your lord indulges in displays of boasting about his military prowess and hands out gold in exchange for killing your enemies.
00:53:27.380 Exactly.
00:53:27.640 And that is the other longhouse.
00:53:29.640 I mean, I don't know, maybe the only way out is through the other longhouse.
00:53:34.540 All right.
00:53:35.280 Well, now that, oh, sorry.
00:53:37.300 Go ahead.
00:53:37.920 No, no.
00:53:38.580 There's a cheery note, maybe.
00:53:40.620 Yes.
00:53:41.000 Yes.
00:53:41.480 Now that we've ended that discussion on an extremely hopeful note, Mary, is there anything people should be looking forward to from you, any place they should be looking to find your work?
00:53:54.060 I've started work on a new book, which I'm very excited about.
00:53:57.700 I mean, at the moment it's an equally extremely online set of challenges.
00:54:03.580 I don't know if you're up to headings, but I have so much to tell you about that, because I think you'll be interested.
00:54:08.480 It's about the end of print culture. The working title is The New Reformation, which we're in, baby.
00:54:18.060 It's happening.
00:54:19.020 It's going to be messy.
00:54:23.540 Last time, there were several decades, several centuries, of war.
00:54:23.540 Um, it's happening, baby.
00:54:25.440 So that's what I'm working on.
00:54:26.800 I mean, you can find me at reactionaryfeminist, which is my Substack.
00:54:29.520 I tweet at @moveincircles.
00:54:31.500 I write weekly for UnHerd.
00:54:33.420 U-N-H-E-R-D.
00:54:35.280 Um, that's me.
00:54:36.460 I show up some other places too.
00:54:38.220 This has been fun.
00:54:39.400 Thank you.
00:54:40.080 No, absolutely.
00:54:41.080 All right, guys, make sure that you're checking out Mary's work.
00:54:43.060 And of course, if it's your first time on the channel, please make sure that you go ahead and subscribe.
00:54:47.400 And of course, if you want to hear these broadcasts as podcasts, you can go ahead and subscribe to The Auron MacIntyre Show on your favorite podcast platform.
00:54:54.360 Thank you, Mary.
00:54:55.100 And thank you everybody for coming by as always.
00:54:57.320 We'll talk to you next time.