The Auron MacIntyre Show - March 07, 2025


Was Nick Land the Real Winner of the Trump Election? | Guest: Patrick Casey | 3/7/25


Episode Stats

Length: 43 minutes
Words per Minute: 191.1
Word Count: 8,249
Sentence Count: 382
Misogynist Sentences: 6
Hate Speech Sentences: 17


Summary

Patrick Casey is a columnist and host of the Restoring Order podcast. He joins me to talk about the rise of the tech elite under President Trump, and the role of a neo-reactionary philosopher who has a lot to do with it.


Transcript

00:00:30.100 Hey everybody, how's it going?
00:00:31.780 Thanks for joining me this afternoon.
00:00:33.360 I've got a great stream with a great guest that I think you're really going to enjoy.
00:00:37.980 Curtis Yarvin has burst onto the scene.
00:00:40.800 He's in the New York Times.
00:00:42.260 He's getting all these big, splashy articles about how he's the architect of the new regime.
00:00:47.600 He's the thought leader.
00:00:48.840 He's the court philosopher of the Trump presidency.
00:00:52.480 But there's another neo-reactionary philosopher who had a big impact on the movement.
00:00:58.660 And his ideas might actually be more prominent than many people realize.
00:01:03.260 In fact, he might have won the election in the way that Trump didn't.
00:01:08.380 Joining me today to discuss that is Patrick Casey.
00:01:11.440 He is a columnist and the host of the Restoring Order podcast.
00:01:15.260 Thanks for joining me, man.
00:01:16.740 It's great to be back, Auron.
00:01:17.680 Thank you.
00:01:18.060 Absolutely.
00:01:19.540 So you have recently, I think, been falling further down the Nick Land rabbit hole.
00:01:25.840 It is a weird, esoteric journey, but it is one that I think ultimately is very valuable.
00:01:33.000 What sparked your interest?
00:01:34.600 Why are you suddenly focusing on Nick Land's work?
00:01:37.060 Sure.
00:01:38.040 So I've been interested in Nick Land for years, but it's an on-again, off-again thing.
00:01:42.340 Sometimes, I don't know, the inspiration strikes.
00:01:46.400 Maybe it's AI in the future compelling me to become interested in this.
00:01:50.020 You know, Nick Land has many ideas of that sort.
00:01:53.260 I don't actually believe that for the record.
00:01:55.160 But, you know, I have been thinking about the rise of the tech right.
00:01:59.400 And Nick Land offers a lot of insights into that I think no one else really offers with regard to what's happening here.
00:02:10.800 So I'm sure we'll be getting into that and other stuff.
00:02:14.340 Yeah.
00:02:14.560 I mean, obviously, Curtis Yarvin talks about the CEO monarch, right?
00:02:20.160 And a lot of people could look at someone like Trump, who is himself, of course, a very successful businessman, or the support, obviously, of someone like Elon Musk, one of the richest men, or the richest man in the world, depending on the day.
00:02:32.520 You know, this is obviously a scenario that Yarvin wrote about.
00:02:37.500 However, he doesn't spend a lot of time on technology.
00:02:40.600 He doesn't focus on that aspect of what is happening.
00:02:44.760 And the fact that Trump has really promoted guys like Sacks, guys like Andreessen, guys like Musk to very important spots, you know, these are guys who are assisting him and, in many cases, directly advising him. Obviously, he's got other guys like Bezos and Zuckerberg lining up behind him.
00:03:05.340 There's obviously something about his presidency that really incentivizes the tech industry to be behind him.
00:03:13.680 We see the big push for artificial intelligence as part of Trump's presidency.
00:03:19.800 What aspects of that do you feel play into Land's philosophy?
00:03:25.080 Sure.
00:03:25.820 So I think, first, we've got to be very upfront about the fact that the tech right, this phenomenon of all of these tech guys, a number of the ones you've mentioned, some you didn't, joining up with Trump, if you go back to 2016, that would have been unthinkable.
00:03:40.520 So this is an incredibly, I would say, historic, certainly significant phenomenon.
00:03:46.940 It's a big counterfactual, but I don't know, Auron, if Trump would have won the 2024 election if he didn't have Elon's money, if Elon Musk buying Twitter hadn't, you know, freed up some of the dissemination of information, and also the money and support from other people in the tech world.
00:04:03.060 So it's not the only factor, but one of the main factors, and it's a huge shift.
00:04:09.920 So what does Nick Land have to offer there?
00:04:11.820 Well, there are a few ways you can look at these tech elites moving rightward.
00:04:15.740 You can look at it on an individual level.
00:04:18.060 You can say, OK, these guys, you know, just individually kind of thought, well, I'd be better off with Trump.
00:04:23.340 You know, I'm getting harassed by the government.
00:04:25.020 I'd have lower taxes.
00:04:26.560 Yeah, and that's absolutely the case.
00:04:27.780 I mean, these are individuals making decisions.
00:04:29.860 OK, well, you could zoom out a little bit and see things from somewhat of a bigger picture perspective, maybe a Burnhamite, James Burnham perspective, and say, OK, this is a class kind of almost like a class struggle.
00:04:41.300 I know it sounds a little bit Marxist, but, you know, the billionaire class versus the managerial or perhaps more accurately, the bureaucrat class.
00:04:49.400 You know, one thing Burnham got right was over the course of the 20th century, there would be more and more government oversight over business.
00:04:57.000 One might even refer to that as the total state.
00:04:59.560 I hear there's a great book on the subject, of course, referring to yours.
00:05:04.400 Right.
00:05:04.560 And that's that's a valid perspective as well, because if you look at Elon Musk, just as one example, the government was launching all of these investigations into him.
00:05:11.740 He got hit with civil rights lawsuits.
00:05:13.420 You know, some guy at Tesla claimed that he was the victim of racism, not like from Elon Musk directly, but you have this whole thing called hostile work environment.
00:05:21.020 OK, next thing you know, Elon Musk's company is shelling out hundreds of millions or whatever it was.
00:05:25.440 So and, you know, Zuckerberg being dragged in, getting harassed from both the right and the left over, you know, censoring too much, too little.
00:05:34.080 I think you did have a lot of tech people that just kind of decided, like, as a class, like, OK, our interests are not really aligned with this managerial regime.
00:05:43.600 There's too much oversight.
00:05:44.860 We don't have the freedom to do what we want.
00:05:47.320 So that's the kind of the class way of looking at it.
00:05:50.080 And I think that's very valid.
00:05:52.000 Now, there's the Landian perspective, which is what I've been thinking about lately.
00:05:55.780 And I haven't seen a lot of other people discuss it.
00:05:57.760 With someone as familiar with his work as you are, I'm sure I'm going to be preaching to the choir.
00:06:02.880 But I think the audience will hopefully get a lot out of this.
00:06:06.000 So what Nick Land's big thing is when he talks about accelerationism, he's talking about a process that was kind of almost like a monster that was unleashed during the Renaissance.
00:06:16.080 A number of things happened during the Renaissance with the rise of capitalism somewhere around that time.
00:06:23.420 You know, you had double entry bookkeeping, the joint stock exchange, some of these sorts of things.
00:06:28.760 And it was almost like all of these guardrails on this process were removed.
00:06:34.020 What is that process?
00:06:35.180 Well, it's the acceleration of techno-capitalism.
00:06:38.580 Sometimes he just refers to it as capitalism.
00:06:41.220 He also refers to it in other ways.
00:06:43.140 But that's a good way of looking at it: there's this positive feedback loop.
00:06:46.600 The more capitalism you do, the more technological innovation you get; the more technological innovation you get, the more capitalism.
00:06:54.580 And we're at the beginning of an exponential or logistic curve.
00:07:00.520 We're still relatively early. Prior to that, some would say technological progression was linear.
00:07:08.460 It was just kind of going straight up.
00:07:10.020 I mean, that's not entirely the case.
00:07:11.380 So the point is, there is this process of technological and capitalistic growth that was kind of unleashed during the Renaissance.
00:07:19.480 And it's really a battle between deregulation and the state, where a lot of what the state does is try to inhibit or redirect that process of growth.
00:07:31.920 And I think, from a Landian perspective, and I'll leave off here for now,
00:07:36.580 there's no way you can look at the 2024 election and not conclude that this was a huge victory for that process.
00:07:44.540 Right. This accelerationist process that is totally restructuring society, that is taking us, in Land's view, closer to the obsolescence of human beings.
00:07:54.720 Right. He has that famous quote that I'm probably going to butcher.
00:07:57.640 But, you know, nothing human makes it out on the other side, something to that effect.
00:08:01.020 The idea that eventually we will reach the singularity, man will be phased out.
00:08:05.360 It will just be AI and robots or something of the sort, which is a future I find horrific, just to be clear.
00:08:10.060 Nick Land might think that's cool.
00:08:11.740 I'm interested in Land from, like, a descriptive, how-can-we-understand-this-process kind of perspective.
00:08:16.860 I don't really share the same values as him beyond that, though.
00:08:50.820 Yeah, and Land has softened that language in recent interviews, so it's difficult to know how much he still holds to some of that.
00:09:01.500 But as you point out, the descriptive aspect of his work is really the most interesting.
00:09:06.800 I think it's really important there that you layered in the comparison between Burnham's analysis and Land's,
00:09:13.540 because I think they actually intersected in a very interesting way in this process.
00:09:18.140 So I had this thought, and I shared it with Land, and the nice thing about the modern era is your favorite philosopher will just pop in on Twitter and comment on things that you're talking about.
00:09:29.820 But, you know, I floated this idea to him that the managerial elite were kind of the last piece of human safety software on technology, that they were kind of this final barrier, this last-ditch effort to try to recapture the capital process.
00:09:48.860 And when you talk about that tension between the venture capitalist class, the tech right, and the managerial class, what we're seeing in a way is the fact that the managerial class, while they're very inhuman in a lot of what they've done,
00:10:05.080 they have become so in an effort to capture what is going on, to re-territorialize it, to bring it back into the interests of specific people, specific groups, specific, you know, material interests, these kinds of things.
00:10:19.160 And the fact that these two things are at war, and, you know, Burnham talked about this, and Francis talked about this later on, Sam Francis, that these two classes were at war with each other, that the capitalist and the managerial, even though we, you know, and especially the left, thinks of these as like the same thing, right?
00:10:38.920 Oh, it's just all the rich people, it's all the capitalism. But there's actually a real tension between the managerial part of the apparatus, and those that are founding these organizations, making the money, the CEOs, these are very different interests, which creates a lot of the dynamic we see today.
00:10:56.780 But another step in that is that if guys like Elon, guys like Andreessen, many of these other people want to free themselves from many of the restrictions that the managerial class places on them, the kind of things that are very human, even if they are a warped version of humanity, like wokeness, they need to find ways around the existence of that current interest group.
00:11:20.880 And I think this is one of the reasons that AI is such a priority for these guys, I think they want to replace the managerial class with AI, allowing them to scale and make these big, interesting advances in technology without having to answer to a whole bureaucratic morass that enables all the things that allow them to launch rockets.
00:11:45.800 You need a lot of civilization between nothing and rockets. And the civilization is great, but it also hinders a lot of what Elon wants to do. So wouldn't it be great if I could just automate all that civilization, so I can get back to launching rockets?
00:12:01.020 I think that is actually a big part of what is happening here. And so the fact that land in many ways kind of predicted this movement, this desire of capital to free itself from these very particular human interests, and we're kind of seeing that in a way play out in the way that the tech right is pushing forward, I think is pretty fascinating.
00:12:23.600 Yeah, it's absolutely fascinating. I couldn't agree more. So one thing that I wanted to touch on here, and it's relevant to what you were saying: you mentioned Marc Andreessen.
00:12:33.860 So he was on Joe Rogan, and it was a pretty interesting interview. One of the things he said was, look, there used to be kind of this arrangement between Silicon Valley tech elites and the liberal regime: they get their sandbox, they get to play there, they get to do their thing, they get to innovate, whatever, as long as they, I forget exactly what he said, donate to charities, as long as they do kind of the bare minimum that the liberal regime expected.
00:12:59.680 But over time, the expectations and requirements of the liberal regime grew, both in number and intensity, to the point where tech elites don't really feel like, okay, we just have to play nice with society and we get our sandbox. You don't get your sandbox anymore, right?
00:13:20.020 The total state has infiltrated the sandbox. And so that's a big part of the reason why they obviously chose to do this. Now, again, that's looking at it from the individual perspective.
00:13:29.680 The Landian perspective is there's been this process that's been happening for hundreds of years, the acceleration, this positive feedback loop, technology, capitalism, working together.
00:13:40.160 And any attempt to stifle or restrict that is going to be a battle. And in that sense, sure, if you're trying to restrict it, you are going to run up against human beings whose interests are aligned with that force.
00:13:58.620 But you are fighting a force, right? A process is perhaps a better way of looking at it, one that's been around since the Renaissance, at the very least.
00:14:08.100 So that's how I view that sandbox quote: okay, you had this process, it was allowed to continue,
00:14:13.680 but it just got to the point where there were too many restrictions on it. Yeah.
00:14:17.740 One thing you were saying that was really important here: Nick Land has these futuristic pieces, what was it?
00:14:28.140 Cyber revolution was one of the chapters in Fanged Noumena, one of the essays. And it's the one in which he puts it forward like a dialogue.
00:14:38.460 They're talking about a future in which these K-revolutionaries, like cyber-revolutionaries, people who consciously see themselves as aligned with this process of acceleration,
00:14:51.020 are straight up going to war with the government, which represents, it's like the pro-tech side versus the pro-human side.
00:15:00.000 And, you know, I don't know if things are exactly going to play out that way, I certainly hope not, but the pro-human side, that's what Nick Land says leftism basically is.
00:15:09.380 And you mentioned that even though these people are pro-human, they're pro-human in like a demented way, right? Not in the way that we would be as actual right-wingers.
00:15:17.180 But that's worth considering, because a big part of what this managerial elite stands for, at least claims to stand for, some people promoting this are doing it cynically, is, right,
00:15:30.420 obviously, that there are differences, that there's inequality, the idea that capitalism, and they're correct, capitalism creates a lot of inequality.
00:15:37.000 I would also argue that it raises the standard of living overall. So you're not going to have a totally equal society, but it's better to be poor in America, in most cases, than to be middle class
00:15:50.760 in some total third-world country, right? Middle-class people from those countries would love to live at the bottom strata of our society.
00:15:59.580 So the point is that that is a very pro-human perspective, but just kind of an egalitarian one. They're saying, oh, capitalism and technology and these rich billionaires, they're not doing enough about inequality.
00:16:13.480 Well, we saw how things played out. This has been a huge win for the process of acceleration and for the billionaire class.
00:16:23.420 And in the short term, it's been a win for us. I guess I'll leave it there for now, but it is worth considering some of the long-term ramifications of this new alliance.
00:16:35.940 Well, and speaking of, of course, we both know that we have had a little bit of a knockdown and drag out with some of the tech right here in the last few months.
00:16:45.500 Many people who are more nationalistic, people who are more traditional, religious, these are forces that, well, as you point out, the tech right is incredibly important.
00:16:59.020 I very much agree with you that Donald Trump probably would not have been elected if Elon had not bought Twitter, just that act alone. Everything else, you know, the money, all that stuff, is also helpful.
00:17:10.120 But just the fact that all of the news could get to the people unfiltered, uh, was such an amazing change in the dynamic, the American electoral dynamic that I think that kind of completely, uh, changed the game.
00:17:25.000 It completely shook up the board and allowed Trump to get that second victory.
00:17:29.500 So obviously these are people who are key to this coalition.
00:17:32.880 Um, we're, we're not blind to that fact.
00:17:35.480 And I'm very happy that Elon ultimately has made those moves.
00:17:39.020 Even if I disagree with him, disagreements we have, and many of those disagreements are centered around the fact that the tech right doesn't seem very pro-human in the sense of worrying about, say, protecting a nation, protecting the actual people of the nation, seeing the United States not as a sports team, not as a business, not as an
00:18:02.700 economic zone, but as a real place where people live, where they raise their children, where they create generations of tradition and habit and custom that bind them together and make a real people.
00:18:16.680 They don't see the world that way.
00:18:19.940 And it has to be when you're, when you're operating at this level, but for them, the world is just a set of pieces to rearrange, right?
00:18:26.760 It's how, how do I move the world around so I can launch the next rocket?
00:18:30.840 How do I move the world around so I can make the next advancement, you know?
00:18:34.400 And that's how you have to think when you're doing great things, but it also means that you say things out loud like we need infinite H1B visas because Americans are too stupid to program computers.
00:18:46.740 You kind of vomit rhetoric like that.
00:18:50.100 And so we have this kind of standoff between these forces, which you're right to point out, even though both of us recognize the importance of Land's work and, you know, the predictive power that we're kind of watching unfold.
00:19:03.720 It also points us to a possible conflict of interest in a future that ultimately we aren't moving towards.
00:19:11.100 Again, that's kind of the value of real philosophy.
00:19:13.240 You can read Plato without wanting the state to take children from people at seven years old and, you know, train them, brainwash them.
00:19:21.260 You can read Nick Land without ultimately being like, well, I want to replace every part of my body with, you know, cyborg whatever.
00:19:27.660 Uh, and so, uh, you know, his, his, uh, philosophy is very useful, but it does point us to a, a conflict inside the MAGA coalition that is probably only going to continue to grow.
00:19:40.020 Yeah, you're absolutely right.
00:19:41.440 The H1B debate revealed some of the fault lines in this new coalition.
00:19:47.920 Now I will say, I don't know if you actually get to choose your elites.
00:19:51.120 I think that just historic providence or fortuna, you know, historical circumstance, kind of gives you what you have.
00:19:58.760 Right.
00:19:59.220 You know, you don't get to choose. I do really like Elon.
00:20:03.000 I do have some disagreements.
00:20:04.160 The H1B thing is an example. But if you and I decided, well, we don't want Elon anymore,
00:20:07.500 we want some other billionaire,
00:20:09.260 even if, like, 10,000 of us decided that, well, that's just kind of not going to happen.
00:20:17.780 Not that I want that, for the record, but you just kind of play the hand you're dealt, right?
00:20:23.120 Uh, so, yeah, we're gonna have these disagreements, and I think it's important to keep that in mind.
00:20:28.560 We should be very, we should be very thankful, um, but we shouldn't, uh, you know, we shouldn't shy away from disagreeing when necessary.
00:20:34.380 I think, like, attacking and insulting these people is not the right way to go about it.
00:20:39.600 And, in fact, I think that makes having these discussions harder.
00:20:43.740 If you're the richest man in the world and you're getting insulted, you know, just nastily on Twitter,
00:20:48.020 that's probably going to make you double down.
00:20:52.080 I think that's a lot of what happened in that debate: people just doubled down on their takes
00:20:56.380 because they were getting insulted, and they were like, well, screw you.
00:20:58.960 So, now, you said that, you know, Elon and a lot of these tech guys just
00:21:03.140 want to launch rockets, and whatever gets in the way of that, basically, is the obstacle.
00:21:08.120 Okay, well, let's go back to this process of acceleration that we've been talking about:
00:21:13.960 any guardrails, and Nick Land has some other great metaphors here
00:21:20.100 that I'm drawing a blank on, any obstacles, anything that impedes that process,
00:21:26.500 is kind of the enemy. Now, there are some things that we could point to
00:21:31.400 that might be good for that process but that I, being generally pro-capitalism, still wouldn't want to happen.
00:21:36.780 So, while I think that we are along for the ride, I don't think we can stop this process.
00:21:42.220 I do think we are given choices of, like, different versions of, you know, we're going to have a future
00:21:48.000 that's more technologically advanced, but I don't think there's only one more technologically advanced
00:21:52.580 future. I think there are ones where America is stronger, is healthier, um, and there, there are
00:21:57.880 certainly more technologically advanced versions of the future where, yeah, America's weaker, uh, you do have
00:22:02.720 more mass migration. So, and that process, right, it de-territorializes. I know you've covered this
00:22:08.760 aspect of Land's work. He borrows it from two other philosophers, Deleuze and Guattari.
00:22:14.400 The idea that as this process of capitalism, of technological advancement goes on, it breaks down
00:22:20.460 social boundaries, uh, social orders, entire social orders. Just look at what happened to the,
00:22:26.180 the Ancien Régime. And that's kind of what we're concerned about. We don't
00:22:31.100 want America to be destroyed in the process demographically, culturally, spiritually. Um,
00:22:36.920 and I think, you know, again, if you're going to have this technological process, uh, uh, you know,
00:22:42.400 growth and, and progress, uh, you're going to get some of that. But like I said, I think we can kind
00:22:47.060 of steer things a little bit, but we can't stop the process. And that's, that's, uh, something we have
00:22:51.200 to largely accept. Yeah, I think that is pretty critical. Again, you know, one of the most famous
00:22:57.800 Land quotes is the only way out is through, right? And ultimately we can't go back, we aren't
00:23:04.320 RETVRN-ing, with a V. There is going to be technological advancement. We will have to
00:23:10.200 interact with it at some level. And so the question is how to do that in the interests of the United
00:23:15.900 States, for the good of the people. Now, again, that's something that Land might not like. He doesn't
00:23:20.420 want monkey business. He doesn't want those human limitations, those human interests, uh, you
00:23:24.900 know, hamstringing what's going on. But with that insight, we can at least recognize
00:23:30.560 that this process is struggling to occur and we can do our best to try to steer it. As you said,
00:23:35.980 can never completely control it, can never completely stop it, but we can recognize that it is ongoing
00:23:41.080 and something that we should be thinking about. Uh, you know, interestingly, one of the things that
00:23:46.360 Land says in his work is that, you know, one of the most interesting parts of accelerationism is
00:23:51.700 it collapses decision space. It reduces the amount of time that you have to think about things because
00:23:57.580 the developments happen more rapidly and there's less and less time for you to interact with
00:24:02.480 developments before you move to the next one. So if you are not prepared, if you are not well out
00:24:07.280 ahead of this stuff, if you don't have kind of this axiomatic understanding that everything we do
00:24:13.140 needs to be for the sake of the American people, for the betterment of the American people,
00:24:18.160 then you won't have time to turn that ship when the next problem comes. And that's one thing I
00:24:24.700 wanted to bring in because I was just at the ARC conference in London, Jordan Peterson's conference.
00:24:29.520 And one of the main focuses there was AI and technological development. I found the politics
00:24:35.460 actually to be relatively, you know, kind of behind; in a lot of ways, America is
00:24:40.760 well ahead of the European trend. Most of the Europeans I spoke to there,
00:24:45.580 uh, were wishing they could catch up to what was happening in the United States politically.
00:24:50.020 Uh, but one thing that was really focused on, uh, was, was technology. A lot of speeches about
00:24:55.900 this, with Peter Thiel giving a very interesting attack on Descartes,
00:25:02.260 on Cartesian dualism. He really found that the idea that people were
00:25:10.620 separate minds was very dangerous in the world of AI, because it meant that you basically
00:25:15.680 had to attribute humanity to artificial intelligence. And so they were really
00:25:22.200 tackling some of the higher level, uh, problems that come with a lot of the technological
00:25:27.480 advancements. Eric Weinstein gave a very interesting speech which sounded like
00:25:32.820 he ripped it right out of Curtis Yarvin and just renamed everything. Like he relabeled
00:25:37.660 the cathedral as the DISC or something, and he relabeled fifth-generation warfare
00:25:42.880 as something else. But basically, there's a very big and real
00:25:48.840 concern from, I think some pretty top level, uh, decision makers, uh, on both kind of the
00:25:55.520 center and right, uh, that what is coming down the pipeline technologically is going to radically
00:26:00.440 change the nature of states, if not the way that humans, uh, live their lives. And if we're
00:26:06.940 not prepared for that, if we're not thinking about that, if we have, if we're not already
00:26:10.260 realizing that we're existing in a world where, uh, large amounts of, you know, technological
00:26:15.180 algorithmic egregores are kind of, uh, altering the way that we think about, uh, our nationhood
00:26:22.300 and our interests, uh, then we will be completely driven by a process that is inhuman.
00:26:58.060 Yeah, that's absolutely the case. So yeah, the thing with AI is very interesting as well.
00:27:04.740 I read a good book last year, The Coming Wave by Mustafa Suleyman. He was a high-up Google
00:27:10.780 AI guy who went off to found his own company. So definitely, you know, at the top of the AI game,
00:27:16.220 and he's someone who's come out and basically said, and this is very relevant to
00:27:20.700 Land as you'll see, that he thinks we should try to contain AI, but he also doesn't think
00:27:26.060 we can really do it. But he thinks the ramifications are such that you've just got to try, even if there's
00:27:30.120 a 1% chance of doing it. So it's kind of fascinating that some of these people are sort
00:27:34.900 of reaching Landian conclusions on their own, that, like, you kind of can't stop it. But if you
00:27:40.580 look at the consequences of not containing it, not stopping it, but containing it, guiding
00:27:46.280 its progress, then some of these ramifications could be pretty bad. One thing
00:27:51.720 he talks about in that book as well is that it's not just AI that we're on the cusp
00:27:55.720 of. He talks about waves of technological development, where you have a bunch of
00:27:59.980 different technologies at once that come into being. He's talking about quantum computing,
00:28:04.580 nanocomputers, obviously AI, robotics, things that over the course of our lifetimes we'll probably
00:28:10.700 see. It's not just one thing; it's a whole new era that we're
00:28:16.320 going to be entering into. And we're definitely there. I mean,
00:28:19.840 we're both millennials, I believe. I remember growing up without smartphones, and I know
00:28:24.360 that's kind of a boomer thing, all the smartphone stuff, but it has absolutely altered quite a bit
00:28:29.880 about what it means to be human. Dating, things like that, I mean, almost every form of
00:28:38.800 social participation has kind of declined, and, you know, people want to attribute it to something,
00:28:43.240 and there are certainly multiple factors, but the internet, TV, computers,
00:28:48.080 all of these are big factors. So these are things just over the course of our lifetime,
00:28:52.060 and we're not old, contrary to what the zoomers watching might think.
00:28:55.960 Yeah. Now, Auron and I are, you know, in the prime of our youths here, but
00:29:00.220 you know, it's kind of crazy to think that even in 30-something years, we've seen
00:29:04.200 new technologies get introduced and totally change things. So, yeah, over the course of
00:29:10.080 the next 50 or whatever years, we're going to see some crazy stuff.
00:29:13.720 And the question, again, I just go back to: I think we have multiple options.
00:29:19.500 It's like you're playing Civilization or something. Okay, you get to the next stage,
00:29:22.600 you still have choices of how that next stage is going to play out. And
00:29:27.640 I think the immigration angle is really an important one, because
00:29:31.280 you know, not wildly rupturing and altering your country's demographics is an essential part of
00:29:37.560 maintaining your country as a nation. Now, at this point, demanding that America be,
00:29:42.600 you know, totally ethnically or racially pure, that's just not going
00:29:47.740 to happen. There's going to be some diversity. But future waves of immigration,
00:29:52.260 that's definitely something that we can stop. And when we talk about
00:29:57.160 the inevitability of this techno-capital acceleration, we've got to be careful
00:30:03.120 not to say that certain things happening now are
00:30:08.600 inevitable parts of that process. We've got to make sure
00:30:14.280 we don't say that they are if they're not, like, you know, the Reason and Cato sorts of people
00:30:19.360 making arguments for legal immigration: oh, this is just, you know, the market, it's the invisible
00:30:23.040 hand. Okay, well, there can be short-term economic gains. But, you know, when it comes to illegal
00:30:30.260 immigration, the GDP gains go to the
00:30:33.980 businesses, they go to the illegals; they don't really go to most Americans. So you've
00:30:39.240 got to put that, you know, into perspective. But what I'm getting at here
00:30:42.900 is, you know, if we're at the cusp of this automation revolution, why are we bringing
00:30:49.720 in, like, you know, cheap laborers? Right? It doesn't make any sense. There's still room for
00:30:55.080 choosing. And in fact,
00:30:59.060 maybe the pro-tech, the accelerationist perspective is actually to have
00:31:03.500 less immigration for these reasons. With ChatGPT and the rise of AI, do we need,
00:31:09.180 you know, tons of cheap H1B workers? And the answer is no. And we probably never did.
00:31:14.500 You know, and do we need tons of cheap farm laborers? No. Go
00:31:19.400 talk to anyone who works in agriculture: even in the last decade or two, you know, machinery and
00:31:24.500 tech in general have rendered obsolete some of these positions. So maybe we need
00:31:29.860 to lean into the technology angle a little bit more. Now, with that said, there are still going
00:31:34.340 to be downsides to things like automation. Tucker Carlson, you know, I remember his
00:31:41.340 interview with Andrew Yang, they were talking about how one of the most common jobs for non-college
00:31:47.040 whites is truck driving. Okay, well, what happens when AI drives trucks? Well, you know,
00:31:53.420 Andrew Yang, in his book, thinks all these truck drivers are going to become, like,
00:31:56.580 white nationalist guerrillas trying to overthrow the government. Yeah, I don't think that's the
00:31:59.880 case, and I certainly wouldn't support anything of the sort. But, you know, to the extent that we can
00:32:05.000 account for some of these changes, there's got to be, like, something for these sorts of people
00:32:10.420 to do. And the last thing I'll say is, you know, we've got to not bring in more people who are
00:32:15.800 not going to have jobs. You know, this kind of lower immigrant underclass
00:32:21.220 is just going to turn to crime. That's kind of what happens when there's
00:32:24.820 nothing to do, and that's a big thing that happened in Europe, where they brought all
00:32:28.260 of these migrants over and, unlike in America, they're not really put to work;
00:32:31.820 they just sit around on welfare committing crimes. So I think that we can account for some of
00:32:35.800 these changes, not entirely, but you know, to the extent that we can, we should.
00:32:40.080 Yeah, it's really important that you point out the kind of tension between the technological
00:32:46.860 advancements and the immigration question, because so much of what we've heard is that immigrants
00:32:53.720 make the United States run. We just wouldn't be able to do anything without them. They, you know,
00:32:57.660 they pick all the food, you know, who will pick the cotton, right? Like, you know,
00:33:01.200 this is kind of the approach, no matter how economically callous that is. This is what we
00:33:06.420 hear from the people who are supposed to be the most compassionate people in the world is, well,
00:33:10.740 if no one's out there delivering my food, you know, um, at slave labor wages, then how will I get my,
00:33:17.560 you know, whatever, uh, you know, delivered to my apartment at three in the morning in New York.
00:33:21.820 Like this is why we need infinite immigration. But as you point out, if we are planning to phase
00:33:27.940 out so many of these jobs, if so many of these agricultural jobs, so many of these manual labor
00:33:33.560 jobs are going to be automated, uh, very quickly, then there is literally, uh, not only no reason,
00:33:40.180 but it's disadvantageous to the entire United States to do this. You are setting up a dependent
00:33:45.660 class that is immediately going to need to be given welfare, retrained. You're going to have to
00:33:51.320 find something for them to do. Or like you said, it's going to turn to crime and it really destroys
00:33:55.840 any arguments. You know, the technology really destroys any arguments ultimately, even for,
00:34:00.760 I think, uh, moderately skilled immigration, the, the stuff like the H1Bs that we're supposed to be
00:34:05.480 getting, because why does any engineer in India need to move to the United States
00:34:12.040 if I have a Skype call and email? No one has explained, literally zero people have
00:34:18.100 explained to me, why I need to physically relocate a guy who works only on digital material and can
00:34:25.440 communicate with anyone anywhere instantaneously. It
00:34:31.200 just makes no sense. And yet we're told, no, we have to move these people in. So the technology
00:34:37.000 is its own argument against a large amount of immigration. And I think that that's
00:34:43.140 pretty critical. There's a larger question, as you point out with some of the native population
00:34:48.500 of the United States, as we automate away these jobs, what do they do? But obviously if anyone is
00:34:54.380 going to get any kind of welfare or job retraining, any of these things, ultimately it should be the
00:35:00.940 people who are native to the United States. It's the people who hold the jobs, who built the
00:35:06.040 infrastructure that allowed for the automation in the first place. If there's any people who
00:35:10.180 should be supported by the government, as they try to transition to something else, it should
00:35:14.920 ultimately be those people. Now, I think there is a danger in UBI. I think ultimately, uh, that is,
00:35:21.920 you know, the grain dole, uh, it goes all the way back to ancient Rome. You know, it goes, you know,
00:35:26.900 it's a classic, uh, subsidization of a population that gets phased out. At first the Romans were farmers.
00:35:34.040 Every citizen was a farmer. But over time you got enough slave labor that the farm labor of the
00:35:41.080 average Roman became obsolete. So they had to put those obsolete workers on the grain dole
00:35:46.500 in order to support them. Uh, this just slowly degraded the quality of Roman life. So, you know,
00:35:52.860 it's not exactly the same thing, but we've seen this process before; there's really nothing
00:35:57.960 new under the sun. Even though the technological aspect is different from, say, the
00:36:03.200 consolidation of farms or whatever in Rome, we can see this pattern play out. So ultimately,
00:36:07.820 I don't think that this is a good thing and we need to find solutions for, uh, Americans who are
00:36:14.800 being impacted by this kind of advancement. But the other really interesting things that this
00:36:19.660 technology does is it allows for possibly different political organizations, right? Like one
00:36:25.920 of the reasons we currently have to centralize government as heavily as we do is that massification,
00:36:31.900 uh, and kind of volume is its own powerful effect. You need the force multiplier of just the,
00:36:39.540 the sheer scale and size in order to compete with places like China or these, you know, these kinds of
00:36:45.560 other countries. But what if the Singapore model became something that was very easy for everyone
00:36:51.780 to do? What if the technological advancements of AI and automation allow for the localization of more
00:36:59.040 capitalistic and governmental processes, because you no longer have to centralize everything across
00:37:04.700 an entire continent in order to generate the scale and advantage that it brings. These are advantages
00:37:10.020 that could come with this technological advancement and could ultimately move the interests of people
00:37:17.080 who are more nationalistically minded. You know, so it's not all
00:37:21.880 downside. There are downsides, but as you say, whether it's addressing the immigration issue or the
00:37:26.060 centralization of government or phasing out the managerial elite, there are possible upsides to this
00:37:31.660 advancement that could help people who have a more traditional mindset. Yeah, absolutely. And I mean,
00:37:37.680 we talked about how this process was kind of this monster, so to speak, and I say that without any
00:37:42.740 real, you know, value judgment, it is a very powerful force, that
00:37:48.880 was let out of the box, so to speak, during the Renaissance. Well, you know, I'm sorry, the
00:37:53.900 trads won't like this, but a lot of great stuff happened after the Renaissance. I mean, you look at
00:37:58.600 That can't be true, Patrick. It can't be true. Uh, you're a Protestant. Maybe, you know, it's,
00:38:03.840 it's okay for you to say that. Okay. My fellow Catholics will give me a hard time, but yeah, I mean,
00:38:07.360 you look at the age of exploration, like the colonial expansion, you know, the scientific
00:38:12.240 revolution, these sorts of things like, yeah, the age of like European dominance of the globe,
00:38:16.500 uh, you know, which I'm not calling for anything like that, so to speak, but you know, there,
00:38:20.780 there have been like great eras of our civilization that have in large part come from like being the
00:38:26.840 first to discover this force to harness it. And, uh, you know, so there's, it's not something that,
00:38:32.180 you know, we should, we should, uh, we shouldn't be too pessimistic about things, but, uh,
00:38:37.220 it will be a wild ride. And, uh, like I said, just even in my lifetime, just seeing the changes
00:38:41.400 brought by smartphones and so forth. But, you know, McLuhan, uh, who is the great, you know,
00:38:48.200 philosopher, intellectual of, of technology, his thing, he focused on technological mediums
00:38:53.600 and the way that the introduction of these new mediums changes society, right? So his famous phrase,
00:38:59.060 uh, the medium is the message. The point was like in introducing the TV to society will have a
00:39:04.360 greater effect on society than whatever you're playing on the TV. Not that the content doesn't
00:39:09.400 matter, but that the form precedes content and, you know, but, but his takeaway was like, okay,
00:39:14.440 once you understand these technological mediums and the, the way they change people, the way they
00:39:19.620 impact and reshape what it is to be human, then you have, like, you're able to reap the benefits and
00:39:27.440 kind of minimize to some extent, maybe not ever entirely downsides of the introduction of,
00:39:33.180 of these new technological mediums. So I think if you zoom out and apply that to technology
00:39:37.180 overall, yeah, I think that's, I think that's the case. Um, I don't think that, you know,
00:39:41.840 any, all regulation is, is bad now. And Nick Land has had some pretty explicit tweets saying that's
00:39:46.600 like a, a communist perspective to have. I'm sorry, uh, you know, Mr. Land, but yeah, so that's just
00:39:52.960 how I view it. Yeah. It's going to be, it's going to be a wild ride, but we, we do have a choice in,
00:39:56.980 in the type of technologically advanced future that, that we have. And I think that we should be
00:40:01.540 talking, uh, especially, especially discussions within this, this MAGA coalition about what we
00:40:08.620 want that future to look like. Well, we've confirmed that Patrick Casey is a communist,
00:40:13.300 uh, Nick Land, you know, you know, would, would, would recognize that, um, online, but yeah, no,
00:40:19.220 I think ultimately it is critical for us to, again, we know that the technology is coming. We know that
00:40:26.840 these advancements are going to happen. We know this process is ongoing. We know the tech right
00:40:32.480 is going to continue to play a critical role in what's happening. And it really is our job, to the extent
00:40:38.280 that we can, to turn these things towards the good, to recognize that they can be forces
00:40:45.460 for good, that they can be something that can help people, that can help the United States. But we have
00:40:50.800 to keep that focus. We have to remain vigilant, remind people that, you know, the only reason
00:40:56.900 that they're able to achieve the things they're achieving is that the American people are behind
00:41:00.820 them, that a great country and a great people are ultimately the incubator that is giving birth
00:41:05.500 to what they are growing. And I think ultimately, uh, that is a battle, uh, that can be won. Cause like
00:41:11.740 you said, I, I, you know, I disagree with Elon on a number of things, but ultimately I think the
00:41:15.980 things that he has done are good. I think he has better political instincts than many of our elites
00:41:21.660 have had for a long time. And I think that working with and helping people like him better understand
00:41:27.440 things that will help Americans flourish, uh, is critical. And so I, I think it's important to
00:41:38.240 keep up that effort. And as you said, do it in a way where, you know, we're standing
00:41:38.240 our ground, we're making a point, we're standing on principle, but also we're not burning bridges.
00:41:42.520 We recognize a friend that can be persuaded when we see one, we don't always have to go to the mat
00:41:48.040 and start a shooting war, uh, in order to make a point and, and help people understand, uh, and, and
00:41:54.160 evaluate interests that are important to different parts of the coalition. Uh, so that said, oh, I
00:42:04.300 probably should have said this earlier, but we are taping this early, so sorry, I can't answer
00:42:10.140 any questions today from the chat. If you did a super chat, I appreciate it, but I can't get
00:42:10.140 to it today. Thank you very much, of course. And Patrick, it's been great talking to you. Uh,
00:42:15.680 is there anywhere you would like people to go, anything you want people to check out, anything
00:42:19.980 you're working on? Sure. So I do the weekly podcast. Half of it is on X, half of it's
00:42:25.340 available for free on my Substack, the full thing for subscribers. They can go to
00:42:30.740 patrickcasey.com for that. They can also follow me on X at Restore Order USA. And yeah, those are the
00:42:37.300 two main places. Excellent. Make sure that you are listening to Restoring Order. And if it is
00:42:43.080 your first time on this channel, you need to go ahead and subscribe and click the bell notifications
00:42:47.940 so you know when the stream is live. And of course, if you would like to get
00:42:53.180 these broadcasts as podcasts, then you need to subscribe to The Auron MacIntyre Show on your
00:42:57.520 favorite podcast platform. If you do leave a rating or review, it always helps with the algorithm,
00:43:03.040 which obviously we know is ruling everything. Thank you everybody for watching. And as always,
00:43:08.400 we will talk to you next time.