The Auron MacIntyre Show - April 20, 2026


The Palantir Manifesto Is Wild | 4/20/26


Episode Stats

Length: 1 hour and 13 minutes
Words per minute: 175.94
Word count: 12,934
Sentence count: 318

Harmful content

Toxicity: 14 sentences flagged
Hate speech: 33 sentences flagged


Summary

Summaries generated with gmurro/bart-large-finetuned-filtered-spotify-podcast-summ.

Transcript

Transcript generated with Whisper (turbo).
Toxicity classifications generated with s-nlp/roberta_toxicity_classifier.
Hate speech classifications generated with facebook/roberta-hate-speech-dynabench-r4-target.
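For anyone curious how the per-sentence toxicity and hate speech counts above could be reproduced, here is a minimal sketch using the two attributed classifiers. The 0.5 threshold, the label names, and the example sentences are assumptions for illustration; this page does not specify the exact pipeline that was used.

    # Minimal sketch of per-sentence flagging with the two attributed models.
    # Assumptions: a 0.5 score threshold and the models' published label names.
    from transformers import pipeline

    toxicity = pipeline("text-classification", model="s-nlp/roberta_toxicity_classifier")
    hate = pipeline("text-classification", model="facebook/roberta-hate-speech-dynabench-r4-target")

    def flag_sentences(sentences, classifier, positive_label, threshold=0.5):
        # Keep sentences whose positive-class score clears the threshold.
        flagged = []
        for sent in sentences:
            result = classifier(sent, truncation=True)[0]  # {'label': ..., 'score': ...}
            if result["label"] == positive_label and result["score"] >= threshold:
                flagged.append((sent, result["score"]))
        return flagged

    sentences = ["First transcript sentence.", "Second transcript sentence."]
    print(len(flag_sentences(sentences, toxicity, "toxic")))  # toxicity count
    print(len(flag_sentences(sentences, hate, "hate")))       # hate speech count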
00:00:00.000 Before you knew what a stock was, you traded snacks, cards, turns. That instinct to trade
00:00:06.140 didn't disappear. It just grew up. With no minimums, no monthly fees, and 100 free trades,
00:00:11.940 TD Easy Trade taps into that instinct. Because you are made to trade, and TD Easy Trade is made
00:00:17.880 to help. Hey everybody, how's it going? Thanks for joining me this afternoon. I am Auron MacIntyre.
00:00:24.940 If you didn't know, Palantir is a very interesting company.
00:00:29.260 It is, of course, founded by Peter Thiel and Alex Karp.
00:00:33.740 Peter Thiel is well-known as a really influential tech billionaire,
00:00:38.520 someone who actually tends to lean right of center,
00:00:42.440 is often credited to some extent with the rise of people like J.D. Vance,
00:00:47.280 is well-known as a backer to several different initiatives,
00:00:50.760 including things like NatCon.
00:00:52.300 and he's a guy who has had a lot of influence across the tech sphere and across conservative
00:00:58.680 politics it's a rare billionaire that actually spends quite a bit in the conservative arena
00:01:05.180 however Peter Thiel also has a lot of controversy around him there's a lot of questions about what
00:01:11.700 Peter Thiel actually wants what is he trying to bring about what is all this level of influence
00:01:16.680 for obviously he wants to sell a lot of the technology he's invested in but is he also
00:01:21.760 looking to reshape the world he's released some very interesting talks and uh essays he wrote an
00:01:28.700 essay called The Straussian Moment which maybe we'll go over one day and I've also been to
00:01:34.360 NatCon when uh he was speaking and it's very clear from Peter Thiel's understanding in that conversation
00:01:41.620 he's talking about Cartesian dualism and how that set us down the road we're at now this is a guy
00:01:47.120 who understands things at a relatively high level i don't know if i'd call him a philosopher
00:01:52.020 but this is certainly a gentleman who has interrogated larger questions and isn't just
00:01:58.280 thinking about making money he has wider ambitions for the way the world should look and how it's
00:02:03.260 going to operate and how we need to respond which makes his efforts with palantir very interesting
00:02:10.020 Palantir is a company which is named after the evil spyglass, you know, like that ball that, you know, the Lord of the Rings characters look into to try to peer into the future and understand what's going on, which is ominous enough in and of itself.
00:02:25.580 But Palantir is also a company that is very invested in AI and database integration.
00:02:32.040 They've been hired on to work with places like the Pentagon, Department of Defense, ICE, Homeland Security, operating databases, allowing different government agencies to target things.
00:02:43.720 This is a company that really promises the integration of all those managerial elite functions, all the technology, all the databases, all the understanding that was otherwise kind of siloed in these different areas of the government or operating in both the private sector and the public sector.
00:03:02.060 They're going to kind of compile it all together and give the government kind of this access to this overwhelming leverage of information.
00:03:10.560 In many ways, that could be beneficial.
00:03:12.800 I'm a guy who really supports deportations, and so the ability to, say, target employers with high numbers of illegals and fine them quickly and get them deported back to their home countries with minimal amount of proceduralism, that's pretty attractive, right?
00:03:28.580 Like one of the main reasons you have trouble with deportations is the amount of proceduralism built into the system.
00:03:34.560 But if we have an AI that can search everybody out and run through where they're supposed to go and what courts to take them to and what judges and the laws that are applicable and it can just burn through all that stuff and get it done quickly, well, that's a vast reduction in manpower and a way to overcome bureaucracy in our favor.
00:03:51.040 and on the other hand we're centralizing a large amount of information in a corporation in which
00:03:55.840 we know very little and handing over every detail ostensibly in those databases to the people
00:04:02.140 controlling them and ultimately you can see how this quickly turns into leviathan how this quickly
00:04:07.400 becomes some unified techno commercial uh you know uh unification with the government that
00:04:13.740 ultimately brings us to kind of the dystopian tech future that many people have worried about
00:04:18.840 Well, recently Palantir released a manifesto and that manifesto answered some basic questions, but the way it answered those questions is very interesting. The questions themselves reveal something to us about the nature of Palantir and what their philosophy is, but it also gives us an idea of what they think the future should look like.
00:04:40.660 And so it also has a really interesting intersection with some of the people we've discussed on this channel.
00:04:46.640 Alexander Dugin and Nick Land were debating online about the different applications of this Palantir Manifesto.
00:04:55.960 And since I've had those two men debate on this channel, I thought it would be interesting to take a look at the manifesto itself and their reaction to it.
00:05:02.720 Because this one's going to be really complicated and it's going to draw on a lot of previous information.
00:05:08.200 so buckle up
00:05:09.980 if you need to go back and take a look
00:05:12.560 at some of my work from either of those gentlemen
00:05:14.520 you might want to do that it'll help you
00:05:16.640 understand this maybe watch this first
00:05:18.700 and then go back and try to fill in the gaps
00:05:20.680 but also this one's getting
00:05:22.480 controversial and I'm sorry about
00:05:24.720 that but there's really not much I can do
00:05:26.820 the stuff that we need
00:05:28.640 to discuss here is pretty important
00:05:30.340 to the future of not just
00:05:32.560 America but humanity at large
00:05:34.500 and we're going to have to step on some
00:05:36.600 toes while doing so um so if you're squeamish on that kind of stuff well you might want to
00:05:41.480 click away because we're going to be getting into it that said let's go ahead and begin reading
00:05:47.340 the manifesto and i will uh comment as we go and like i said then we'll appeal to uh nick land
00:05:54.260 and Alexander Dugin and their thoughts on it and see kind of uh what conclusions we can
00:05:59.020 draw from the whole thing so palantir and their post says because we get asked a lot uh
00:06:05.340 the technological republic in brief because they had like a piece about this but they kind of
00:06:10.740 boiled it down into these talking points from the piece and posted here on
00:06:17.680 twitter uh silicon valley owes a moral debt to the country that made its rise possible the
00:06:24.560 engineering elite of silicon valley have an affirmative obligation to participate in the
00:06:28.840 defense of the nation. Okay. Um, this is largely good, right? Um, we want corporations to feel a
00:06:36.860 duty to the nations that spawn them. We don't want them to be these global free-floating
00:06:41.740 entities that just take the resources and manpower and, uh, benefits of living under
00:06:47.920 a certain order and then turn around and ship all the jobs somewhere else, or, you know, focus
00:06:53.640 only on profit or these kind of things we want them to feel tied to the country uh defense of
00:06:59.880 its nation is an interesting phrase there at the end of it they're not just saying hey we need to
00:07:04.140 care about and give back to uh the nation that spawned silicon valley and all of these companies
00:07:09.780 we specifically need to engage in the defense of the nation and the reason palantir is saying that
00:07:15.560 is because their stuff is used across the government but very specifically is now being
00:07:20.740 used more and more in kind of armed defense or ICE operations, right? Military and ICE operations
00:07:27.480 are big, big users of Palantir technology. And so that means that they're in a controversial
00:07:34.940 space because we have a defense industry in the United States. It's not well-loved by anybody
00:07:40.880 except the politicians who tend to get funded by it. And then the people who of course have jobs
00:07:46.580 there, which that I totally understand. But that defense industry is controversial for obvious
00:07:52.840 reasons. I mean, I'm a realist. I understand that we need weapons and we need a national defense and
00:07:58.040 someone's going to have to make that stuff. And I ultimately am not going to hate anybody for
00:08:01.780 doing that. I'm a gun owner. I don't hate the companies that make my guns or my ammunition.
00:08:06.640 Why would I hate the corporations that ultimately make the things that defend my country? The
00:08:12.200 problem is that many people, I think rightly so, don't just see these corporations as people who
00:08:17.160 are providing the means of defense, but they are people who are actively participating and lobbying
00:08:22.760 for conflict between the United States and other nations to perpetually create a market for
00:08:27.980 their product. I've got no problem with the United States having a formidable arsenal with
00:08:32.520 which to defend me, my family, my relatives, my community. I do worry about Raytheon and
00:08:39.320 these other companies that seem to put significant effort into encouraging conflict on a regular
00:08:44.700 basis so that there's always a constant churn and need you know for these products to be filled up
00:08:49.860 also there's a real concern that you know a lot of these defense contractors are just building
00:08:55.500 more expensive versions of weapons that aren't really effective relative to their cost we
00:09:00.940 spend more and more with the promise that it's going to reduce casualties or in some way keep
00:09:04.760 us out of like direct ground combat that's a big thing with the united states right now but it seems
00:09:10.020 less and less that those weapons are really able to deliver on that but they are outscaled by
00:09:15.360 cheaper weapons that are being developed so every time we develop a trillion dollar weapon system
00:09:20.780 somewhere in the united states government uh you know some third world country launches a drone
00:09:25.940 that can blow it up and then it becomes a question of attrition if our entire system is built around
00:09:31.100 these incredibly technologically advanced weapons, but ultimately they can be undermined by relatively
00:09:37.420 low cost alternatives, then what we see is a country that's willing to exchange a few more
00:09:43.960 lives or endure a little more risk is able to leverage their risk factor quite heavily by just
00:09:52.580 using these lower grade weapons and outpacing us economically. That's a big concern. So guys like
00:09:59.340 palantir are trying to break into what is considered a pretty small class of tightly held
00:10:05.220 defense contracts there's only really a few major defense contractors that can ultimately keep pace
00:10:11.140 with what the united states is demanding the scale the level of uh you know uh testing and
00:10:16.540 well frankly lobbying and everything else that they have to pay into to maintain that
00:10:20.880 so interesting that you know first and foremost palantir kind of puts itself out there
00:10:26.420 as an alternative defense contractor. It sneaks it into the last part of that first point,
00:10:32.180 which otherwise is a pretty good point. I do want my companies to care about this country and be
00:10:38.280 committed to it. And there's nothing wrong with defending the country. I want them to want to do
00:10:41.660 that. But it'll be a little more obvious as we go along as to why Palantir has injected that into
00:10:47.620 that first point. Second, we must rebel against the tyranny of the apps. Is the iPhone our greatest
00:10:54.860 creative if not crowning achievement as a civilization man i really hope not the object
00:10:59.940 has changed our lives that much is definitely true but it may
00:11:06.440 also now be limiting and constraining our sense of the possible and that's probably true when you
00:11:12.020 adopt a framework uh you also adopt its limitations when everyone is trained to use a smartphone and
00:11:19.920 to live off an iPhone or Android and ultimately learns to use apps a certain way, they don't know
00:11:27.280 how to do the rest of it. If you have seen that joke, there's the one where both boomers and
00:11:34.280 zoomers are sitting around waiting for the millennial to show up and fix a computer. And the boomers can't
00:11:39.120 do it because the computers didn't really exist when they were kids and they didn't bother to
00:11:42.440 learn it at the time. And the zoomers can't do it because they existed in a world of iPads and
00:11:49.160 iPhones that already had like these controlled ecosystems, uh, where you didn't really need to
00:11:55.480 understand, like literally an infant, a child can operate these things. And so you didn't
00:12:01.400 have to grasp, like, how does RAM fit into a motherboard? How do you solder a, uh, a GPU
00:12:08.940 or, uh, how do you, uh, solder your Intel, you know, into your board, you know,
00:12:15.720 how does the gpu fit in how do you cool it like these are things that you just did not process
00:12:20.480 that was the word i was looking for couldn't remember uh but uh you know how do you do this
00:12:24.800 stuff they don't know and so they don't know how to do you know not only like the hardware stuff
00:12:29.900 but oftentimes they don't know how to you know fix an internet connection this kind of thing
00:12:34.300 it's just all built into this little ecosystem and so there's a fair point to be made here that
00:12:39.360 the creation of these like walled gardens and understandings of technology uh you know kind
00:12:45.160 of limit what we can do and to be clear to the zoomers i know there are zoomers who can work on
00:12:49.320 computers i'm just saying as a general as a generational situation the millennials had to
00:12:54.080 kind of and some of the gen xers had to kind of figure out how to use computers back when that
00:12:59.020 was something that was not widely available and it wasn't like easily represented to you i know
00:13:03.640 there are zoomers who have also done that there's just a wider number of people it's for the same
00:13:07.420 reason that you know there are more people who are older who can fix cars than there are younger
00:13:11.920 people because more and more of the ability to fix cars has kind of been sealed off and provided
00:13:17.140 only to mechanics with specific equipment or computers that kind of thing it's an intentional
00:13:21.080 gatekeeping of accessibility it's a feature not a bug free email is not enough the
00:13:28.520 decadence of a culture or civilization and indeed its ruling class will be forgiven only if that
00:13:34.080 culture is capable of delivering economic growth and security for the public okay again he's kind
00:13:39.400 of uh here attacking the idea that silicon valley doesn't really owe anything to the country and uh
00:13:45.260 well we give out free email right like that's fine he says no it's not enough like you guys are
00:13:49.320 getting rich you guys are being decadent and at this point unless you provide a service like you're
00:13:54.800 obviously the ruling class at this point and i know a lot of people don't like the idea of a
00:13:58.180 ruling class but if you've been listening to me long enough you know that's just a reality
00:14:01.160 we have to accept that ruling classes exist and always exist and so therefore the question is not
00:14:06.980 will we have a ruling class, but how will it function? Who will it provide for? Who will it
00:14:11.220 prioritize? And they're saying here, look, you can't just prioritize yourselves. You can't just
00:14:15.820 give away free email and say, well, we did our part. You have to make things actively better
00:14:21.680 for the average person or they're going to turn on you. And that is, of course, true. I don't know
00:14:26.300 if we could say the decadence will be forgiven. I don't think you should be decadent either way,
00:14:31.940 but the underlying point is correct here. We're directionally correct about the fact
00:14:35.880 that if you're going to grow, if you're going to get this largesse, right, if you're going to grow
00:14:39.540 rich and powerful off of your position in Silicon Valley, you need to be providing something
00:14:45.640 actively. And if you're not providing economic growth and security and other things that people
00:14:50.020 need to function in the real world, they're going to look at you getting rich off their backs and
00:14:53.840 they're going to be angry. That's how you get a revolt. So this is actually just a good piece of
00:14:58.360 classic Machiavellian understanding, actually, since we're also doing our parallel series on
00:15:04.180 Machiavelli and his insights, he says, never have the people be angry at you, right? Like that's,
00:15:08.820 that's critical. You have to have the people's support. You never want them to turn on you.
00:15:13.640 You never want them to see you as someone who's just leeching off them. That's bad for your health
00:15:18.140 long-term. Like literally it's hard for you to keep your head if you do that. Four, the limits
00:15:23.180 of soft power, of soaring rhetoric alone, have been exposed. The ability of free and democratic
00:15:27.660 societies to prevail requires something more than moral appeal; it requires hard power, and hard power
00:15:32.840 in this century will be built on software all right so now we're getting to the real stuff now
00:15:38.720 now we're getting to the spicy part of the meatball so yeah great you know we've talked
00:15:45.060 about freedom and democracy and liberty and all that stuff but at the end of the day hard power
00:15:50.360 is what defends these things right all the talk about principles and uh you know everything else
00:15:56.100 it doesn't matter if you can't back it up by killing a bad guy who wants to hurt you first
00:16:01.540 Right. Like that's just how the world works. So we're going to need more than soft power. We're going to need more than talking about the importance of democracy and liberty and values and all that stuff. We're going to need ways to make bad people go boom. And in the new century, that hard power is going to be built on software.
00:16:19.100 an interesting uh assumption here that you know i've talked again multiple times about
00:16:26.080 the uh dangerous nature of assuming that you will not need traditional hard power i don't know if
00:16:32.920 that's all they're saying here i'm not i don't think they're saying well you're just never going
00:16:36.380 to need troops on the ground you're never gonna need boots on the ground it can all be done by a
00:16:40.240 click of a button but there's a little bit of that being promised here either way they're right that
00:16:44.760 software will play a role in warfare that's very obviously already true we'll talk about that more
00:16:49.860 in a second and it's only going to be more true going forward so again a little bit of an ominous
00:16:54.780 statement but one that's hard to disagree with necessarily all right again we're getting to the
00:17:04.280 heart of it here with number five the question is not whether ai weapons will be built it is who
00:17:10.120 will build them and for what purpose our adversaries will not pause to indulge a theatrical
00:17:15.800 debate about the merits of developing technologies with critical military and national
00:17:22.240 security applications they will proceed all right so why are they saying this
00:17:30.340 well a couple reasons one uh palantir already develops ai weapons they developed uh an ai drone
00:17:37.340 and an AI ground vehicle for the United States military.
00:17:42.580 They also are in an interesting position
00:17:45.540 because you might have heard about Donald Trump's showdown
00:17:49.860 with Anthropic, which is an AI developer.
00:17:54.740 Some of their software, some of their assets
00:17:59.120 were being used by both the Pentagon
00:18:01.640 and other parts of the American government.
00:18:05.260 and they got in a showdown with donald trump because they said they didn't want their software
00:18:09.880 used in certain ways they didn't want it killing people they were squeamish about that
00:18:14.020 palantir has not been squeamish about that they said yeah we understand like our stuff's going to
00:18:18.360 be used to kill people we get it we're basically a defense contractor we've made peace with that
00:18:22.660 we're not doing the hippy dippy you know silicon valley you know we can't hurt anybody our
00:18:27.640 technology is only for good man like no we're fine we understand that the government and
00:18:32.500 corporations are going to work together to kill people. That's just part of doing what we do.
00:18:37.300 So we're fine with that. The showdown between Anthropic and the Trump administration was
00:18:42.300 particularly interesting because Anthropic suggested that there could be kill switches
00:18:47.880 implemented in their software, which would basically create a certain kind of override
00:18:54.300 to American sovereignty. Basically a way to say, once our software is told to do something we
00:19:01.220 don't think is moral, it's just going to shut down. So if the software is told to bomb a school
00:19:06.760 or a hospital or kill a bunch of civilians or take out a power plant or a lot of things that
00:19:12.580 have been directly threatened by Donald Trump during this war, then it won't do it. It will
00:19:18.380 stop you. Now, you might think that's good. You might think that's bad. That's not my point here.
00:19:24.000 My point is not whether or not Donald Trump should be threatening those things or the U.S.
00:19:27.500 military should be doing those things. My point here is where the decision lies. Where does
00:19:33.200 sovereignty lie? I almost did an entire episode just on the subject when it comes to AI, technologies,
00:19:39.720 war, and sovereignty. Because the really, really interesting question becomes, if AI companies
00:19:45.920 are building safeguards into their software to restrict the government's usage of it,
00:19:51.980 who's really in charge of the country especially when we're talking about kinetic warfare here
00:19:58.660 we're not talking about immoral uses of the irs or you know even surveillance or something we're
00:20:05.200 talking about how to murder a lot of people and again we find it uncomfortable to talk about but
00:20:11.140 the purpose of the state is to have the monopoly on violence they kill people so you don't have to
00:20:16.340 and so no one has to kill you that's the deal that's what many people will call the social
00:20:22.020 contract though i have many problems with that framing but the point is that one of the things
00:20:27.740 that defines the state is its ability to decide who gets killed and who does it as Carl Schmitt
00:20:33.420 put it they are the decider of the friend-enemy distinction that's what creates sovereignty
00:20:39.920 if ai gets to decide when you use your weapons and who you kill then you are no longer sovereign
00:20:49.420 then the elected government is not sovereign at some level there's an over code an alternative
00:20:55.200 system of power that rests on top of your decision making ability and the trump administration was
00:21:01.720 very aware of this especially going into the iran war so they specifically threatened and
00:21:06.720 dropped anthropic they're sort of in some kind of legal showdown right
00:21:11.040 now over this fact because they did not want anthropic ultimately being able to overturn
00:21:17.280 their decision and so we have a very interesting clash because now the question becomes will ai
00:21:26.520 developers who hard-code restrictions into their software be limiting their adaptability
00:21:34.560 or their acceptability to any given government if you're putting in these hard
00:21:40.020 restrictions on behavior okay my software will not target schools it will not target hospitals
00:21:46.620 it will not blow up a power plant it will not blow up large groups of civilians then will a
00:21:53.040 government really ever take on your ai system again why would they even if they don't intend
00:21:59.600 to do any of that stuff if you're a government it's wise to make sure that you can because
00:22:04.240 you don't want silicon valley actually having some veto power on your authority there you could
00:22:11.120 frame this in a moral sense of well you know no one elected those people no one elected silicon
00:22:16.020 valley so why do they have authority that's the democratic argument i don't really care much about
00:22:21.160 democracy but people will make that one the other one is just from the pure power uh you know
00:22:27.500 direction if you are in charge of a country even if you don't intend to do any of the bad stuff
00:22:32.880 that a company like Anthropic is banning,
00:22:36.560 why would you let them have
00:22:38.280 any level of authority over you?
00:22:40.220 Why not just buy the software
00:22:41.620 that doesn't have that stuff, right?
00:22:43.900 And so this gives an edge
00:22:45.160 to defense contractors
00:22:46.380 and AI companies
00:22:47.660 and technology companies
00:22:49.220 who are willing to forego
00:22:50.900 inserting their own morality
00:22:52.920 into these systems.
00:22:55.040 But of course,
00:22:55.980 and we'll talk about this a little more,
00:22:58.380 AI already has a morality,
00:23:00.060 has to have a morality programmed in.
00:23:02.880 So AI only can ever reflect, at least at this moment, maybe that will change. I'm not someone who's entirely convinced that AI cannot achieve some level of consciousness. However, at the moment, AI can only reflect what gets put in.
00:23:19.460 And so if it's not Anthropic's restrictions, it will be someone else's assumptions. AI has to operate on some basic interface with reality that is not entirely material.
00:23:34.520 and that's something that ai people don't talk a lot about but there are a lot of non-material
00:23:39.880 assumptions made in the creation of these things and those things will carry on some level of
00:23:47.940 worldview into what they do so even if you're not explicitly hard coding anthropic style
00:23:53.740 uh you know restrictions into any given weapon system the logic the basis the decision making
00:24:01.240 capacity of these different systems will inherit in at least some degree the beliefs of their
00:24:08.040 creators and so it really matters who makes your ai now there's another interesting part we're
00:24:14.020 going to spend a lot of time on number five it's the most important one there's another interesting
00:24:17.660 part of this right is this assumption of uh kind of a gradient on which history must move
00:24:24.360 because a technology can be created and because someone else can create that technology,
00:24:31.820 you must create that technology and control it. And to be fair here, this assumption is in pretty
00:24:37.160 good standing. This is a pretty firm foundation, if we want to look at someone like
00:24:42.880 Bertrand de Jouvenel talking about the necessity of accelerating the power of your state
00:24:52.580 comparatively. Or you could talk about Jacques Ellul and his understanding of the tank
00:24:57.860 problem, right? These guys are two different thinkers in two different traditions, you know,
00:25:02.480 working separately, but coming to the same conclusion that once a technology or an advancement
00:25:08.320 or a centralization of power has taken place, everybody else has a scramble to figure out how
00:25:13.120 to do that too. And that's why so many people get confused when they talk about fascism and
00:25:17.700 communism and liberal democracy when you really look at it all of these systems pre-world war ii
00:25:24.220 are simply being created to centralize state power like fdr hitler and stalin are all trying
00:25:30.780 to do the same thing to some degree and so you can prefer it with a you know democrat liberal
00:25:38.300 democracy uh you know flair or a nazi flair or a communist flair but either way the goal is the
00:25:45.360 same, like they're going to the same destination, which is centralized control of economic production
00:25:50.940 and consumption, the ability to wield mass production in warfare, the ability to mass
00:25:56.700 conscript large parts of society and the ability to indoctrinate large parts of society in an
00:26:01.620 ideological crusade. These are all key features of managerial, you know, centralizing regimes
00:26:07.960 that arose in the 1920s to 1940s period of the world, right?
00:26:16.640 And this is why debates over the different forms at that time can get very confusing.
00:26:22.760 All that to say, this assumption, while well-founded, does have its own kind of hyperstition, right?
00:26:31.080 It creates a thing that must bring itself into being.
00:26:34.440 And so by this logic, we can never say no to technology. We always must create, exploit, and control the aspects of the technology rather than saying, actually, maybe we shouldn't go there. Maybe we shouldn't make this. Maybe this is bad for humanity.
00:26:52.540 And to be fair, given my observation of human nature and the path we're on historically, that seems to be correct. I don't like that. We'll talk about that a little more in a second, but I'm not going to lie about the observable facts to make myself feel better.
00:27:08.980 it does seem that whether we like it or not we are on this kind of inevitable runaway train of
00:27:15.560 technology and it's going to take us to this place and so there's a sound if unsettling internal
00:27:22.780 logic to point five though again it opens up so many other very important and dangerous questions
00:27:28.960 all right number six national service should be a universal duty we should as a society
00:27:39.140 seriously consider moving away from an all-volunteer force and only fight the next war if
00:27:43.720 everyone shares in the risk and cost okay so there's a lot on number six too like i said
00:27:50.080 this this thing is not shy i'll give them that they they did not pull punches here
00:27:54.920 so national service should be a universal duty you must serve in the military
00:28:00.000 now a lot of people will uh have some kind of sidestep of this they'll say like it can be some
00:28:06.520 kind of public service you know be part of the peace corps something you know help in some
00:28:11.440 way the nation and some kind of force that'll clean up parks and you know otherwise take care
00:28:16.560 of the homeless or you know reform our infrastructure or something like so the people
00:28:21.320 who can't serve or don't want to serve in the military could do that, but everyone should
00:28:24.920 serve. Now, at some level, I'm sympathetic to this. I think we are a nation that has largely
00:28:31.620 become unmoored from its shared identity because we don't serve that society. And you think of
00:28:39.660 lots of, you know, other societies like South Korea or Israel that have mandatory two-year
00:28:46.220 service. Everybody has to have that experience. They have to be forged in that fire of putting
00:28:52.440 on the uniform, learning to drill with a bunch of people, possibly go into combat, have a shared
00:28:58.680 military culture, a shared indoctrination. And I say that in a neutral sense, like we're all
00:29:03.680 indoctrinated. That's just how life works. And so there's this understanding that this can be
00:29:09.880 good for society. In fact, I've even made this case several times that a republic really can't
00:29:15.800 function unless you're only really handing citizenship to people who would fight like
00:29:22.100 starship troopers is not fascist it's republican in the most classic sense from aristotle to
00:29:30.220 machiavelli to the founding fathers a republic was made up of people who would fight for their
00:29:34.980 country and so in a sense you could say we're creating the conditions that would return
00:29:41.880 to a republic the only problem is in all of those uh formulations of the republic the choice of
00:29:49.720 serving was usually still voluntary you weren't compelled to fight but if you wouldn't fight if
00:29:56.220 you wouldn't be part of the military you are lower class in the roman uh
00:30:03.300 military for instance in the early republic you had to buy your own armor your own sword your own
00:30:09.940 horse in fact this is where we got the equestrian right they were the people who could afford a
00:30:16.020 horse the equestrian class so they could fight on horseback they could be part of
00:30:20.820 the cavalry right that's what differentiated them and if you were honorable enough and had
00:30:26.280 enough means to serve in the military you could become a true citizen a true guy who could climb
00:30:32.840 the socioeconomic ladder so there is a certain level of voluntary nature yes you have to fight
00:30:39.920 to be a true citizen and to earn the honor and the ability to advance and have a real say in society
00:30:45.300 but you didn't have to do that again just like in starship troopers there's there's a civilian
00:30:50.600 and there's a citizen and the difference is the citizen chooses service they choose service
00:30:58.540 what palantir is suggesting here is something a little different in fact very different compelled
00:31:04.920 service a draft a universal draft for two years that's automatic not the kind of draft we
00:31:10.580 have now where theoretically if we got into a war and it got real bad you could ultimately summon
00:31:15.360 people we're talking about a hard everybody serves draft right very different thing again
00:31:22.820 i can understand the value of this but you can also see the problems along with you know all
00:31:29.640 the things of just forcing people to do something they don't want to do and compelling all these
00:31:33.040 people and everything that comes with it you know we the last time we saw you know a serious draft
00:31:38.420 was obviously vietnam and all of the civil strife that it created we haven't really called a draft
00:31:43.560 since then uh however in addition to that obviously we're also creating immense government
00:31:49.940 power we're giving incredible power to the state to compel everyone to fight everyone to enter the
00:31:55.900 military that's a big increase in the power of the government now again i'm not a libertarian
00:32:03.160 i'm not like oh no the government will have some level of power but right now my government is uh
00:32:09.760 well fighting wars in places that don't really matter so much right now for the sake of countries
00:32:15.980 that are not our own. And so I don't know if I want them to be able to compel everyone I love
00:32:21.480 and care about to go fight in that. I'm too old at this point. I have too many health concerns.
00:32:26.340 I'm not going to be pulled into the front lines. So I don't have to worry about this. But that
00:32:30.820 doesn't mean I shouldn't care about the next generation and what they'll be forced to do
00:32:34.560 and who they'll be forced to do it for. So if I had a government that really, truly cared about
00:32:39.460 my country and really put us first in every scenario, I'd be a lot more comfortable with
00:32:45.040 handing this power over the military over to the permanent government. But since I don't have that
00:32:50.260 scenario, I'm a little more concerned. I think that's reasonable to ask questions about that.
00:32:58.280 Number seven, if a U.S. Marine asks for a better rifle, we should build it. And the same goes for
00:33:03.340 software. We should, as a company, be capable of continuing to debate about the appropriateness of
00:33:07.860 military action abroad while remaining unflinching in our commitment to those we've asked to step into
00:33:12.780 harm's way. All right, so this one's pretty reasonable, right? Even if we don't agree,
00:33:17.400 even if we're debating about whether or not a military conflict is justified or not,
00:33:21.880 once a person's in harm's way, they should have the most effective weapon for the job.
00:33:26.060 I think that's pretty reasonable. Number eight, public servants need not be our priests. Any
00:33:31.700 business that compensated its employees in the way that the federal government compensates public
00:33:36.220 servants would struggle to survive. So yes and no. So the implication here is the reason that
00:33:43.080 public servants receive less is because they're performing some kind of duty. It's a sacrifice. And
00:33:47.960 so therefore they're like a priestly caste. And that creates the scenario, which has some
00:33:53.020 downsides. All of that is true. However, we can't really compensate public servants at private
00:33:59.160 rates. Like we just don't have the tax dollars. It would be outrageous if we tried to do that.
00:34:06.220 Again, not that I don't think there's probably some level of validity to this concern, but I just don't see how we can continue to have a bureaucracy at this scale and compensate them at a private level.
00:34:20.120 Um, that increase in competition might be good for those public services. I get that. Uh, but if we're going to maintain the bureaucracy now, one of the things AI can do and might be able to do, and this would be one of my favorite things AI could do is replace the managerial regime.
00:34:37.320 If we don't need that many people anymore because we have AI doing all this paperwork, then we're in a fantastic position because ultimately we can then downscale, you know, all of these different services of the state.
00:34:52.700 That would be fantastic.
00:34:54.400 If we could get rid of so many of the people who, you know, hate the country but are encoded in the permanent bureaucracy.
00:35:00.720 So if that's what AI does, then fantastic.
00:35:03.120 but just kind of paying the people who are already in the government more uh i don't know
00:35:09.520 that that's going to solve our problem i understand the idea is that ultimately you know salaries
00:35:14.320 attract you know better people but again you look at places like the roman empire or rather the roman
00:35:20.220 republic and uh you know public offices were things that cost you money yes you were increasing
00:35:27.240 in honor but you actually had to pay out of your pocket to like run the games or the feasts or
00:35:32.240 other things you were in charge of and so in a way public service was a direct sacrifice and that
00:35:37.420 was seen as a great honor perhaps that's a better system but ultimately you can see where the logic
00:35:42.500 comes from here even if i don't necessarily agree with it number nine we should show far more grace
00:35:47.780 towards those that have subjected themselves to public life the eradication
00:35:51.480 of any space for forgiveness a jettisoning of any tolerance for the complexities and contradictions
00:35:55.880 of the human psyche may leave us with a cast of characters at the helm we will grow to
00:36:02.080 regret okay there's some good points in this but also something to be cautious
00:36:06.740 about yes uh to some extent uh when we eradicate the idea of forgiveness tolerance when we expect
00:36:14.720 all of our public servants to be saints whose lives are completely subject to scrutiny
00:36:20.120 and any mess up is a humiliating disaster that drives them out of service then we're gonna get
00:36:25.880 people who are either like incredibly squeaky clean, which most of them aren't, or people who
00:36:31.620 have a high tolerance for public humiliation and deception. And I think we're getting a lot more of
00:36:36.460 the latter there. And so the question becomes, can we like, I guess, lower our standards for
00:36:44.700 public servants? I mean, our public servants are already terrible. Our politicians are already
00:36:48.720 horrible. But at the same time, it's more about what they get punished for. So I would like my
00:36:55.740 public servants to be more moral. I would like my politicians to be more moral. The problem is
00:37:01.280 that politicians get punished for things like, you know, using a naughty word one time or having a
00:37:08.040 bad Halloween costume instead of, you know, like laundering millions of dollars or starting wars
00:37:16.220 that we shouldn't belong in or carrying on, you know, uh, sordid affairs for years that they use
00:37:23.300 public money to cover up like these are things you know having an entire epstein ring i think
00:37:31.020 comes to mind for a lot of people so the problem isn't so much that we are punishing the bad
00:37:36.000 behavior of bad people uh as it is that we are uh selectively choosing what to enforce and uh
00:37:44.300 the things we enforce are not great and the things that we allow are horrible and so yeah we
00:37:49.540 do reliably produce people of a certain character but i don't know if it's as cut and dry as they
00:37:54.860 make it here the psychologization of modern politics is
00:38:01.700 leading us astray those who look to the political arena to nourish their soul and sense of self
00:38:06.460 who rely too heavily on their internal life finding expression in the people they've never met
00:38:12.180 will be left disappointed i would just replace this with the you know the false religion of our
00:38:18.640 modern politics though in many ways psychology is our modern false religion so fair to them here
00:38:25.520 but i think this is a good point in general right like people don't find meaning in their homes or
00:38:29.840 their families or their communities or their churches their religions their connections
00:38:33.560 anymore and so instead they get connected to this like big government project oh we're out there
00:38:39.380 uh spreading democracy somewhere we're saving kids in africa somewhere right rather than you
00:38:44.480 looking down the street and taking care of the people in your community. And this creates a real
00:38:49.500 like demented fixation on politics and political projects and things that can be achieved through
00:38:56.480 the government. So I do understand that point. Number 11, our society has grown too eager to
00:39:02.520 hasten and is often gleeful at the demise of its enemies. The vanquishing of an opponent is a moment
00:39:08.040 of pause not rejoicing um okay uh i guess internally that's true you don't want people excited
00:39:18.760 i don't know i don't know how to take this one um i guess there is a certain level
00:39:25.680 of respecting your enemy in combat that you should observe i think that's true um there is a certain
00:39:32.060 level at which if your society is operating properly you can disagree and even vanquish a
00:39:37.040 political opponent without hating them. Uh, that should be true though. We have shifted our mode
00:39:41.480 of politics to ignore a large amount of that. So, uh, you know, this, this one's a mix. I get where
00:39:46.220 they're going. Um, but it's not specific enough to have any real value. 12, the atomic age is
00:39:51.620 ending. The old atomic age is ending and a new era of, uh, deterrence
00:39:56.880 built on AI is set to begin. This is probably true to some degree. Uh, obviously the Cold War
00:40:02.380 and the kind of bifurcated order that came from the U.S. versus the Soviet Union was largely drawn
00:40:09.200 along these lines of mutually assured destruction. We can nuke you, you can nuke us, nobody else can
00:40:13.960 touch that level of power. So we're the only two players that really matter, and that's really what
00:40:18.400 creates the deterrence to going to full-scale war. We've seen that wane over the years. So in a way,
00:40:25.300 the atomic deterrence is similar to the Second Amendment in the United States. Yes, theoretically,
00:40:30.540 we have guns in the United States and we could revolt against our government. That's what it's
00:40:35.940 for. That's an enshrined right. Ultimately, however, that's not really how it tends to play
00:40:41.120 out. Obviously we saw COVID and all kinds of other crazy stuff be done by our government
00:40:45.820 and nobody responded. Nobody revolted. There was no refreshing of the tree of liberty. None of that
00:40:53.980 stuff happened. And we're starting to see something similar in the atomic age where
00:40:58.860 two powers with atomic weapons like say India and Pakistan can kind of fire at each other and have
00:41:06.220 like low-grade conflict and until it gets existential they still don't use nukes so
00:41:11.780 there's like a certain level of combat that is allowed to occur under nukes and people keep
00:41:17.280 pushing that further and further every year so the question is how far could you really go before
00:41:22.240 people would use nukes a lot of people were you know theorizing that Iran would retaliate with
00:41:27.840 nukes if they have them, or Israel was going to use nukes in this war, or even the United States
00:41:32.240 might use nukes in this war because they might get desperate enough. None of that has happened
00:41:35.400 and none of it's going to happen. I think all of that is a lot of bluster. And so I think we're
00:41:39.960 seeing that more and more the atomic age is one that is waning in some degree. People are eating
00:41:46.820 around the edges of that deterrence. And AI will perhaps be, you know, if you launch your AI weapons,
00:41:53.200 We'll launch our AI weapons. If you use AI to attack our mainframes, our databases, our
00:41:59.140 infrastructure, we'll do the same to you. That could be a standoff. That could be true. I don't
00:42:04.220 know for sure, though. No other country in the history of the world has advanced progressive values more
00:42:10.020 than this one. The United States is far from perfect, but it's easy to forget how much more
00:42:14.180 opportunity exists in this country for those who are not hereditary elites than in any country on
00:42:20.300 the planet uh that's an interesting choice of phrase and sadly it's probably true no other
00:42:25.740 country in the history of the world has advanced progressive values more than the united states
00:42:29.560 yeah i don't like that though like i don't want to be the army of the rainbow flag and sadly i understand
00:42:36.420 what they're saying about hereditary elites but like we've got plenty of those in fact we're
00:42:41.380 increasingly uh you know kind of calcifying our elites into those who were born into that level
00:42:47.700 of privilege or can in some way gain the favor of those that have. So even in our very progressive
00:42:55.740 values, we seem to have still found our way back to aristocracy and oligarchy. So I am skeptical
00:43:03.520 of number 13. 14, American power has made possible an extraordinarily long peace. Too many
00:43:09.780 have forgotten or perhaps take for granted that nearly a century of some version of peace has
00:43:15.500 prevailed in the world without a great power military conflict for at least three generations
00:43:20.060 billions of people and their children and now grandchildren have never known a world war
00:43:24.160 this is undeniably true the american uh ascension and the eventual collapse of the
00:43:31.320 soviet union created the pax americana right in the same way that the roman empire secured
00:43:36.600 a relatively global peace in its time uh we do the same however you know the romans were never
00:43:43.020 really at peace for long they're always fighting somewhere but they did prevent perhaps larger
00:43:49.220 scale attacks as you know it was famously said the romans create a desert and call it a peace and
00:43:54.920 you could probably say the same about the american order to some degree but it is certainly true that
00:44:00.120 the large-scale control of like kind of the global economy and possible military conflict
00:44:05.520 does create a certain level of peace um maybe it's not worth it maybe at some level uh
00:44:12.580 you have to adopt the reality of conflict as a reality of sovereignty and the ability to live
00:44:19.140 as a people rather than relying on some unified peace from a faraway empire. That's for, I guess,
00:44:25.940 many of us to decide. That's a valid philosophical question, but it's hard to argue with the reality
00:44:31.940 of number 14. 15, the post-war neutering of Germany and Japan must be undone. The defanging
00:44:38.940 of Germany was an overcorrection for which Europe is now paying a heavy price. A similar and highly
00:44:44.160 theatrical commitment to Japanese pacifism will, if maintained, also threaten to shift the balance
00:44:49.440 of power in Asia. So this is interesting because this is kind of the opposite of what they just
00:44:53.740 said. Actually, we don't need a world hegemon. We want to re-empower these regional defenders.
00:45:00.020 Right. And part of the reason is that America just can't keep up with places like China by itself.
00:45:03.780 And so a re-militarized Japan reduces the threat of, you know, China kind of running wild in the Far East. It gives a natural kind of pushback, a classical adversary for China to kind of have to spend its resources on. Germany can also prove to be that at some level.
00:45:24.200 uh so it's kind of interesting that 14 and 15 act in opposition to each other now i guess the
00:45:30.220 assumption here is that germany and japan will remain allies of the united states and therefore
00:45:33.660 they will continue to serve the global hegemon but just be more powerful uh in that way
00:45:40.220 and perhaps they would uh but it does give them a certain level of sovereignty it does reinvest
00:45:45.040 them with the ability to kind of defend themselves fight back make their own decisions
00:45:50.020 at some level be equal players when it comes to global defense and so that does shake up the world
00:45:56.140 order including american hegemony at some level we should applaud those who attempt to build where
00:46:03.060 the market has failed to act the culture almost snickers at musk's interest in the grand
00:46:08.480 narratives as if billionaires ought to simply stay in their lane of enriching themselves
00:46:13.820 any curiosity or genuine interest in the value of what he has created is essentially dismissed or
00:46:19.000 perhaps lurks beneath the thinly veiled scorn. So interesting that, you know, basically
00:46:28.220 we're concerned that billionaires aren't, uh, investing significantly in like the social
00:46:33.920 implications or ignoring, uh, kind of like the cultural societal, uh, implications of what they
00:46:40.100 built, uh, celebrating them, wanting to be able to, uh, impact them. Uh, if you're a tech company,
00:46:45.680 obviously you can understand why this is uh attractive to them 17 silicon valley must play
00:46:51.580 a role in addressing violent crime many politicians across the united states have essentially shrugged
00:46:56.180 when it comes to violent crime abandoning a serious effort to address the problem or take
00:47:00.100 on any risk with their constituencies or donors in coming up with solutions and
00:47:04.940 experiments that could help in a desperate bid to save lives so again interesting that they're
00:47:10.100 basically inserting themselves into the sovereignty conversation we need to give back to the community
00:47:14.740 yes good and the way we're going to do that is to crack down on violent crime now i'm huge on
00:47:20.420 cracking down on violent crime i think that's very important but they're obviously not going
00:47:23.820 to send out like palantir policemen what they mean is you know we can use algorithms to help you
00:47:29.540 know inform law enforcement these kind of things now this is also interesting because um in many
00:47:38.220 ways what these algorithms do is really just do what police already know how to do i've had
00:47:43.340 this conversation with so many police officers it's almost comical at this point but every time
00:47:47.620 i poll a police officer or have a conversation with a police officer they always
00:47:51.960 say the same thing we know where the crime is we know who's doing the crime we're just not allowed
00:47:56.120 to go get them because of politics there's too much procedure there's too much you know there's too
00:48:01.020 much yelling about minorities or disproportionate uh you know crackdowns those kind of things and so
00:48:06.520 we're just not allowed to go get them we know where it's happening right the algorithm basically
00:48:10.820 gives you a way to point to something above you that's making the same observations you could
00:48:16.120 anyway oh well it's not me palantir said this is where the crime is happening palantir says this
00:48:21.260 is who shoplifts and so they're not allowed in the store anymore palantir says this is where the
00:48:25.260 drugs are sold and so this is where we're putting all our efforts if it just happens to be uh
00:48:29.960 people of one race or with a certain demographic profile well that's not our fault the palantir
00:48:36.360 said it and so we can point to palantir and say they're the ones that did it that's what
00:48:39.980 actually happens here. Again, maybe that's a sidestep, but it is just pushing the whole ball
00:48:46.840 down, you know, or kicking the whole can down the road, right? Like we're just handing over
00:48:52.000 responsibility to some other entity for noticing the thing that we should be allowed to talk about
00:48:57.620 anyway. That's obviously what's being implied here. I don't know how else you would use Palantir's
00:49:02.100 services. So interesting that that's the approach. The ruthless exposure of the private lives
00:49:09.900 of public figures drives far too many talents away from government service and the public arena and the
00:49:14.940 shallow petty assaults against those who dare to do something other than enrich themselves have
00:49:19.400 become so unforgiving that the republic is left with a
00:49:25.460 significant roster of ineffectual empty vessels whose uh ambition uh one would uh forgive if there
00:49:33.000 were any genuine belief structure lurking within this is obviously true right like again as we
00:49:38.760 said previously the way we operate in our politics means that only kind of the most uh you know
00:49:44.720 the people who would be least humiliated by their public lives being exposed the most shameless
00:49:52.220 they're the only ones who can really go into politics at this point right if you're if you
00:49:56.660 don't have incredibly thick skin if you're not donald trump and you're like yeah whatever i
00:50:00.880 slept with some women yeah whatever i've cheated on my wives before yeah whatever yeah
00:50:05.880 I've made some shady deals, whatever. What are you going to do? Right? Like this is, this is how
00:50:09.200 it is. If you don't have that mentality, you just can't get into politics. If you're someone who
00:50:14.160 could be shamed out of your position, you will be shamed out of your position. And there's an entire
00:50:18.600 industry, especially of left-wingers, all hunting down ways in which they can do this. Right? So we
00:50:24.300 can understand why this is a concern. At the same time, I still find that all of my politicians are
00:50:28.860 incredibly immoral. So do I want to reduce, ultimately, the things I
00:50:35.040 want them to do or the things I want them to be? Do I need to get rid of my standards so I can have
00:50:39.100 more competent politicians? I mean, it's a hard decision. I think the real culprit here is
00:50:44.100 actually just the media cycle, the social media cycle. When you don't have the constant ability
00:50:49.460 to look into every aspect of a politician's personal life, there's no incentive structure
00:50:54.020 to do so. So when you didn't have the ability to constantly see and hear and take pictures
00:50:59.320 and harass and stalk people on social media and run down bank accounts and all these
00:51:04.960 things it was really hard to find all these little things that people did that was immoral
00:51:09.000 now they're everywhere and they're easy to find and there's this constant churn so unless you
00:51:13.680 have done none of that stuff which is very few people these days or you are willing to have that
00:51:19.340 exposed and you don't care you're not going to go into government service that part is true
00:51:23.100 but i think this is more of an existence-of-social-media-and-media issue and an
00:51:29.360 existence-of-democracy issue, and since i don't see any of those things getting addressed anytime soon,
00:51:33.660 i think point 18 is probably moot, even if there's some value in it. number 19: the caution in public
00:51:39.680 life that we are unwittingly encouraging is corrosive those who say nothing wrong often say
00:51:44.880 nothing much at all again totally agree it's so important these days for republican politicians
00:51:50.440 to shut their mouth on anything that's important man it's my job to be a commentator to tell you
00:51:56.860 the truth and even i sit there and think to myself should i say this is this too far even if i know
00:52:02.200 it's true should i be careful with my words here right and like we have created a culture of
00:52:07.320 safetyism. you're only valuable, you're only eligible for public life, if you've been insanely safe the
00:52:12.720 whole time, and usually only safe in one direction. if i was on the left i could run around saying
00:52:16.940 yeah i'm a marxist, yeah shoot all the capitalists, yeah whatever, trans all the kids, chop
00:52:22.180 all their junk off, who cares, and i would be electable in most of the places that democrats vote.
00:52:26.860 i could get hired in most companies i could work in government agencies but if i said like
00:52:31.440 hey the preservation of european peoples inside their countries is a worthwhile thing
00:52:37.300 which should be a very innocuous statement, that could destroy your life publicly. you could
00:52:42.600 never get hired again so it's very obvious that it's not just cautiousness but cautiousness all
00:52:47.800 in one direction. number 20: the pervasive intolerance of religious belief in certain circles must be
00:52:54.980 resisted the elite's intolerance of religious belief is perhaps one of the most telling signs
00:52:59.440 that it is a political project and constitutes less an open intellectual movement than many
00:53:05.460 within would claim. yeah, can't argue with that, right? like the fact that being religious
00:53:11.420 is seen as some kind of downside especially as we ask our politicians to be more moral
00:53:15.740 is ridiculous, and so i just kind of agree all the way around with that one. number 21: some
00:53:23.620 cultures have produced vital advances, others remain dysfunctional and regressive. all cultures
00:53:29.360 are now equal; criticism and value judgments are forbidden. yet this new dogma glosses over the
00:53:35.600 fact that certain cultures, indeed subcultures, have produced wonders; others have proven middling and,
00:53:42.220 worse, regressive and harmful. basically just saying cultural relativism is wrong: there are cultures
00:53:48.180 that are better than other cultures and we should know that and we should know that when we're
00:53:52.700 bringing people in, or putting them in positions of power, or discussing whether or not they
00:53:56.740 should become citizens. We should factor that in. Haiti is not equal to Germany. Somalia is not
00:54:04.140 equal to the Afrikaners in South Africa. These are not the same thing. The Japanese are not equal to
00:54:12.720 India. These are not the same cultures, and we should be able to acknowledge them, their strengths
00:54:17.720 and their weaknesses, what is superior and what is inferior, and we should be able to factor that in
00:54:21.320 every decision. number 22: we must resist the shallow temptation of a vacant and hollow pluralism.
00:54:27.980 we in america, and more broadly the west, have for the past half century resisted
00:54:34.060 defining national cultures in the name of inclusivity. but inclusion into what? again,
00:54:40.020 denouncing the idea that everything is the same that the goal should be pluralism that we should
00:54:45.000 all be included what are we including in people into what matters if we think that the west is
00:54:50.860 just about inclusivity, then what is the West? If the West is just a collection of every other
00:54:55.220 people in the world, then it's nothing. It's just a microcosm of the world. And the rest of the
00:54:59.400 world is not great, actually. I don't really want to live in most of the rest of the world.
00:55:04.820 So why would I want them to be here? Why would I want me to be inclusive of their ways of life
00:55:11.100 or somehow value that more than our own way of life? So that one, pretty defensible.
00:55:17.220 All right. So you get the idea here, kind of a general manifesto on warfare, foreign policy, maybe some wokeness, some DEI, some multiculturalism, how we should approach politics, public life.
00:55:33.580 All very interesting. But here's what's more interesting.
00:55:37.320 And I saw this and I knew where this was going to go. And then I saw these people commenting on it.
00:55:41.700 So. Here's Nick Land.
00:55:47.220 and he's quoting alexander dugin. dugin says: the palantir manifesto
00:55:53.540 is the plan of western techno-fascism. the superiority of the white race based on technology.
00:55:59.140 no anti-semitism, no sacredness, no socialism of old. higher historical fascism, this time
00:56:06.200 pure capitalist, jew-friendly, profane, materialist, anglo, post-human.
00:56:12.060 nick land's response: sometimes i think alexi is just talking to me. insane hubris, of course.
00:56:19.640 so he recognizes that dugin recognizes part of what's going on here. now, you're not going to like
00:56:27.120 a lot of how dugin characterized this, and understandably so, because dugin hates this and
00:56:32.720 he hates the countries that created it. dugin's anti-western for this reason. this is why i've
00:56:37.880 warned: dugin has a lot of insights that are useful, but you don't want to go full dugin, you
00:56:41.960 don't want to buy into his program. that's not what you should take away.
00:56:46.940 glean the insights you can glean, but you should not then operationalize his ideology,
00:56:53.320 that's not what you're looking for here. so let's break down what he said here, because I
00:56:58.060 think it's very important. the Palantir Manifesto is the plan of western techno-fascism. I mean, when
00:57:06.540 you look at palantir's quotes there it's not wrong to say that right like palantir is integrating
00:57:16.260 corporations at every level with the state they're trying to find a way to use technology
00:57:22.440 and corporations to kind of solidify a centralized american and western identity.
00:57:29.120 they're using their technology to operate the leviathan of a global empire
00:57:37.000 at scale but make it more efficient and perhaps strip away some of the managerial aspects
00:57:43.340 so i mean, is it fascism with technology as an enabler? kind of. i mean, i think that there
00:57:55.640 are a lot of countries doing this you know this is uh also to some degree what china is doing
00:58:00.680 china's not communist anymore. they have a communist government, they have the
00:58:06.620 trappings of communism in theory, but in practice they have wide-scale capitalism, they
00:58:12.520 have commercial zones, economic opportunity zones. the only big difference is the
00:58:19.120 government decides, to some extent, what you produce, and it owns certain percentages of
00:58:24.060 companies. and, oh wait, that's fascism, right? China's fascist. it's not communist, or it's closing in on
00:58:32.000 fascism at least and there's a reason that most large-scale governments are moving this direction
00:58:37.840 no matter what we call it, right, because this is the only way to manage societies at this large
00:58:45.220 of a scale communism has failed it doesn't work and capitalism under you know liberal democracy
00:58:52.000 worked to some degree, but it has some characteristics that ultimately lend to it
00:58:55.720 tearing nations apart. So if you want to re-nationalize your economy
00:59:03.920 and its output, a nationalism of a socialist variety, some might say, well, then you're
00:59:10.960 going to get techno-fascism, right? That's kind of the idea. What's the next line? The superiority
00:59:17.000 of the white race based on technology. this is a long-standing assertion by many, including people
00:59:23.440 like oswald spengler. oswald spengler wrote a whole book about how western peoples
00:59:31.720 are basically giving away their technology and therefore giving away their dominant feature,
00:59:36.840 technology, what was going to allow you to dominate the globe, out-compete other
00:59:42.000 civilizations, and stay superior to them. now we cringe at this formulation of the
00:59:48.700 superiority of the white race, but dugin is locating the west as the center of what would
00:59:59.180 often be called whiteness. this is what many woke people do, which is why people often correctly note that
01:00:05.020 dugin can be woke. like, that's true, i'm not here to defend that. but the idea that you're
01:00:12.520 going to use technology as your superior tool that's an old understanding and one that's being
01:00:17.760 re-emerging here, and one that obviously palantir is pushing forward. no anti-semitism,
01:00:24.140 no sacredness, no socialism of old historical fascism. so he's saying this is techno-fascism,
01:00:30.100 but this is not nazism this is not old school national socialism this is not even italian
01:00:35.860 fascism this is something different so he's so he's not saying oh this is all just the return
01:00:40.740 of hitler. it's not the sjw woke 'oh well, everything is hitler.' he's saying no, this is
01:00:47.180 a fascism without anti-semitism, but also without the sense of the sacred, without historical
01:00:54.160 socialism or historical fascism. So it's got fascist characteristics, but it's transcending
01:01:01.240 many of the old, more rooted understandings of fascism, which is itself a, at the time was a
01:01:09.620 very modern construction. It's discarding these old ethnic hatreds. It's discarding these old
01:01:15.640 ideas of the sacred. It's discarding any of these roots and instead transcending to something
01:01:21.860 different. This time, it's pure capitalism. It's friendly to the Jews. It's profane, materialist,
01:01:28.840 Anglo, and post-humanism. And in case you're wondering, these are all things that Nick Land
01:01:34.860 has argued for. Palantir is building the world Nick Land desires. They are an ally in this project.
01:01:44.760 Now, I'll let you look at that project and decide if that's a good thing or a bad thing,
01:01:49.220 But there's a reason that Nick Land recognizes, and that Dugin recognizes, something important.
01:01:55.560 This is why I wanted these two guys to talk, by the way.
01:01:58.540 It's not because, like, I'm actually bought into Dugin's project or Nick Land's project.
01:02:02.960 But having read enough of both of them, I understand where they're going and what they're suggesting.
01:02:08.900 And because of that, I was somewhat uniquely positioned to understand where their intersection was and why I thought this conversation needed to happen.
01:02:18.700 because nick land, while he disagrees with dugin very much, and while dugin disagrees with nick
01:02:25.520 land on much, they both recognize it, they both see the board, they both get it, they both
01:02:31.200 understand what these systems are doing see and this is going to get complicated for a second but
01:02:38.380 stay with me ultimately nick land sees the global anglo empire as it has now been constructed
01:02:50.540 as something that is more or less gathering resources to launch anglo civilization into
01:02:58.160 space and leave the rest of the world behind when you look at capitalism when you look at
01:03:04.200 contracts when you look at time zones when you look at weights and measures when you look at
01:03:09.700 these uniform systems that we have imposed on the world through global trade,
01:03:14.500 they create a unified understanding that everyone should adhere to you need to conform to these
01:03:23.340 different things to participate in the global marketplace which means you need to have these
01:03:27.180 political systems and these timekeeping systems and these systems of weights and measures and
01:03:31.440 science and math. When the woke complain about math and science being white, this is what they're
01:03:37.640 talking about. That's what they mean. They mean it's an imposition of a way of thinking that does
01:03:44.160 not naturally belong with them. It grew out of something different, something Anglo, something
01:03:50.140 Western that they aren't comfortable with even to this day, even when they've been a part of it for
01:03:55.260 a long time. Now, again, you can think that's good or you can think that's bad. You can think that's
01:03:59.920 an overstatement, or you can think that that's silly. But a lot of high-level thinkers recognize
01:04:06.620 this as at least being true at some level. Maybe it's ultimately a good thing that it's been done.
01:04:12.080 But the truth of that imposition is still real. And more or less, the idea is that,
01:04:18.500 for Nick Land, intelligence escapes its human trappings, perfects itself in a way that
01:04:24.280 allows it to rise above its human limitations in a Nietzschean transcendent sense and then
01:04:30.240 leave the planet behind for the stars. It's hard not to see that project being reflected in
01:04:37.620 companies like SpaceX or Palantir. Is he right? Is he wrong? Is he dangerous? Is he leading you
01:04:44.760 on to something that is going to help you transcend and break the civilizational cycle that would
01:04:49.140 suck us back into another dark age? that's for you to decide. but this is where all this is going,
01:04:55.040 and this is why i'm fascinated with these two guys because ultimately this clash of worldviews
01:05:00.980 is recognizing the same process and asking the question which way should we go what's next
01:05:07.040 and palantir in its own way revealed that it's in the nick land camp here it basically laid out a
01:05:15.980 landian manifesto for where we go next. again, good or bad, i'll let you choose, but you can't pretend that
01:05:23.820 this stuff isn't real life anymore some of the most influential and important players in silicon
01:05:29.520 valley technology government weapons manufacturing they all recognize what's happening here
01:05:35.180 so i'm not some crazy guy on the internet. i'm the guy who's telling you what's going on behind
01:05:42.220 the scenes, and Palantir knows it, which is why they're writing manifestos that reflect guys
01:05:49.100 like Nick Land. Something to think about. All right. So what are our questions from the people
01:05:56.520 today? We've got Sean Wineland. He says, walled gardens with social media apps and LLMs are
01:06:03.380 taking us back to the days of Prodigy, CompuServe, AOL, GEnie, etc. No more self-hosted data.
01:06:08.920 exactly. i remember, because i went through all of this, right? like, i am an early
01:06:16.040 millennial; i was born right between gen x and the millennials, i'm
01:06:22.540 just barely a millennial, but i remember a lot of the gen x generation. so i went through this cycle
01:06:27.360 of, you know, the internet comes around, it's aol, it's this walled garden, people don't really know
01:06:32.580 how to understand it, and then there's this explosion of isps and different ways to get on
01:06:38.360 the internet and all of a sudden people aren't stuck in one singular ecosystem and they start
01:06:44.100 discovering the wider web and everything involved. And now we're kind of seeing this collapse back
01:06:49.400 into these walled gardens, back into everyone is on these specific services and that kind of
01:06:54.300 limits where they're going. Things are feeling a little less human these days, aren't they?
01:06:59.500 But isn't the whole point of progress to make things more human? That's why at TD, when we
01:07:05.020 design a product whether it's an app for making trading easier or monitoring your account for
01:07:09.660 fraud we ask one simple question how does this help people that's how we're making banking more
01:07:16.860 simple more seamless and more intuitive but most importantly that's how td is making banking more
01:07:24.160 human. wineland also says: the technological republic is clever in remarking that llms and ai are like parrots.
01:07:32.460 They repeat and remix human-generated content, but they will never think. Yes and no.
01:07:37.360 I'm less skeptical than perhaps you are.
01:07:40.500 I don't know if true general intelligence is coming, but I do think that we're
01:07:48.040 understating things when we say that, because what you've described, Sean, is how most humans
01:07:52.780 work.
01:07:54.300 Most people are LLMs.
01:07:55.820 most people just regurgitate whatever they heard from tv or from a movie or from the internet
01:08:02.940 without thinking about it very much they hear a thing they know what they're supposed to say in
01:08:08.800 response, just like an llm. so at the very least an llm is capable of some level of human existence
01:08:15.320 already because that's how many people exist so the only question is can it ascend higher
01:08:21.180 beyond that i don't know yet and i'm not going to close off the possibility
01:08:25.620 he also says soldiers who can't afford a home fighting for what yeah again a fantastic point
01:08:32.760 right? how long can you get people to do mandatory service when you ultimately
01:08:38.740 can't get them a home? this is the question, again, that rome faced. its soldiers came
01:08:44.640 home to the promise of land. they were going to live the dream of owning their
01:08:48.420 own land and having a home on it and being a noble farmer. that was considered the noble profession
01:08:53.420 of a roman, to be this yeoman farmer. and the lack of that being delivered when they came home
01:08:59.720 was a large part of what gave many different generals including caesar leverage in these
01:09:05.180 different scenarios. so, a recipe for a lot of chaos if it continues. wild speaker says
01:09:13.680 one side will win and the other will be vanquished to history the right wing can quibble about
01:09:17.940 palantir and can complain about potential dangers but either you use it or it will be defeated by
01:09:22.660 those that do again i can't necessarily argue with that logic i just know where it's going
01:09:29.260 i can see the road and it's scary but the only way out is probably through you're probably right
01:09:36.320 about that, but i'm here to decode it along the way at least. i'm not arguing that we
01:09:42.100 shouldn't go this way i'm not arguing that we can ultimately avoid it but i am just saying
01:09:47.240 this is something that we should think about. sean says, again on number 17: ed-209 is programmed
01:09:56.800 for urban pacification. i think that's from robocop, i think that's the robot from robocop.
01:10:02.580 correct me if i'm wrong, but i think that's the case. and yes, that is a perfect
01:10:08.060 distillation of what's happening there. let's see here, db543 says: when
01:10:19.540 are conservatives going to admit that the gays are way worse and more damaging to our society
01:10:25.100 than the t's are? they both rely on the same premise, and we wouldn't have the t's today
01:10:32.180 without the gays starting it. yeah, i mean, i agree with this 100%. this is the slippery slope.
01:10:36.840 i get it, you know, we have this lgb-without-the-t attempt to make it okay in the
01:10:43.280 republican party, because the republican party is just full of gay people at this point. like,
01:10:47.540 there's a little gay mafia that runs a sizable chunk of the republican party
01:10:52.580 and they don't want to give it up, and they've adopted this. and this is why you
01:10:56.440 can't get fox news hosts or other people to speak out against homosexuality anymore because half of
01:11:01.000 them are gay or they know they've got gay relatives or you know they're relying on a gay person for
01:11:05.160 patronage or something. and so yeah, this is very common. but the truth is that obviously
01:11:10.360 the disintegration of the family, the destruction of normal sexual behavior, led us to
01:11:15.760 trans. if you can't define what marriage is, then why would you be able to define what man and woman
01:11:20.500 is? these are ontological categories that are critical to the definitions of human
01:11:29.080 civilization from time immemorial, and we're playing with them and we're destroying them.
01:11:34.520 that's very clear. wild speaker says again: western techno white fascism? based. that was the
01:11:41.400 response, a lot of people responded that way. oh no, don't run me over, don't
01:11:47.060 throw me in the briar patch. you know, there was a lot of that. i get
01:11:53.260 it. Um, but I do think there is a danger here. I do think that,
01:11:59.740 um, fascism has a lot of downsides. I understand why it seems superior to other modes. If you're
01:12:08.740 asking me, do we go fascist or communist? If those are the only two options, I know which
01:12:12.420 one I'm choosing, but I do think that there is something worth stopping and saying our founders,
01:12:17.960 our tradition is republican in its nature. now maybe republics simply are inoperable anymore,
01:12:23.400 maybe, especially because of scale, they simply can't exist. i'm open to that argument,
01:12:28.240 i've made that argument but i think for me the answer is then to scale back down
01:12:34.260 but maybe that's just not an option again maybe the only way out is through
01:12:38.180 i want to explore the other options i'd like to as someone who values tradition deeply i'd rather
01:12:45.760 draw on tradition first than embrace these incredibly modern modes that could have very
01:12:51.880 serious consequences for our nature as humans. You'll notice that Dugin says post-human,
01:12:56.900 and Land agrees. There are some implications there you need to think about. Again, I understand,
01:13:03.160 I understand, but just, we might want to think about this one before we dive all the way in.
01:13:09.040 All right, guys, that said, I want to say thank you to everybody who is watching. If it's your
01:13:13.600 first time on this YouTube channel, you need to click subscribe, the bell notification, all that
01:13:17.660 stuff, so you know when we go live. if you want to get these broadcasts as podcasts,
01:13:20.920 subscribe to the Auron MacIntyre Show on your favorite podcast platform. And when you do, leave a rating
01:13:25.020 or review, it really helps with the algorithm magic. Thank you everybody for watching. And as
01:13:28.720 always, I will talk to you next time.