Based Camp - June 14, 2025


Palantir Terror: They Need More Power (Unironically)


Episode Stats

Length

17 minutes

Words per Minute

196.4

Word Count

3,451

Sentence Count

286

Misogynist Sentences

3

Hate Speech Sentences

6


Summary

In this episode, we talk about Palantir, a company that has been winning contracts from the U.S. government for a variety of AI-driven projects. We talk about why people are worried about them, and why we think they should be given even more power.


Transcript

00:00:00.000 So I looked up like what's been going on with Palantir, like the new government contracts.
00:00:04.640 And I am horrified.
00:00:06.560 I'm genuinely horrified, but it's because I cannot believe the government has not already
00:00:10.960 implemented this tech.
00:00:12.220 Like, I love when our fans are like, are you guys worried about what Palantir is doing?
00:00:17.160 And you're worried about the-
00:00:17.860 Yeah, I'm worried.
00:00:19.820 What is the new tech?
00:00:22.040 There are two things that happened recently.
00:00:24.600 In May 2024, Palantir won a $480 million contract for the Maven Smart System, which
00:00:30.840 is an AI-powered prototype for military and intelligence applications with an expected
00:00:35.340 completion date of May 2029.
00:00:37.820 And in 2025, the Pentagon increased this contract ceiling by $795 million, bringing the total
00:00:44.860 to over $1 billion, anticipating increased demand from military users for AI-driven command and
00:00:50.060 control capabilities.
00:00:51.100 I absolutely want that for a government.
00:00:52.680 Like, imagine how insecure you would feel if your government was doing nothing with
00:00:57.840 AI.
00:00:58.820 Yeah, seriously.
00:00:59.760 And then also, a $217.8 million contract was awarded to a Palantir subsidiary, Palantir
00:01:08.380 USG, by the Space Force's Space Systems Command.
00:01:11.980 Well, and also keep in mind how much more-
00:01:13.440 For joint force missions.
00:01:14.340 ...safe this is than going to, like, the NSA or something, where-
00:01:18.420 Oh, God, where they-
00:01:19.280 It's just wasteful.
00:01:21.200 Like, the NSA can try to do this.
00:01:23.700 So another thing is that, like, people are freaking out about them because a lot of these
00:01:29.360 deals-
00:01:29.800 There are other deals, too.
00:01:31.420 For example, ICE did a contract-
00:01:33.600 ICE needs all the weapons they can get.
00:01:34.940 I've seen the way people are freaking out about them.
00:01:36.720 I know.
00:01:37.100 They did a 30 million-
00:01:38.360 Palantir did a $30 million deal with ICE to provide software for monitoring visas and tracking
00:01:43.040 deportations, offering, quote, near real-time visibility into migrant movements.
00:01:48.140 I'm like, wait, you didn't have that before?
00:01:49.760 Are you kidding me?
00:01:50.840 Like, what were you doing before?
00:01:54.460 You know?
00:01:54.920 And Palantir is also in discussions with the Social Security Administration and the Internal
00:01:58.720 Revenue Service to deploy its Foundry platform.
00:02:01.040 And the Foundry platform of Palantir organizes and analyzes data, so it enables the merging
00:02:08.100 of sensitive data sets across agencies.
00:02:10.680 And why people are freaked out about the Foundry platform is that it's, one, central to many
00:02:16.160 of Palantir's government contracts, but it allows agencies to integrate data from various
00:02:20.340 sources.
00:02:20.720 So you can take financial records and immigration data and health records.
00:02:24.420 Exactly the thing that led to 9/11 was not being able to integrate information.
00:02:28.320 Yeah, no, not enough interagency communication because they're so freaking-
00:02:31.280 Because they had knowledge that this was going to happen.
00:02:33.560 Yes, but people weren't talking with each other because they're idiots and Palantir fixes
00:02:37.480 this.
00:02:37.980 And so you put this all into a central system for analysis.
00:02:41.800 And, you know, when you have this adopted by the Department of Homeland Security and Health
00:02:45.660 and Human Services, the New York Times fears this created a master database by merging
00:02:51.200 sensitive information like bank account numbers and student debt and medical
00:02:54.420 claims and disability status.
00:02:55.720 But I'm like, yes, please.
00:02:57.180 Why don't we have a central database of that?
00:02:59.300 We need that.
00:03:00.020 Yeah.
00:03:00.160 How do we not have that?
00:03:01.760 Like, it disturbs me because I guess I was, you know how I always like take the most charitable
00:03:06.740 interpretation of someone, right?
00:03:08.280 Like, I'm just, I assume that they're doing their best work.
00:03:11.320 And so I assume that like, if I apply for something with the government, that they're also
00:03:16.140 aware of all like my tax payment status, my social security status, they're aware.
00:03:21.700 No, they have no idea.
00:03:22.500 Like, there are just like 17 different Simones in the U.S.
00:03:25.220 government and none of them are integrated data wise.
00:03:27.740 They have no idea what the other Simones are doing.
00:03:30.040 You know what, one Simone could be a felon who this guy doesn't know about.
00:03:33.000 Like, I can't believe that.
00:03:34.700 And this is how you end up getting, for example, illegal immigrants voting.
00:03:38.020 This is how you.
00:03:38.580 Or social security payments going to people who are like 500 years old.
00:03:42.620 Yeah.
00:03:42.820 Like, how are we not?
00:03:44.060 Like, we need more Palantir.
00:03:45.880 Just like, you know what?
00:03:46.880 Actually, just end the U.S. government.
00:03:49.260 Make it all.
00:03:50.140 Make it 100% Palantir.
00:03:52.540 Yeah.
00:03:52.960 Like, I'm done.
00:03:54.800 Like, I'm not going to.
00:03:55.600 Of the contractors, it's the only one I trust.
00:03:58.440 No.
00:03:58.920 Like, they're competently run.
00:04:00.680 Alex Karp is like extremely dedicated to the company, to the cause.
00:04:05.220 I think he's created a very enthusiastic community.
00:04:06.780 Can we, like, shut down, like, Boeing, for example?
00:04:09.780 Like, that needs to end, okay?
00:04:11.840 I know, I know.
00:04:12.400 Like, of all the organizations, I'm, like, super cool with having a little too much power.
00:04:18.400 Like, guys.
00:04:19.720 I mean, a point that J.D. Vance had made in an interview with a comedian recently who was like,
00:04:23.780 well, I'm freaked out about all this, is J.D. Vance is like, listen.
00:04:26.560 Like, every major corporation has levels of data information on you, on your purchase history,
00:04:32.240 on your thoughts, on what you're saying in your own home, that put all of this to shame.
00:04:37.360 Like, if you're worried about Big Brother listening to you, then maybe you shouldn't,
00:04:40.660 like, have all these smart devices in your home.
00:04:42.600 Maybe you shouldn't have a smartphone sitting live next to you all the time, not sitting
00:04:45.300 in a Faraday cage.
00:04:46.000 Like, it's already happened.
00:04:47.800 But this isn't even in the hands of the U.S. government, which ultimately wants the best.
00:04:51.560 You know, China's getting all this information, I assure you.
00:04:54.380 Yeah, no, China.
00:04:55.280 Yeah, you've got TikTok on your phone.
00:04:56.400 China, CCP's listening.
00:04:57.920 They know what's going on in your life.
00:04:59.320 They know that you're menstruating right now.
00:05:01.020 They know that you're a little bit irritable because of your seasonal allergies, but the
00:05:04.020 U.S. government can't even figure out if you're dead or not.
00:05:06.480 Your personal menstruation notifications.
00:05:09.120 That is, that is what's going on right now.
00:05:11.800 And I'd be really excited if I heard Palantir getting, like, a big drone contract or something
00:05:15.880 because we need to get way better.
00:05:16.860 No, the more the merrier.
00:05:18.060 Like, genuinely, I'm reading this and getting increasingly concerned of, like, wait, they
00:05:22.380 weren't doing this already?
00:05:24.220 Like, mods have been asleep this whole time?
00:05:26.840 Like, I really do feel like a lot of-
00:05:28.220 Why wasn't I out there murdering people?
00:05:30.120 Honestly, that's the, you know-
00:05:34.300 Do you want me to try to split this out and make this a micro episode to see how it does?
00:05:38.700 You can, you can just throw it out as a, like, a weekend bonus or something.
00:05:42.680 Yeah, that's what I was thinking.
00:05:43.940 Yeah, I just, I think people have a completely, what's the word, dysmorphic understanding of
00:05:51.440 data security and, and privacy.
00:05:54.200 And they're like, oh, yeah, like, oh, my Alexa heard everything, like,
00:05:58.320 ChatGPT heard everything.
00:05:59.440 There's this new meme where you, you ask ChatGPT to, like, make a picture of who they think
00:06:05.900 you look like based on all of your queries.
00:06:07.440 And, like, everyone's really entertained by the cartoons created because they're, like,
00:06:11.000 so spot on.
00:06:11.800 Like, you have no idea the level of granular understanding that Google has about you, that
00:06:17.320 Amazon has about you, that DoorDash has about you, that your local grocery store has about
00:06:22.520 you.
00:06:22.700 If you have a loyalty number, or even if you're using the same credit card number, like, this
00:06:27.120 is just, like, so ridiculous.
00:06:29.200 We need governments.
00:06:30.260 Like, I'm genuinely concerned if the U.S. government is not integrating AI and drones
00:06:36.720 into absolutely everything that it's doing.
00:06:38.780 And as, as Dan Driscoll has pointed out, the Secretary of the Army, and as J.D. Vance
00:06:42.940 have pointed out, like, warfare.
00:06:43.580 So, so, clarification, Dan Driscoll is a personal friend of ours and has been for a long time,
00:06:47.480 and he was appointed Secretary of the Army.
00:06:48.800 Continue.
00:06:50.180 Warfare is fundamentally changing.
00:06:52.200 Yeah.
00:06:52.420 Alexander Kruel, who has a great, almost daily newsletter, like, shared this meme at one
00:06:56.760 point.
00:06:57.000 Well, no, he has a half-great, half-insane pro-Ukraine propaganda.
00:07:02.500 Well, yeah, he has this section for Ukraine.
00:07:04.600 But, I mean, he was excited about this with Ukraine, because, you know, this moment where
00:07:07.160 recently Ukraine drove in just a bunch of normal tractor-trailer trucks full
00:07:14.060 of drones no one realized, and they took out a bunch of military assets in Russia.
00:07:18.900 And he, he put this, like, Alexander Kruel either created or used a meme that showed, like,
00:07:23.000 a U.S. aircraft carrier, like, a U.K. aircraft carrier, like, all these aircraft carriers.
00:07:27.000 It's, like, so incredibly expensive.
00:07:28.580 And then, like, a truck, a tractor-trailer.
00:07:32.120 Like, now this fundamentally changes warfare.
00:07:34.240 We have to use AI.
00:07:35.420 We have to be able to integrate data on, like, everything from, like, banking data to your
00:07:41.240 health data to everything else to, like...
00:07:43.500 So many people abuse the government right now because government workers don't care,
00:07:47.740 fundamentally.
00:07:48.400 Yeah.
00:07:48.520 And we have an episode that I was going to do on that.
00:07:50.060 Well, it's not just not caring.
00:07:51.740 It's, it's that...
00:07:52.680 But AI does care.
00:07:54.400 Like, AI will actually, like, attempt a task.
00:07:57.040 Like, if at the DMV, I knew I was going to be dealing with an AI clerk versus a human
00:08:01.160 clerk, I'd be so much happier.
00:08:03.240 Mm-hmm.
00:08:03.840 Well, and, you know, we've, we've visited Palantir's offices.
00:08:06.540 We even created fan art for Alex Karp one, one time.
00:08:09.060 Remember when we made the full body poster?
00:08:11.700 Every encounter I've had with Palantir and its staff has been incredibly positive.
00:08:18.580 Professional, competent.
00:08:19.680 Yeah, well, but also passionate.
00:08:21.480 Like, they're, they actually, and there are lots of people in, in every government, you
00:08:26.240 know, who are very passionate about, you know, protecting their country and doing a
00:08:29.060 good job.
00:08:29.640 But the incentives at Palantir, I think, are better aligned to actually create good
00:08:33.460 outcomes.
00:08:33.780 They are really excited when they make it possible for someone in, in the U.S.
00:08:38.920 military, for example, to, to end a battle before it begins.
00:08:41.700 You know, to, to extremely reduce the amount of collateral damage.
00:08:46.320 Palantir has come under a lot of fire recently because of their, their, their use in Gaza,
00:08:51.980 for example.
00:08:53.040 And they're like clearly okay with helping Israel.
00:08:57.580 So like, that's going to get them in trouble.
00:08:59.340 Why shouldn't they be?
00:09:00.040 I, I, I will never understand why.
00:09:02.840 If the, if the hostages haven't been released, Israel has carte blanche to do whatever it wants.
00:09:07.880 Yeah.
00:09:08.400 It's a, well, yeah.
00:09:09.780 And as long as.
00:09:10.440 And if, if, if Greta Thunberg wants to go out there and, and complain, why isn't she
00:09:15.460 asking for the release of the hostages?
00:09:17.240 You know, that is like, you want me to like, let my daughter be continually graped by, you
00:09:24.340 know, these people and not do anything about it.
00:09:26.920 Just be like, oh, I guess I'm giving up on her.
00:09:29.220 You know, like we've seen what these people do to their hostages.
00:09:32.120 And I'd point out that we have seen what these people do to their hostages.
00:09:35.980 People like Greta Thunberg and the other people on her selfie boat have not seen what
00:09:41.320 they do to their hostages because when the Israeli forces offered to show her video of
00:09:45.980 what was happening, she and the rest of the people on the boat absolutely refused to watch
00:09:50.820 any of it.
00:09:51.420 So this just shows that from my perspective, it's not like she has seen what's happening
00:09:55.460 on one side and what's happening on the other side.
00:09:57.720 She's just like, I don't care what's happening to Jews.
00:09:59.800 I just don't care.
00:10:00.980 I don't care that there are still people down in tunnels that this is happening to, that
00:10:05.200 these are your sons and daughters, and that that is why you are continuing to fight.
00:10:09.520 I don't care.
00:10:10.240 They get to do whatever they want.
00:10:11.560 Like, this isn't like they're in some nice room or something.
00:10:14.520 They're being starved and graped to death.
00:10:17.520 Like, this is mortifying.
00:10:20.220 And the fact that you even would consider that you wouldn't do literally everything that was in
00:10:26.880 your power to get the hostages back, just to me, it shows, like when you consider that
00:10:31.280 those hostages could be family members, shows a lack of humanity in an individual.
00:10:35.320 They have a right to do whatever they feel they need to do until the hostages are released.
00:10:42.200 And if you're like, well, but what about all the other people who are being hurt?
00:10:45.720 And all of those other people being hurt should be doing everything in their power to get the
00:10:49.160 hostages to be released.
00:10:50.400 Because at the end of the day, what you're asking me to do is to tell somebody who knows
00:10:54.060 that their daughter is being treated as an S slave and could be for the rest of her life
00:11:00.600 if you don't get her released to not do anything in their power to get her back.
00:11:04.980 And I can't ever ask that of somebody because I couldn't ask that of myself.
00:11:09.680 If somebody came up to me and they're like, Malcolm, you've already murdered, you know,
00:11:13.360 50,000 people just trying to get your one daughter back from the grape caves.
00:11:18.220 You should stop now.
00:11:19.500 You know, it's just the one girl.
00:11:20.620 I'd be like, I'm sorry.
00:11:22.060 That's not how this works.
00:11:23.560 It's my daughter.
00:11:24.920 And if you said, well, yes, but some of their kids are hurt as collateral damage, I'd say,
00:11:28.980 I'm not going to stop until my daughter is out of the grape cave.
00:11:33.280 That's it.
00:11:34.420 Period.
00:11:34.980 And so I wouldn't ask that of anyone else.
00:11:37.300 Okay.
00:11:37.620 But bringing it back to Palantir, basically every argument that people are butthurting
00:11:42.160 about, I, I, it just really doesn't concern me at all.
00:11:46.320 Like I'm not worried about it.
00:11:47.400 I mean, one, the executive order on data sharing.
00:11:49.720 In March 2025, President Trump signed an executive order mandating that federal agencies
00:11:54.700 share data.
00:11:55.380 I didn't realize Palantir was involved with this.
00:11:57.420 I was just stoked about it when I heard about it.
00:11:59.060 But of course, Palantir is involved because Palantir is the tech substrate that enables the data
00:12:03.440 sharing to happen efficiently.
00:12:04.900 But anyway, this, this is what makes people fear that there's like a master list of personal
00:12:09.320 information with untold surveillance power, which we should have.
00:12:13.740 And the, the Foundry platform of Palantir is obviously what's going to make this happen.
00:12:17.540 So one, they're, they're afraid of that.
00:12:19.320 The data sharing.
00:12:20.500 Oh, I'm sorry.
00:12:21.160 Our government's functional now.
00:12:22.880 How dare we?
00:12:24.360 Okay.
00:12:24.500 So that's, that's argument one.
00:12:25.920 I don't care.
00:12:26.820 I I'm, I'm alarmed that we're not doing it already.
00:12:29.760 Issue two is immigration enforcement.
00:12:32.060 How dare we enforce our immigration policy?
00:12:34.940 Again, Palantir's work with ICE, particularly for deportation tracking, has drawn a lot of
00:12:39.960 criticism, even from former employees and activists who argue that it could lead to mass
00:12:43.940 surveillance and human rights violations.
00:12:45.480 And there are 13, 13, okay.
00:12:48.300 13 former Palantir workers who condemned the company's role.
00:12:51.880 Also, these are former workers.
00:12:53.400 So they would just be like fired for not doing their jobs.
00:12:56.360 People who were fired for incompetence.
00:12:57.840 That, that are, that are concerned about the company's role in supporting Trump's immigration
00:13:01.640 policies, citing violations of its founding principles, which are probably something like.
00:13:06.840 I'm clutching them so tightly.
00:13:08.160 I know.
00:13:08.500 So like one, no, like the U.S. has a right to enforce its laws.
00:13:12.320 Like don't, don't, don't come at me.
00:13:15.140 You know what I want to see?
00:13:16.140 I want to see the government shut down the NSA and replace it with Palantir.
00:13:19.960 The NSA has no business existing anymore.
00:13:22.580 Yeah.
00:13:23.060 I think, yeah.
00:13:24.980 Palantir could do it.
00:13:25.880 I mean, so, but keep in mind, Malcolm, that with the big data centers, like a lot of what
00:13:30.620 the NSA just does is raw data collection, which is useful.
00:13:33.980 Palantir uses that data.
00:13:35.760 So I'm, I'm like, I say, keep, you know, keep.
00:13:38.980 No, the NSA employees have repeatedly been shown to violate the interest of the American
00:13:43.800 citizen.
00:13:44.640 Yeah.
00:13:45.060 They, they, they're like.
00:13:46.080 Oh, so you're just saying have Palantir do what the NSA does now.
00:13:48.720 Yeah.
00:13:48.940 Take over all those data centers.
00:13:50.400 Oh, okay.
00:13:51.560 Yeah.
00:13:52.520 They, they, they have people doing like searches on like former girlfriends and stuff like that.
00:13:56.920 They've been caught with this.
00:13:58.220 They have, you know, as I've said recently, all of this internal trans propaganda and,
00:14:03.040 like, a trans cult there that's been enforcing its message on everyone.
00:14:05.780 And I've heard no story like this from Palantir employees.
00:14:09.680 No, definitely not.
00:14:10.940 No, it's all been, you know, the height of professionalism and, and motivated altruism.
00:14:17.840 So then the fourth argument is the classic one of just like the, the ethical and the
00:14:23.040 privacy fears, which, you know, you know how I feel about that.
00:14:25.780 How much is Palantir paying you, Simone, for this, by the way?
00:14:28.420 Not enough.
00:14:29.160 How much are we getting from Mossad?
00:14:31.000 You're just not enough.
00:14:32.180 I actually, I think I did find a Palantir apologist on YouTube and I'm like, are you being paid
00:14:36.960 by them?
00:14:37.620 Like, what's up with you?
00:14:39.100 But he just actually seemed, I think he was more of a stonk enthusiast and we can get to
00:14:42.740 that too.
00:14:43.740 Oh yeah.
00:14:44.020 The Palantir stock thing.
00:14:45.640 Well, yeah.
00:14:46.340 Palantir stock has surged over 70% in 2025, driven by these contracts and the Trump administration's
00:14:52.220 priorities.
00:14:52.580 So it's, it's the S&P 500's second-best performer.
00:14:56.060 Yeah.
00:14:56.260 I think it's going to have some stans because of that.
00:14:58.580 And I don't think, yeah, I mean, yeah, kind of, they are being paid because, you know,
00:15:02.360 they, their stocks have gone way up if they were earlier holders.
00:15:05.540 Yeah.
00:15:05.860 Yeah.
00:15:06.420 It's, but it's, I mean, on that front, its valuation is too high.
00:15:10.880 It's 200 times forward earnings at its current valuation.
00:15:14.780 It's, it's not, that's not a sustainable price.
00:15:17.320 Like I think it's, it's way overinflated because of all this.
00:15:19.800 And like, to your point, right.
00:15:21.080 With stocks is once something becomes pervasively discussed and people think it's a good asset,
00:15:25.620 like it should be dead to you.
00:15:27.340 You should have sold it already.
00:15:28.660 You know, like that is not when it's interesting.
00:15:31.060 You want things that don't look interesting to anyone when you know they're undervalued
00:15:33.840 anyway, though.
00:15:34.780 So back to the ethical and privacy risks.
00:15:36.700 So obviously there are people who warn that Palantir's technology could be used to build
00:15:40.040 a police state infrastructure, especially given that it can process biometric data, social
00:15:44.840 media activity, other personal information, and just combine it all together to create these
00:15:48.600 amazing footprints and profiles.
00:15:50.280 But to J.D. Vance's argument, like I said earlier, this is already happening with companies.
00:15:55.620 Yeah, I'm sorry.
00:15:56.860 Do you want the police state to be run by China and private corporations?
00:15:59.720 Or do you want like any degree of accountability within it, right?
00:16:03.660 Yeah.
00:16:03.800 People argue that they're, they're worried about the lack of transparency potential for
00:16:07.380 algorithmic bias or data breaches.
00:16:10.280 Palantir versus the NSA and China.
00:16:12.800 Yeah.
00:16:13.020 I'm like, I'm sorry.
00:16:13.820 Like you're, you're probably the same people who freaked out when TikTok was going to be
00:16:17.940 shut down.
00:16:18.460 Like I didn't see you complaining then.
00:16:20.040 This is kind of like people freaking out when we suggest a medal of motherhood, but like
00:16:24.620 they also encourage most of the socialist policies that were, you know, popular in post-Weimar
00:16:31.360 Republic Germany.
00:16:32.720 So whatever.
00:16:33.500 Right.
00:16:33.700 And then of course the final argument is that like, oh, well, they're like, they're too
00:16:37.160 close.
00:16:37.620 Like the political, like Trump isn't, well, what Trump derangement syndrome, like Trump
00:16:41.240 is involved.
00:16:41.860 So it's gotta be bad.
00:16:42.880 And also Peter Thiel is involved.
00:16:44.820 So it's gotta be bad.
00:16:46.160 And, and there are some former Palantir people who are working with DOGE and they just, oh, this
00:16:50.440 is so incestuous, but like, oh, I guess it's just because maybe there aren't that many
00:16:54.020 competent people in the world.
00:16:55.740 Yeah.
00:16:56.100 Oh, competent people work in important jobs.
00:16:58.420 I'm so shocked by this.
00:17:00.880 It's a collusion, collusion.
00:17:03.840 So yeah, I just like, it's, it's insane to me that people are having these freak outs
00:17:09.580 and people have created these long YouTube videos about how Alex Karp is a demon.
00:17:15.320 And I'm just listening to them and I'm like, I'm so concerned about Palantir's lack of power
00:17:20.640 right now.
00:17:23.080 Money, please.
00:17:26.400 Oh no, no, there's no money.
00:17:28.420 Money, please.
00:17:32.220 Money, please.
00:17:33.340 I'm a money, please.