Based Camp - December 06, 2023


Prepping for Collapse vs Building the Future


Episode Stats

Length

30 minutes

Words per Minute

176.6

Word Count

5,370

Sentence Count

405

Misogynist Sentences

1

Hate Speech Sentences

24


Summary

In this episode, we discuss the difference between dark ageism and apocalypticism, why there's such a huge divide between the two, and how to deal with each. We also talk about AI and AI apocalypticism.


Transcript

00:00:00.000 By the way, did you know that the phrase toxic masculinity came out of the mythopoetic men's movement?
00:00:05.100 It was actually a description of the type of bad behavior that comes out of a society that suppresses masculinity.
00:00:14.520 Interesting.
00:00:15.420 Yes, but we're here to talk about something else.
00:00:17.640 But I am very excited to be here with you today.
00:00:19.200 Yeah.
00:00:20.200 So what I wanted to talk about, because this is something I was thinking about, where we often point out that humanity is heading into a dark age,
00:00:28.220 but we also often really complain about apocalypticism in the Judeo-Christian canon, right?
00:00:35.360 So if you look historically, in the Judeo-Christian tradition there have repeatedly been trends towards apocalyptic approaches to the world.
00:00:45.200 Yeah.
00:00:45.960 Which is to say, you can look at the Millerist movement early in U.S. history.
00:00:49.020 There was this movement in the '70s based on some number code in the Bible.
00:00:53.060 There was Y2K.
00:00:54.580 There was...
00:00:55.100 And this number code in the Bible.
00:00:56.000 Remember the Mayan calendar one?
00:00:57.500 Oh, the other Mayan calendar one.
00:00:59.660 We just, as the Judeo-Christian culture is incredibly...
00:01:04.460 And it doesn't seem to happen with the Islamic culture so much, specifically, which is part of the Judeo-Christian tradition.
00:01:09.460 So what I really mean is Jews and Christians seem really, really, really, really susceptible to apocalyptic...
00:01:15.020 Love us some end times.
00:01:16.880 Yeah.
00:01:17.480 Yeah.
00:01:18.380 And these memetic sets, so somebody was like, well, aren't your views apocalyptic?
00:01:22.340 Because you say we are headed towards the dark age.
00:01:24.280 And I actually pointed out something that I don't think a lot of people realize, which is that dark ageism, the belief that we are about to head towards a significant and dramatic decline in culture, is actually fairly rare historically in the Western canon.
00:01:40.480 There are people who have said things are worse today than they were in the past, but that's very different from dark ageism, which warns that things are about to take a dramatic decline, but one that you have power over and can affect.
00:01:57.460 Right, because instead the view is that there's going to be a dramatic end, just an end.
00:02:04.000 It's end times.
00:02:05.180 It's not dark times.
00:02:08.200 Yeah.
00:02:09.240 And so why is this?
00:02:10.520 Because I think this is very interesting.
00:02:12.400 Why there's this huge split here.
00:02:14.860 And I think it's because of, well, two things.
00:02:16.560 One, the memetic viability of each of these ideas.
00:02:20.100 And two, what they imply for the individual, right?
00:02:24.500 So the biggest, if I was going to, like, sum it all up in one little pithy quote, is that apocalypticism removes responsibility from the individual.
00:02:36.380 Yeah, yeah.
00:02:37.800 Whereas dark ageism increases the responsibility on the individual.
00:02:42.320 Oh, that's where you're going with this.
00:02:44.740 Okay, nice.
00:02:46.320 Well, no, it's true, right?
00:02:48.060 If you believe that society is about to head in a dramatically downwards direction.
00:02:53.240 Yeah.
00:02:53.460 You don't need to save money.
00:02:54.840 You don't need to build anything.
00:02:56.000 You don't need to invest in the future.
00:02:57.880 You know, you can invest all in the now.
00:03:00.260 Well, this is, no, sorry.
00:03:01.300 That's if you believe in apocalypticism.
00:03:02.660 That's if you believe in apocalypticism.
00:03:04.040 Yeah, sorry.
00:03:04.960 Yeah.
00:03:05.260 So if you believe in apocalypticism, you don't have to do shit.
00:03:07.740 Like, you can do whatever you want, right?
00:03:09.520 Like, because the world is either going to be destroyed, or the only thing you need to invest in, if you're an apocalypticist, is spreading the apocalyptic message.
00:03:17.400 Yeah, getting attention.
00:03:18.360 Oh, that's so hard.
00:03:20.140 Well, it's not just getting attention.
00:03:21.580 Like, obviously, that appeals to the individual, but it also is memetically useful.
00:03:26.940 A memetic set that is spreading via apocalyptic messaging is going out there and telling people, okay, just believe in the message.
00:03:36.340 That's all you need to do to prevent it, is believe in the message.
00:03:38.720 And the number one place you see this today is with AI apocalypticism.
00:03:41.360 And if you want to see our videos on how unlikely AI apocalypticism is, you can look at our reverse grabby alien theorem video, which to me is the most compelling argument I've ever seen on the point, which is basically to say, if it was this easy to create a paperclip-maximizing AI, we would see them out there in space everywhere.
00:04:03.640 And if the reason we don't see them is because of the anthropic principle, i.e., we would only not see them on a planet that hadn't yet been destroyed by them, well, then we're about to see them, so it's irrelevant whether we're working on them, right?
00:04:14.440 And there's a bunch of other answers, but watch the video if you're interested in that.
00:04:17.260 But anyway, the point here is that if I, for example, think that AI apocalypticism is accurate, right, then I don't need to invest in the future, I don't really need to do anything other than general hedonism, and I can spend all of the money I raise at the organization, and all of my time as an individual, attempting to convert people to this movement.
00:04:42.940 Mm-hmm.
00:05:12.940 Yeah, I wonder, is there like a Catholic versus Protestant?
00:05:27.480 No, because there's tons of apocalyptic Protestant groups.
00:05:30.320 I'm just trying to think of this American tendency specifically, and I haven't really seen this in other cultures, though perhaps that's just due to my cultural ignorance.
00:05:38.080 And they get like really excited about survivalism.
00:05:41.440 I mean, in the United States, there are entire industries around, you know, building up years' worth of food supplies, with the MREs that you have in your bunker and all your guns and your bullets.
00:05:53.840 And there is an industry that is definitely built around the enjoyment of preparing for a dark age and being ready to go through it.
00:06:04.220 What, what culture drives that?
00:06:07.320 Because I can't say, oh, like that's clearly Protestant or that's clearly Catholic or anything, right?
00:06:13.200 Where is, where is that coming from?
00:06:14.980 I mean, why, what is it that makes someone a dark ageist rather than an apocalypticist?
00:06:19.840 Well, so, through that I think you can see how appealing apocalyptic mindsets are, and I think the groups that are most susceptible to them are Jewish groups and Protestant Christian groups.
00:06:33.320 And they lead to different actions within these two communities; within the ultra-individualistic and rural Protestant Christian groups, they lead to this bunker building, right?
00:06:44.560 But I don't think that there's any realistic vision for a societal collapse in which this bunker building is really a high utility action.
00:06:53.740 Not one that we're going to survive.
00:06:55.720 I mean, there's some like nuclear apocalypses and stuff like that.
00:06:59.120 This is why I say it's, it seems to me like it's purely recreational because as you say, that's not really how it's going to play out.
00:07:04.540 Right.
00:07:04.780 But it is, it is seductive as an ideological set.
00:07:12.100 Now, this is a big problem.
00:07:13.620 If you're from a cultural group and you know this ideological set is severely seductive to your cultural group, you need to sort of offset all ideas that are associated with it.
00:07:23.060 I mean, you and I may indulge in prepperism, which we definitely do to an extent.
00:07:28.800 But I understand that it is largely recreational and aesthetic.
00:07:36.620 There's some useful bits.
00:07:38.320 No, actually, what am I thinking?
00:07:39.540 I have Faraday bags full of electronics because I'm so convinced there will be a solar flare.
00:07:45.100 No, come on, that's useful.
00:07:47.800 It depends.
00:07:48.740 It depends on if there's a solar flare, but right.
00:07:52.880 But that level of prepperism, like I get it.
00:07:55.440 Right.
00:07:56.000 But I think that when you're trying to prepare for actual likely futures for our species, it can really over-index you towards unlikely futures, which is really interesting.
00:08:08.060 So pure apocalypticism, right?
00:08:10.920 Pure apocalypticism removes individual responsibility.
00:08:15.140 Yes.
00:08:16.220 Whereas prepperism is a seductive thing, which is different than dark ageism or pure apocalypticism in that it rewards radical self-ownership meaningfully in a way that society just doesn't.
00:08:31.200 Okay, so you're really trying to separate out prepperism and you're trying to say it's not dark ageism.
00:08:37.120 I would probably say it's just poorly educated dark ageism.
00:08:42.000 I don't think that.
00:08:42.660 No, I don't think it is.
00:08:43.780 Really?
00:08:44.780 No, so it's not dark ageism.
00:08:46.260 It's not apocalypticism.
00:08:47.480 Apocalypticism is about removing responsibility from the individual and being able to spend all your time on proselytization.
00:08:53.480 Sure, sure.
00:08:53.820 Prepperism is about a world, a fantasy of a world in which your individual actions can matter in and of themselves in regards to family preparation or family, what's the word I'm looking for?
00:09:11.160 Fortification.
00:09:11.720 So in a prepperist fantasy, right, the things that I do for my family, the trees I plant to grow food, the chickens, everything like that.
00:09:22.860 Like these things do matter to some extent, right?
00:09:26.040 But they do not actually protect my family in a meaningful context.
00:09:30.440 I think there is a fantasy, especially among men, that these things will matter in a meaningful context.
00:09:38.100 And that's what prepperism is.
00:09:39.560 It is a fantasy of a world in which individual actions matter, in which this fucked-up society we live in doesn't matter, because you as an individual, trying to do what's best for your family while ignoring any attempt to change society or to create any sort of larger community, are doing a thing of genuine value.
00:09:56.720 Okay, so in other words, it's cope when you feel disempowered by society and you still want to feel empowered.
00:10:03.420 So you're basically like, oh, don't worry.
00:10:05.180 It doesn't matter that I'm not empowered in society because society is going to fall apart.
00:10:09.200 And then I'll be empowered.
00:10:10.500 But what you're saying, implying, then, about dark ageism is that it describes someone who is preparing for a worse future, but one in which they are shaping society going forward.
00:10:22.500 Is that correct?
00:10:23.480 Dark ageism is, well, I mean, it's about trying to save all of society.
00:10:29.740 Yes.
00:10:30.240 The difference is a dark ageist isn't just trying to save their own family.
00:10:33.900 They're not just creating their own bunker.
00:10:35.300 They're saying, okay, society is not moving in a good direction.
00:10:39.120 I'm going to create an alternative economy.
00:10:41.840 I'm going to create an alliance of families.
00:10:44.040 I'm going to create a city-state.
00:10:46.080 I'm going to create a new type of company or government.
00:10:49.560 And that is very different from saying, here's my bunker or here's my complex.
00:10:54.160 Is that right?
00:10:55.060 Yes.
00:10:55.860 Hmm.
00:10:56.660 I don't really know, aside from you, who's doing that?
00:10:59.940 Well, I think a lot of people are.
00:11:01.080 I think a lot of people are.
00:11:01.760 Elon Musk is doing that because he's like, we're going to bring people to Mars, blah, blah, blah.
00:11:04.740 He is actually trying to build an alternative future society, or at least a different kind of future society.
00:11:10.780 I'd also say that a lot of preppers do this as well.
00:11:13.060 The people who are called preppers.
00:11:14.340 There's two categories of preppers.
00:11:15.560 People who are only interested in their own family and thinking about that.
00:11:20.720 And people who are looking at a larger societal level while understanding that if they don't have a community, it is likely irrelevant for most realistic prepper scenarios.
00:11:31.260 And these are two very different things.
00:11:33.520 Can you give me examples, aside from you and Elon Musk, of people who are on the dark ageist side and not just preppers, so they're actually trying to build something for society?
00:11:45.840 Well, I mean, there aren't a lot of famous people who would fall into this category.
00:11:49.460 Can you give me maybe even a hypothetical example of a not famous person, but what they would be doing?
00:11:54.580 We have friends who are doing this.
00:11:56.740 They are specifically moving to networks of like-minded families that intend to share responsibilities, that intend to build systems like this among each other.
00:12:06.280 If you are doing anything like this outside of a network of like-minded families, it is like a personal indulgence, like a fancy car or something like that.
00:12:16.200 All right. So I'm going to say a lot of Orthodox Jewish groups and a lot of trad-cath groups are like this.
00:12:22.360 But I don't think that a lot of our friends are like this.
00:12:25.780 I think that they're more along the bunker end of the spectrum because they're really just thinking about it for the context of their families.
00:12:32.340 They're not building any larger culture that's scalable.
00:12:36.840 They're not creating any infrastructure governing-wise or economy-wise that would bring them forward.
00:12:42.500 Whereas I can see with various Orthodox Jewish groups and like trad-caths in general that there is something that would start to pick up and build an influence in a future world.
00:12:54.000 Does that make sense to you?
00:12:55.120 Yeah.
00:12:55.680 Okay.
00:12:56.960 Yeah.
00:12:57.260 That's interesting though.
00:12:58.460 Something we hadn't come into this conversation with, which is really helping me see things differently, is the difference between individualistic prepperism and a sort of noblesse oblige take on collapse, a "now I must rebuild" attitude, or even an excitement about rebuilding.
00:13:15.920 Well, no, I'd say it's more than that.
00:13:18.820 It's an individualistic prepperism versus community-oriented prepperism.
00:13:23.300 Are you prepping for your house or are you prepping for your church or synagogue, right?
00:13:30.940 These are two very different things to be prepping around, and they require very different types of prepperism, one of which is actually of utility if you want your family to survive intergenerationally rather than just you yourself barely clinging to life.
00:13:45.900 Yeah.
00:13:47.260 Yeah, that makes sense.
00:13:48.580 And I was thinking about this.
00:13:49.500 In and of itself, right?
00:13:50.920 Like, if you're just clinging to life a bit longer in a world that is collapsing, like, you have achieved nothing of meaning.
00:13:57.100 Even if you have kids, right?
00:13:59.200 You have achieved a little bit more of meaning then.
00:14:01.340 But if your kids don't have people to marry, if they don't have a larger community, if they don't have a larger seed of a socioeconomic structure that can come out of the collapse, then you haven't done that much.
00:14:12.440 Especially given all of the things that you could be optimizing for, given the privileges every human has access to today in this last age of abundance.
00:14:22.000 And the opportunities at play, like the fact that you really, really, really could matter to a large number of future generations.
00:14:29.980 So that's interesting.
00:14:32.340 Now, where do you put Curtis Yarvin on this spectrum?
00:14:35.520 Because now, given what you've said, I feel like he's maybe more – at first, I thought he was just on the –
00:14:40.180 He's a dark ageist.
00:14:41.280 He is a dark ageist.
00:14:42.140 And he is, like, not an individualistic dark ageist at all.
00:14:45.440 No, not at all.
00:14:46.400 I disagree with his thesis on how to fix society, but he is definitely taking the harder route of the potential routes.
00:14:53.620 And he also is a real intellectual.
00:14:55.640 If you look at his work, as I've often said, if you look at, like, Eliezer Yudkowsky's work, or you –
00:15:00.640 Oh, yeah.
00:15:01.040 That – there's the – yeah.
00:15:02.840 So Curtis Yarvin.
00:15:03.800 Yeah, he [Yudkowsky] is actually kind of an idiot.
00:15:05.800 Like, he's just, like, genuinely not, like, an intellectual powerhouse.
00:15:09.240 Well, and a deep, deep, deep apocalypticist.
00:15:11.640 Right.
00:15:12.140 But he needs that to justify his lifestyle and decisions.
00:15:15.920 Whereas with Curtis Yarvin, I don't agree with everything he says, but he's very clearly an intellectual powerhouse.
00:15:21.880 If you look at his stuff, he clearly has really thought through things and understands things.
00:15:26.820 I just culturally have disagreements with him.
00:15:30.440 Yeah.
00:15:31.280 But, I mean, more in the sense that, like, your faction will eventually be competing with his faction and other factions.
00:15:38.860 I don't even think so.
00:15:39.720 I think his faction would easily fold into our faction.
00:15:42.140 So his preference for monarchism isn't really that different from our faction's preference for controlled and cyclable dictatorships.
00:15:51.620 I think governance structures that consolidate power are usually the best governing structures.
00:16:05.760 But that power needs to be expellable the moment it becomes corrupt or inefficient, which is what all of the governing hypotheses that we work on are intended to do.
00:16:16.680 I think that's what the U.S. government was originally intended to do, although it's also intended to split power a bit more.
00:16:22.700 So that's a bit of an inaccurate statement.
00:16:24.480 But, yeah, I think that the best governing structures do have responsibility lie on a single individual or a single small group of individuals.
00:16:33.780 But that individual or group needs to be cyclable out.
00:16:36.580 Whereas the core difference between us and Yarvin is he doesn't believe that.
00:16:39.420 He thinks that that individual should be chosen based on their proven competence.
00:16:42.920 And that if they're capable of being cycled out, that would cause negative effects on the society, like they would be cycled out for the wrong reasons.
00:16:52.720 And he's not insane for thinking this.
00:16:54.460 I mean, if we look historically, like the two, I think, greatest figures in demographic history were both betrayed by their own countries.
00:17:03.120 Demographic history?
00:17:04.040 What do you mean by that?
00:17:05.280 Democratic, I said.
00:17:06.420 Oh, democratic.
00:17:07.100 I might have said demographic.
00:17:09.200 I don't know.
00:17:09.800 A democratic history.
00:17:10.640 So specifically Winston Churchill and Themistocles.
00:17:13.520 Both were betrayed after saving their democracies because democracies are prone to do things like that.
00:17:20.660 When a single individual is so obviously right and so obviously has an understanding of how the democracy actually functions and how to make the world a better place.
00:17:31.320 Well, they are an intense threat to the powers that be within that society.
00:17:35.880 And so that society, from its media to its other power players to its other elite, has every single motivation conceivable to try to get rid of that individual and to try to move them out of society.
00:17:49.400 And in the case of Themistocles, he was exiled.
00:17:52.820 And he actually ended up – it's really funny.
00:17:54.960 A lot of people don't know this, like the story of what happened to Themistocles afterwards.
00:17:57.400 So not only did he save all of Greece from the Persians in the Greco-Persian Wars very easily, like he tricked them.
00:18:05.840 For people who don't know his story, there's this amazing moment where he essentially tricked the Persians into surrounding a collection of Greek fleets because all of the Greek city-states hated each other and some were planning to basically go back home.
00:18:20.720 And so he needed them to be surrounded so that they couldn't retreat, so that they could all fight together.
00:18:27.580 Like the level of cunning that's required to do that.
00:18:30.540 But then he got expelled from Athens afterwards because they're like, oh, the average citizen likes this guy too much.
00:18:37.100 They say, oh, he's a populist.
00:18:39.420 Anyway, he then went to this region of Persia that actually had holidays and statues dedicated to him hundreds of years after his death because he did such a good job as a local governor.
00:18:50.720 Of this, like, irrelevant region of, like, southwest Persia, I think.
00:18:55.620 So he actually went to the enemies and was like, okay, I'm not going to help you with, like, any war thing, but I can be, like, a local governing person.
00:19:03.040 Or it might have been Greek islands.
00:19:04.180 I don't know.
00:19:04.560 But anyway.
00:19:05.620 I think that's what happened to him.
00:19:06.700 On to Churchill, for people who aren't familiar with him.
00:19:08.900 You know, he predicted everything in regards to World War II.
00:19:12.040 He predicted everything in regards to what was going to happen if Britain withdrew from India too quickly.
00:19:17.580 He was not against them withdrawing entirely, but he's like, if you withdraw too quickly, this is going to have really negative consequences.
00:19:24.600 The number of deaths involved in the war that followed was basically – so if I may give a bit of history here, what happened is Britain, driven by the pussies who decided, oh, we're going to be anti-colonialists, withdrew all at once.
00:19:36.600 Because before, the idea was that the Muslims would have a chance to migrate to Pakistan and the Hindus would have a chance to migrate to India.
00:19:44.520 You're going to have a Muslim state and a Hindu state.
00:19:47.700 And as a result, there was an incredibly bloody war that was completely unnecessary and that could have been avoided if people had listened to Winston Churchill.
00:19:57.780 Just as a side note here, I am not saying that if Winston Churchill didn't have godlike powers and could do whatever he wanted, he wouldn't have kept India in the British Empire.
00:20:09.660 But when he knew that India had to leave the British Empire, which he did accept at one point, he also saw it would lead to this war that could be prevented.
00:20:18.140 And he was rushed to release India before he put in the steps to prevent the war.
00:20:23.540 Also, I'm not saying he didn't have racist views against Hindus.
00:20:26.380 He absolutely did.
00:20:27.560 Potentially.
00:20:28.380 But I think, if you look at history, my reading of events is he saw this coming in the same way he saw World War II coming, where he kept warning everyone before World War II.
00:20:38.160 For people who don't know this, Winston Churchill's biography actually came out before he was elected prime minister, before World War II.
00:20:46.940 Yeah, he, like, assumed he'd peaked.
00:20:48.880 Yeah, he thought his career was over because he had – and how did he destroy his career?
00:20:53.620 He destroyed his career by constantly telling everyone, this Hitler guy is a problem.
00:20:57.820 This Hitler guy is a problem.
00:20:58.880 You don't know how big it is.
00:20:59.600 God, this guy's a bummer.
00:21:01.040 Stop, Churchill.
00:21:02.500 Yeah, people were like, hey, man, you're being a bitch.
00:21:04.920 He was super against appeasement.
00:21:06.680 Yeah.
00:21:07.620 And, you know, to be fair, he didn't have a perfect track record.
00:21:10.760 You know, there was the – he got, like, obsessed with weird tanks or something in World War I and it didn't work out.
00:21:16.040 But, like, you know, but still, he called that one.
00:21:18.580 Really clever shit.
00:21:19.900 I mean, it was clever.
00:21:20.780 It just, you know, it was like, you know, inventing, you know, web fan before its time, you know.
00:21:26.500 Well, okay.
00:21:27.240 I mean, he was really important to the strategy.
00:21:30.180 God, what was it called?
00:21:31.060 Operation Fortitude.
00:21:32.440 I'm referring to the giant fake D-Day operation that he helped organize.
00:21:36.940 I don't remember this.
00:21:38.960 Yeah.
00:21:39.560 Adidas was as successful as it was because of a huge campaign.
00:21:44.580 So they had inflatable tanks.
00:21:46.500 They had inflatable –
00:21:47.580 Oh, this one.
00:21:48.140 Yeah, the decoy.
00:21:49.120 The decoy campaign.
00:21:49.580 Yeah, but it wasn't just decoys.
00:21:50.860 They also had the dead body of a British high-level officer.
00:21:53.720 Washed up with, like, a code or something.
00:21:55.600 Yeah.
00:21:55.780 With a coded message that was like, oh, we're going to invade in this place.
00:21:58.820 We're totally not invading in.
00:22:00.220 Yeah, that was so smart.
00:22:01.320 Yeah.
00:22:01.680 Side note here.
00:22:02.460 I have to take pride in this.
00:22:03.800 Between Simone and me, every single one of our grandfathers participated in D-Day.
00:22:10.480 They did some really clever shit in World War II.
00:22:14.220 And then he helped British citizens get really on board.
00:22:17.300 So if you go to Britain, you will see these cut-down fences, these iron gates all over Britain.
00:22:21.800 Like, snipped off.
00:22:23.000 Snipped off.
00:22:23.680 And the question is, why?
00:22:24.480 Why would you do that, right?
00:22:25.880 Well, so the reason he did that was because he told people, give us your pots, your pans,
00:22:30.760 like, any wrought iron you have, and we'll use it for the war.
00:22:33.820 And the thing is, it was completely unusable in the war.
00:22:36.340 Most of it's at the bottom of the ocean or lakes now.
00:22:38.560 They just dumped it.
00:22:40.120 But it was really important in allowing the average citizen to feel connected to, and sort
00:22:46.700 of sunk-cost invested in, the war itself, which was important for other types of regulations, like
00:22:52.920 eating less food so the food could go to the troops, and feeling connected to your kids
00:22:58.120 and everything like that, which helped keep morale up during the bombings and in situations
00:23:02.720 like that.
00:23:03.240 Which is so crucial.
00:23:03.920 He did just so many, I think, absolutely brilliant things.
00:23:08.700 I mean, had so much absolutely brilliant foresight.
00:23:10.820 And he was completely stabbed in the back politically after the victory.
00:23:15.440 And I think that this is just the nature of democracies.
00:23:18.520 You cannot be too successful as a politician in a democracy without being stabbed in the
00:23:22.360 back.
00:23:22.760 So I understand his intuition here.
00:23:25.420 I just think that the alternative is worse.
00:23:31.440 Any sort of system in which an individual can achieve power.
00:23:35.280 And I think we see this even with people who I respect.
00:23:38.120 Like when I look at wealthy people who I respect, like, okay, they've achieved power.
00:23:41.000 They've basically become monarchs of their like little techno empires.
00:23:44.120 Basically.
00:23:44.760 Yeah.
00:23:44.900 But after a while, they sort of seem to go a little crazy.
00:23:50.120 No matter how well-meaning they are.
00:23:52.820 I don't want to give names.
00:23:54.140 I'm just saying that when individuals achieve this level of power and maintain it for like
00:23:59.440 more than a 10-year period, they seem to begin to make decisions that no sane person would
00:24:07.160 make.
00:24:07.480 And I think that decisions are driven not by them, but by the way, entourages build up
00:24:13.960 around these sorts of individuals.
00:24:15.880 So they don't have anyone like whispering in their ear, remember you are mortal.
00:24:21.760 Yeah.
00:24:22.460 By the way, what she's referring to is something that historically in Rome, a specific type of
00:24:27.000 slave caste was supposed to do for Caesars during military triumphs.
00:24:32.020 Right?
00:24:33.560 Yeah.
00:24:33.980 I can't remember if it was general practice or if it was like one specific guy who had
00:24:38.520 his man do that.
00:24:40.280 A specific guy.
00:24:41.420 And then they gave it a specific name or something.
00:24:43.640 Or then other people started copying him because it seemed fucking cool.
00:24:46.040 Yeah.
00:24:46.200 Because then it probably became this, like, sick humblebrag flex.
00:24:50.520 It's like, don't worry.
00:24:51.180 Yeah.
00:24:51.340 I have a slave that constantly reminds me I'm not a god.
00:24:54.280 My slave keeps me so humble.
00:24:56.260 I'm so blessed.
00:24:57.700 I'm so fucking humble.
00:24:58.940 So blessed.
00:24:59.260 So humble.
00:25:00.180 Don't worry.
00:25:00.740 I have a slave for that.
00:25:01.840 It's fine.
00:25:02.840 I have a, oh, don't worry.
00:25:03.960 I'm not too arrogant.
00:25:06.040 I've got a slave that reminds me I'm not a god.
00:25:08.520 Yeah.
00:25:09.680 I'm covered on that front.
00:25:11.120 I thought through that.
00:25:12.400 Yeah.
00:25:12.680 I have a slave for that.
00:25:13.580 It's fine.
00:25:14.540 I can see how you would think I might be a god.
00:25:18.060 I've had that problem before.
00:25:19.960 Yeah.
00:25:20.340 I'm just not one.
00:25:21.620 So the slave helps with that.
00:25:23.280 But I really do think that there is an entourage problem.
00:25:25.760 And I wonder how to get around that.
00:25:29.620 I know this is way off topic for like, for dark ageism.
00:25:34.920 But you're right in that there is this sort of success delusion that comes, especially when
00:25:39.680 you have like all these inner circles of yes men who really, really, really are incentivized
00:25:45.420 to maintain their position in the hierarchy, not to give you good ideas or like make sure
00:25:50.580 that you're not going off the rails, but to make sure that no one else is getting closer
00:25:53.740 to you than they are.
00:25:54.660 And do you know how you do it?
00:25:57.760 You don't have any friends.
00:25:59.960 Hmm.
00:26:00.660 Spouse.
00:26:02.600 You know, I really don't think they do, unless people have solid spouses,
00:26:08.680 a spouse that, one, they respect, and two, that
00:26:13.520 they work closely with.
00:26:14.240 It's just hard for me to think of a very, very wealthy, successful man who actually works
00:26:19.700 closely with and listens to their spouse.
00:26:22.040 Yeah.
00:26:22.100 But if you look at history, you see this.
00:26:24.240 Okay.
00:26:24.500 And the founding fathers and stuff like that.
00:26:26.860 Yeah.
00:26:27.380 Yeah.
00:26:27.600 You have Churchill, for example, had a spouse who was there whispering always.
00:26:32.260 She was, she was a, she was a beautiful, wonderful woman.
00:26:34.980 No, but this, this matters.
00:26:36.180 So you see what I'm saying?
00:26:37.000 Even in the examples I'm using, positive spouses who you perceive as your equal are critical
00:26:43.860 to not going crazy when you have too much power for too long.
00:26:47.540 Yeah.
00:26:48.620 Because other than that, everyone is a minion; a spouse, at least one who you really care about,
00:26:53.820 is never a minion.
00:26:55.720 Yeah.
00:26:56.120 That's true.
00:26:57.180 And they also don't have misaligned interests with you.
00:27:00.520 Unlike some people, they want you to succeed.
00:27:02.980 Yeah.
00:27:03.140 The kid always has some benefit from you dying.
00:27:05.820 Right.
00:27:06.240 You know, whereas a spouse often does not, unless you're in something like the Chinese system
00:27:11.700 and then that's the, you know, the dowager empress or something that's a negative situation
00:27:16.480 to be in.
00:27:17.020 But that's, I think, just because socially and culturally, that's like
00:27:20.340 a really, uh, non-incentivized, uh, optimization, I guess I'd say.
00:27:25.640 Hmm.
00:27:27.700 Yeah.
00:27:28.560 Interesting.
00:27:28.960 So what is the takeaway or meaning or importance of distinguishing between preppers,
00:27:37.620 dark ageists, and apocalypticists?
00:27:40.740 Well, I think every individual, you know, culturally, if you're watching this, you're probably similar
00:27:44.440 to me.
00:27:44.900 I have a prepperist instinct.
00:27:46.240 I think you have a prepperist instinct.
00:27:48.440 Prepperism can be fun, but remember it is a hobby.
00:27:51.500 It is not often useful for real societal downturns or the most likely societal downturns in which
00:27:58.240 you and your great-grandchildren will survive.
00:28:01.400 Oh, I see what you're doing.
00:28:02.300 So you're socially shaming individualistic prepperism and trying to take that instinct
00:28:08.320 that a lot of people have and direct it in a constructive fashion, especially in an age
00:28:14.660 at which we actually do believe that a dark age is coming, because we need people to build
00:28:18.380 the future who aren't just Elon Musk because he's kind of busy.
00:28:21.960 And then I think that there is apocalypticism and I think apocalypticism can only be beaten
00:28:27.680 back by immediately and aggressively shaming it wherever you see it.
00:28:33.200 How would you advise the average person to shame an apocalypticist?
00:28:37.320 Because I mean, even we have like, there are people that we've met who've become apocalypticists
00:28:41.200 and like in the end, I feel like you and I are just like, hey man, like, I hope you get
00:28:45.400 through that.
00:28:45.880 It's like talking with someone who's deeply depressed.
00:28:47.560 I think that's the key.
00:28:48.480 I understand that apocalypticism is about avoiding personal responsibility, not about
00:28:52.920 logic.
00:28:53.960 So should we be acting differently around our apocalypticist friends and just be like,
00:28:58.240 hey, shame them pretty aggressively.
00:29:00.100 Okay.
00:29:00.500 All right.
00:29:01.160 I mean, apocalypticism is about eschewing personal responsibility.
00:29:04.440 It's about saying I am not responsible for the future outside of spreading this one meme
00:29:10.120 that has infected my brain.
00:29:11.400 No, I feel so embarrassed because in the past we've just been like really empathetic toward
00:29:15.540 them and that's, that's actually pretty bad.
00:29:18.900 I should not do that anymore.
00:29:20.220 I mean, we aren't outright mean, but I, I will say that if they're talking to me, they
00:29:24.160 definitely get a sense that I think that they're pretty pathetic.
00:29:28.560 I guess so, because you're that one kind of person where like, you just walk away from
00:29:32.460 people at parties if you feel like they're not useful.
00:29:34.640 You just turn and you walk in the opposite direction and, like, you cut the conversation
00:29:40.620 short.
00:29:41.380 So people, like, have no ambiguity about how you feel about them because you just don't
00:29:46.280 talk.
00:29:46.620 It's like, what are you working on?
00:29:47.700 How are you trying to make the world a better place?
00:29:49.520 And if they start complaining, you're like, okay, great.
00:29:52.280 And then you're like, oh God.
00:29:54.480 And then I'm sitting there talking with them uselessly for, like, 15 minutes, and then
00:29:58.000 you get... well, I try to pull you away.
00:29:59.500 I'm like, I know you're like, you have somewhere to be.
00:30:01.720 Yeah.
00:30:02.120 So, dear friends, if you've seen us do this, I'm sorry.
00:30:09.560 Sorry.
00:30:12.280 We're dealing with short timelines.
00:30:14.060 We have to fix things and we have to prepare things for the next generation in the last
00:30:18.280 age of opulence.
00:30:20.260 Yeah.
00:30:21.340 Yeah.
00:30:22.020 Love you, Simone.
00:30:23.020 Love you too.