Based Camp - November 08, 2023


Techno-Feudalism & the Post-Collapse Network Empire


Episode Stats

Length

24 minutes

Words per Minute

188.57

Word Count

4,695

Sentence Count

254

Misogynist Sentences

1

Hate Speech Sentences

13


Summary

In this episode, we talk about the future of the world economy, and why techno-feudalistic states are going to be major players in that future economy. We also talk about what the world will look like after widespread economic collapse.


Transcript

00:00:00.000 What makes it a network empire instead of a network state? I think the core thing
00:00:05.140 that Balaji missed when he came up with this concept is the insecurity of a future
00:00:10.180 world when we're dealing with wide-scale economic collapse.
00:00:14.020 That is the world that we are heading into, where it is cheaper for the wealthy class
00:00:19.180 in our society to isolate themselves from everyone else than it is for them to ensure
00:00:24.880 widespread prosperity.
00:00:26.980 Would you like to know more?
00:00:27.960 I am so glad to be recording these again.
00:00:30.960 Our audience doesn't know, but it's actually been like a week and a half since we did our
00:00:35.040 last recording, because we were at this ARC conference in the UK, which is supposed to
00:00:40.360 be, I don't know, like this new alternative to Davos sort of a thing, but I think it's
00:00:43.640 just really conservative British Davos.
00:00:46.860 But one of the people who we met while we were there and we had a long conversation with
00:00:50.920 was Curtis Yarvin.
00:00:53.060 And that conversation really helped me clarify some things that I think about what's
00:01:01.140 going to happen in the future of our species.
00:01:03.220 And we may release that conversation because it was recorded at the time, but it was recorded
00:01:07.300 in like the busiest restaurant, with a Greek reporter interviewing us both together.
00:01:11.420 So I can't believe that some random Greek newspaper, a monarchist newspaper, by the way, is getting
00:01:17.280 the piece where it's me and you and Curtis Yarvin talking for like two hours.
00:01:21.720 But we constantly get accused of being techno feudalists in the media.
00:01:28.640 And this to me feels not just like an unfair accusation, but almost an insane accusation.
00:01:36.980 It's a bit like if, you know, I have some friends whose family left
00:01:43.000 Germany early and tried really hard to convince everyone the Holocaust was coming.
00:01:47.000 And they were just basically told they were crazy.
00:01:48.880 And so this family is actually descended from a guy who broke into his girlfriend's house
00:01:52.580 at night, took the girl he was dating and ran away.
00:01:55.940 Now they made the horrible mistake of running East to Russia instead of West.
00:02:03.660 And so then they, for like a 10 year period, just constantly had to flee new places.
00:02:08.640 But anyway, it would be like calling him a Holocaustian and people would be like, well,
00:02:14.520 yeah, but even if he saw it coming, you know, they could have said, well, let's try to prevent
00:02:18.960 it.
00:02:19.300 Right.
00:02:19.500 And it's like, no, there was a certain point where he was like, look at Hitler, this guy
00:02:23.060 who was elected to power.
00:02:23.840 Look at what he's writing, read his book.
00:02:26.800 Okay.
00:02:27.260 He published this, like it's not vague what his plans are.
00:02:31.440 And, you know, I feel a bit like that when I talk about techno feudalism, where I'm saying
00:02:35.580 it is almost inevitable at this point that something like a techno feudalistic state is going to
00:02:41.760 happen.
00:02:42.120 And we need to, those of us who do not want to be churned up by the system need to prepare
00:02:50.600 for how the world is going to change.
00:02:54.980 Both in terms of our culture and our families and economically, because it's going to be absolutely
00:03:01.160 catastrophic and very, very significant.
00:03:03.540 Now, first I would say when we talk about techno feudalism, we do not mean, so there's this
00:03:08.700 like Greek economist guy who wrote like a book on quote unquote techno feudalism.
00:03:13.920 And the way that he defines the term is vague and pointless.
00:03:17.440 Basically what we already know, which is that we live in a world in which large tech companies
00:03:21.680 control a lot of the economic system.
00:03:23.540 And it's like, yes, we know that.
00:03:26.660 That's not what we talk about when we talk about techno feudalism.
00:03:30.040 We are talking about something that is much closer to literal feudalistic states.
00:03:35.440 Exactly.
00:03:36.800 But before we go further, we need to talk a bit about where we think the overall economy
00:03:42.580 is going before we can talk about the techno-feudalistic states, which are going to be major players
00:03:48.360 within this future economy.
00:03:49.580 So this actually was a point that Curtis made in the conversation.
00:03:55.760 And after hearing it, it really clarified a lot of how I think about things, but it was in
00:04:00.700 line with what I thought already.
00:04:02.100 It just gave me more picture as to what the future is going to look like.
00:04:05.920 And he said, the future of the Western world, at least, and the future of
00:04:12.620 the Eastern world is also going to be bleak, but the way it will collapse is going to look
00:04:16.160 very different.
00:04:16.740 And we can get to that in different videos, but the future of the Western world is going
00:04:21.780 to look very similar to the current situation in South Africa.
00:04:27.460 And this has nothing to do with race or even the politics of the country.
00:04:33.380 What specifically we're talking about is the on the ground reality of what it feels like
00:04:38.540 to live there.
00:04:39.080 Yeah, well, it's what real collapse looks like from a, we'll say, a modern developed
00:04:44.960 society.
00:04:46.680 So what happens when the infrastructure has been built, when there's electricity, there's
00:04:50.020 housing, there's communities, but then things start falling apart.
00:04:54.480 Infrastructure starts falling apart.
00:04:55.820 Law enforcement starts falling apart.
00:04:58.100 And that's what collapse looks like in our world.
00:05:00.380 Because I think most people's evoked set of collapse is Road Warrior.
00:05:04.320 You know, it's like deserts.
00:05:05.880 There's nothing out there.
00:05:06.820 But no, no, no, we're, we're starting.
00:05:08.400 Our starting point for collapse is quite different from what people typically expect in their
00:05:13.180 fantasies.
00:05:13.620 Right?
00:05:14.000 Absolutely.
00:05:14.460 Yeah.
00:05:14.740 And I think a lot of people, the other thing they think of is they go, oh no, I'm being
00:05:17.900 reasonable.
00:05:18.460 It's not going to be like Road Warrior.
00:05:19.720 It'll be more like a developing country.
00:05:22.760 The problem is-
00:05:23.040 Yeah, but no, no, we're starting from a developed country.
00:05:25.300 No, it won't be like a developing country.
00:05:25.900 And the thing about developing countries is that they are going up from nothing.
00:05:30.660 You know, they are building incrementally with every step of development, which is very
00:05:35.760 different from a developed country collapsing.
00:05:39.920 Developed countries collapse and lead to very different cultural institutions than a struggling
00:05:45.920 developing country.
00:05:46.980 And the, the life on the ground of a developed country collapsing is astronomically worse than
00:05:55.280 the life on the ground of an equivalent, of an economically equivalent developing country.
00:06:00.960 So I'll give an example here.
00:06:02.720 One thing you see in South Africa right now is very frequent and rolling blackouts.
00:06:08.380 In a developing country, you are less likely to see this.
00:06:11.800 The reason you are less likely to see this in a developing country is, and you do still see
00:06:16.380 it occasionally, but like the regions are known, like they make sure their major cities have
00:06:20.380 electricity and then the outskirts, you know, because they're building new electrical stations
00:06:24.240 to sort of push out their electrical power generating capacity instead of having a total net of
00:06:30.020 electrical power generating capacity that is in the process of collapsing due to lack of
00:06:34.440 maintenance, due to political infighting, et cetera.
00:06:37.140 That is very different.
00:06:38.600 And what it means when you have things like electricity regularly going out in a district is it means
00:06:43.900 that many of the things that you take for granted, like restaurants or grocery stores are going to
00:06:49.960 look dramatically differently.
00:06:51.440 Restaurants really rely on refrigeration systems.
00:06:55.320 Grocery stores really rely on refrigeration systems.
00:06:58.640 These types of industries become very, very difficult to operate without private generators,
00:07:04.340 which can be very expensive.
00:07:06.400 So that's just an example of the type of thing that you may not
00:07:10.800 have that you would think, oh, I would have this in a developing country, where they develop
00:07:15.760 different solutions for that sort of stuff, like icebox type of stuff and, and cuisines that are
00:07:20.260 built around, you know, local produce.
00:07:23.620 Whereas in a developed country collapsing, you're not going to have that.
00:07:27.700 So things like restaurants disappear, things like food distribution begin to disappear.
00:07:31.360 Another thing that is really interesting that you see in South Africa right now that I think is going
00:07:35.020 to be very common around the world is sort of fortresses.
00:07:39.740 Yes.
00:07:39.900 Like in South Africa, there are many, what are called gated communities, which essentially are
00:07:44.780 fortresses, but then each house itself is also fortified in a pretty significant manner.
00:07:49.920 So you've basically got fortresses within fortresses.
00:07:52.460 And then in between, of course, you have sort of more dangerous dead zones, but then you're sort of
00:07:56.420 really looking at something that looks actually quite feudal.
00:07:59.400 This is, this is what, when you look at an old, like, you know, medieval or even earlier
00:08:04.240 sort of fortified area, it looks like you have sort of an outer wall and sort of a slightly
00:08:09.400 protected area and then an even more fortified and protected area within, right?
00:08:13.140 And this is because I think what a lot of people think of a world with falling apart police forces
00:08:17.860 or police forces that are bought by organized crime, what they imagine is a world was just more
00:08:24.120 like regular crime, like the type of crime we see today.
00:08:27.240 And that is not what you see.
00:08:29.460 What you see is a large systemic organized crime and aggressive organized crime that will
00:08:36.540 sometimes collude with corporations, that will sometimes collude with wealthy individuals,
00:08:40.680 that will sometimes collude with the state, and sometimes not.
00:08:44.580 I mean, you will still see random crime and much more random crime, but there is a new
00:08:47.740 type of crime and it's the type of crime we've had in the U.S. before.
00:08:50.740 Like when we were developing, you know, Mobs of New York is the movie about back in the
00:08:55.120 days when we had more organized crime or, you know, Boss Tweed or, you know, any of
00:08:59.180 the...
00:08:59.780 You mean Gangs of New York?
00:09:01.740 Oh, yes.
00:09:02.120 Gangs of New York.
00:09:03.080 I'm thinking Boss Tweed and stuff like that.
00:09:04.420 Yeah.
00:09:04.660 I mean, you know, we had the mob, we had the mafia, we had the, you know, we've had our
00:09:09.020 share of organized crime in the U.S.
00:09:11.920 We just don't really live with that right now.
00:09:14.080 We have some organized crime institutions, but they are mostly relegated to a really high
00:09:22.160 level of prominence within lower income communities, whereas when a state is collapsing, they are
00:09:29.460 part of everyone's everyday life, to the extent that being able to defend against them or ally
00:09:35.300 with them matters.
00:09:35.740 And this is where feudalism comes into play, is you are choosing your allies within the
00:09:42.580 state because who you are allied with within the state matters a lot.
00:09:46.720 Exactly.
00:09:49.380 But we haven't really gotten to why we call this techno-feudalism, and this is something
00:09:54.880 we're going to elaborate more on in the next episode because there's a lot to talk about
00:09:58.600 in regards to this.
00:09:59.380 But when we look at fertility rates, the two dominant strategies right now, and when I say
00:10:08.060 dominant, I mean the strategies that have been very, very successful at maintaining
00:10:11.040 high fertility rates, are to either culturally impose traditions which lower the income of
00:10:18.940 members who practice that culture, which in turn increases the fertility of that group.
00:10:24.080 An example here might be Jehovah's Witnesses banning their kids from going to college.
00:10:27.000 I mean, there's also practical reasons to do that because they get brainwashed and stuff
00:10:30.000 like that.
00:10:30.320 I get that, right?
00:10:31.280 But I'm saying it also helps with their fertility rates because it lowers their income.
00:10:34.860 Another thing is to prohibit the engagement with technology.
00:10:40.080 Now, both of these practices limit the technological reach of a cultural group.
00:10:47.380 I'll say the, sorry, they limit the economic potential of a cultural group.
00:10:52.060 Well, and sort of widespread influence you can have because it is those who develop tech that
00:10:56.980 is then widely adopted that are going to have a lot of influence across groups in the future.
00:11:02.500 Yeah, yeah.
00:11:03.000 So I guess I should elaborate this word because that's going to then define economic, but I
00:11:06.160 really mean economic and cultural potential a group can have.
00:11:08.460 You are severely economically and culturally limited as you make these restrictions.
00:11:14.100 And the thing about these restrictions that's important to note is some religious organizations
00:11:19.520 are like, oh, we'll just minorly restrict this stuff, right?
00:11:22.220 Like some cultural groups.
00:11:23.800 We'll just minorly restrict it.
00:11:25.140 Or we'll just make some minor things which lower the economic potential of our members.
00:11:28.820 The problem is, is that if there are different fractions of your organization, which make
00:11:34.260 more extreme limitations of their members, they will out-compete yours.
00:11:39.220 Yeah, we've heard this, for example, with Mennonites, that the groups that are more permissive
00:11:43.600 around technology also tend to see more social ills that undermine the integrity of that particular
00:11:51.900 religious, social subculture.
00:11:53.440 So the moment you lean into this at all, intergenerationally, what will happen to your group as a cultural
00:12:00.340 strategy is you will essentially be wiped out by the most extreme Luddites.
00:12:05.240 And I don't mean Luddites as like a derogatory term.
00:12:07.600 I mean Luddites as people who disengage with technology.
00:12:11.080 You would have heard it as like the most air-gapped subcultures.
00:12:15.180 Yeah, the most air-gapped subcultures.
00:12:16.860 And it is because on a pretty linear, or I'd almost say logarithmic level from what I've seen
00:12:22.540 in the data, the more iterations of your tradition disengaged with technology, the higher their
00:12:27.840 fertility rate will be.
00:12:28.820 So as soon as you lean into this as a strategy, the iterations of your culture that go all the
00:12:33.160 way with it are the ones that will be represented in the future.
00:12:36.120 And so then there's other groups that don't lean into this strategy at all.
00:12:39.740 Like our group, you know, we might even lean in the other direction with this.
00:12:44.620 And these are the groups that are going to really determine where our species is going.
00:12:50.820 And this is where techno-feudalism comes from, because these groups will be few and far between
00:12:55.680 on the world stage.
00:12:57.580 So we talk about the concept of like a network state, right?
00:13:00.660 Like the, I don't know if you guys are familiar with the concept of it.
00:13:02.940 Do you want to go into it, Simone?
00:13:03.740 Sure.
00:13:05.700 A network state is basically a fully digital community that may use similar currency, have
00:13:11.880 similar social mores or similar regulation, but there is no particular geographic concentration
00:13:18.160 of where they're going to be.
00:13:19.380 There may be like geographic correlation.
00:13:22.240 Yeah.
00:13:22.420 Simone, you had talked to me in one of your books about like time zone based.
00:13:26.620 Yeah.
00:13:27.180 Cory Doctorow wrote a book called Eastern Standard Tribe a while ago that I think somewhat predicted
00:13:32.260 the way that communities are going to sort out.
00:13:34.760 And it sort of, it talked about a near future in which social groups correlated more by the
00:13:39.280 type of time zone that you kept, because that's when people were online at the same time you
00:13:43.980 were, and that's when you were all hanging out.
00:13:45.780 And that's obviously where the title comes from.
00:13:47.920 So network states remind me a lot of that kind of community rather than what we're really
00:13:52.500 describing, which has a strong geographical basis.
00:13:55.600 So what we would call, what we're describing is a network empire.
00:13:58.840 We actually went over a few names here and we're like, well, it's not really top down.
00:14:02.140 And I'm like, so it's more like the Holy Roman Empire, but unlike the Holy Roman Empire,
00:14:06.000 for people familiar with history, the Holy Roman Empire was a German empire made
00:14:11.260 up of a bunch of feudal players that were often largely under the rule of a single
00:14:16.740 individual, but on the outskirts, they weren't.
00:14:19.180 So like 80% would basically be under a hierarchical rule and 20% wouldn't be.
00:14:23.360 What we suspect the network empire to look like is it's going to be about 10 to 15%, maybe
00:14:29.680 even as high as 25% under a single rule, but most of it will be independently ruled.
00:14:34.660 So it'll be much more like a fractured sort of Holy Roman Empire.
00:14:38.120 You can think of it.
00:14:39.400 And what makes it a network empire instead of a network state?
00:14:43.780 I think the core thing that Balaji missed when he came up with this concept is the
00:14:49.380 insecurity of a future world when we're dealing with wide-scale economic collapse.
00:14:54.300 By the way, if you're wondering why we're so certain there's going to be wide-scale economic
00:14:57.600 collapse, you can look at any of the videos where we talk about what falling population
00:15:02.000 means for the world economy.
00:15:03.620 If we enter a stage in which the world's economy is shrinking on average, we are entering a
00:15:09.620 stage in which most Western economies begin to collapse.
00:15:14.040 Alternatively, if we enter a stage in which the world's economy is continuing to grow, but
00:15:19.360 it's continuing to grow almost solely because of AI, then we enter a stage in which the bourgeoisie
00:15:24.500 officially no longer need the proletariat.
00:15:26.740 That chain has finally been cut.
00:15:29.400 And everyone under maybe like a standard deviation or two standard deviations from the mean in
00:15:35.380 terms of IQ or who doesn't have connections is going to be frozen out of the economic system
00:15:40.460 because people just don't need them.
00:15:42.480 And they're like, no, they'll be nice.
00:15:44.160 When have they ever been nice?
00:15:46.180 Like historically, when have ever the powerful been nice to people unless it was in their best
00:15:51.400 interest?
00:15:51.700 And then you'll be like, oh, well, the people will rebel.
00:15:54.340 And it's like, yes, there are, for example, wealthy people in South Africa.
00:15:57.900 They just buy better security.
00:16:00.260 OK, they just build better fortresses for themselves.
00:16:04.520 That is the world that we are heading into where it is cheaper for the wealthy class in
00:16:09.640 our society to isolate themselves from everyone else than it is for them to ensure widespread
00:16:15.940 prosperity.
00:16:16.700 So either AI is great, it ends up solving the economic problem, the rich end up getting
00:16:22.600 even richer, but we still end up with this wide scale systems collapse, or we end up in
00:16:28.820 a situation.
00:16:29.560 It may not happen in a few countries.
00:16:31.080 Like there may be a few countries like Sweden or Norway or something that through their like
00:16:34.820 national sovereign fund find a way out of this.
00:16:37.180 Right.
00:16:37.940 Fine.
00:16:38.580 It's going to happen in a lot of places, although it will probably happen to them due to,
00:16:43.240 well, we don't need to get into that.
00:16:45.820 But anyway, or we're just dealing with a global economic collapse because it turns out that
00:16:51.100 AI doesn't replace the fact that populations are collapsing and we end up with a shrinking
00:16:54.540 world economy.
00:16:55.700 And due to all the debt and leverage that we've taken out at every layer of the economy, things
00:16:58.580 begin to fall apart.
00:16:59.440 Again, we've talked about this in other places.
00:17:01.220 So anyway, this is just likely going to be what happens is wide scale economic collapse.
00:17:05.920 And this is what it looks like.
00:17:07.240 Now, the reason why it matters so much that the winning strategies are these technophobic strategies
00:17:12.320 and the reason why Balaji's predictions aren't going to turn out the way he thought they
00:17:17.660 would is in a world in which security is a thing of scarcity, it makes sense for economically
00:17:25.400 productive groups to cloister together.
00:17:28.520 Like just a lot of sense.
00:17:30.380 Like if you're an economically productive group, you hang out with other economically
00:17:33.580 productive people, both for your kids and family safety and for cultural reasons and for
00:17:39.200 the purposes of more economic production, because you're going to be more economically
00:17:42.800 productive as a group.
00:17:45.460 And so this will lead to small what we call havens, essentially communities where high
00:17:52.280 technology is produced, but that are otherwise defended, that likely have their own power generation
00:17:57.700 and everything like that, networked with other havens that exist around the world.
00:18:03.900 And that is what we mean by the network empire.
00:18:06.600 Each of these havens represents a city-state in sort of this network lattice of empire.
00:18:12.660 And that is where we think things are going.
00:18:15.820 We do not think it is a great thing that things are headed in that direction.
00:18:19.420 We think it will be a, during the period, this is happening, a darker world than the world
00:18:24.420 we live in today.
00:18:25.140 But I do think that from this networked connection of havens, we can rebuild a better civilization
00:18:31.460 and one that is not likely to collapse in this cycle of civilizational rise and civilizational
00:18:36.220 collapse.
00:18:36.680 We just need to go into building the next one very intentionally, using the lessons we
00:18:42.120 have learned from this one, which is-
00:18:43.680 Well, it's very much carrying the torch of free markets through what could otherwise be
00:18:48.460 described as a dark age, because you have these different techno feuds and fiefdoms, essentially,
00:18:54.800 carrying forward different economic specializations, but in a way that's far more sophisticated
00:18:59.760 thanks to the existing technology into which we're entering this age.
00:19:04.120 So we're able to sort of carry the torch of what we have.
00:19:06.800 And then everyone is still able to accelerate, I think, a lot of development, maybe even in
00:19:12.180 ways that we haven't been able to in the past, because in a post-collapse world like this,
00:19:16.520 regulatory oversight is going to ease up a little bit.
00:19:20.880 There just won't be governmental resources sufficient to police people, which could actually
00:19:26.640 lead to a sort of weird dark age plus renaissance at the same time.
00:19:30.840 So it's a dark age everywhere you look, but when you go behind some highly fortified walls,
00:19:35.680 you see some amazing innovation taking place.
00:19:38.760 And that's why we're both doomy, but also optimistic.
00:19:44.980 You can be both at the same time.
00:19:46.380 It's a weird mix of doominess and optimism.
00:19:48.200 I would also say that this prediction better explains why we are so interested in far north
00:19:54.000 charter city settlements.
00:19:55.540 So people have heard our ideas around charter cities before, and one of the criteria is that
00:19:59.540 it is in an inhospitable, easy-to-defend region, not adjacent to other population settlements.
00:20:05.260 Yeah, you don't want to be where everyone like raids to get supplies or agricultural land
00:20:09.940 or something like that, you know.
00:20:11.280 Right, yeah.
00:20:11.960 If you build like a high technology settlement in the center of an area that has a large population,
00:20:17.940 that means large, well-armed gangs or governments that want what you have, which is technology
00:20:24.000 or wealth.
00:20:25.860 Alternatively, if you have one that is in the far north somewhere that a group would have
00:20:30.060 to travel a very hard time to both get to and get out of, especially one that was known
00:20:35.940 for extreme austerity outside of their technology, there would be no reason to ever raid it.
00:20:41.300 I mean, what are they going to get, our axolotl tanks, as we were joking about with Razib?
00:20:45.900 They're unusable by most other cultural groups.
00:20:49.220 So that's another reason why it makes sense to, and some people will go the other route.
00:20:55.180 Some wealthy people will create sort of opulent gated areas where they try to let in only
00:21:02.360 the best and the brightest by their definition.
00:21:04.940 And these communities will, I think, flourish for a short time until people realize how unsafe
00:21:13.760 they are in terms of a long-term place to base yourself if you want anything other than
00:21:19.080 short-term hedonism and vanity, because they are incredibly difficult to defend.
00:21:24.240 And this is especially true of islands.
00:21:26.280 Islands, you know, growing up before they lost everything, my family had an island in the
00:21:31.460 Bahamas that I would go to all the time.
00:21:33.020 And, you know, now when I go back to it, it's all shot up by pirates and stuff like
00:21:35.780 that.
00:21:35.960 And even when we were there, you know, we had to have guns and everything for pirates.
00:21:38.680 It was creepy.
00:21:40.120 It was really creepy to see.
00:21:42.120 Yeah.
00:21:42.520 Yeah.
00:21:42.820 So, I mean, people do not realize islands are about the least defendable thing you could
00:21:47.200 possibly have.
00:21:48.400 Yeah.
00:21:49.160 Well, and it's not just that.
00:21:50.920 It's the extreme weather, you know, it's climate change.
00:21:54.300 And we've spoken with people who are like, yeah, I'm going to build my, you know, city
00:21:58.160 state on an island.
00:21:58.980 And we're like, I don't know, like, what are you going to do about hurricanes?
00:22:00.940 And like, oh, we're just going to like raise everything on platforms.
00:22:04.760 And it's like, well, okay, what about the electricity?
00:22:07.800 That does not work.
00:22:08.920 You are, these are people who have, who are thinking very short term.
00:22:13.020 They're like, how are they?
00:22:13.680 They haven't spent a lot of time.
00:22:14.900 They haven't like spent a full year or a couple of years in the Caribbean, I'm guessing.
00:22:19.520 Oh my God.
00:22:20.000 And I can tell you another thing about islands.
00:22:21.580 They are not a good place to do anything technological because everything corrodes.
00:22:25.340 You can't even have like a desktop PC for more than a couple of years.
00:22:29.260 Yeah, seriously.
00:22:29.820 Like all the electronics.
00:22:31.140 Well, and not to mention like all your other supplies.
00:22:33.640 If we're talking about also a post-collapse world in which like, you know, getting more
00:22:38.240 towels is a little tough, right?
00:22:40.540 Like, you know, even we.
00:22:42.400 Your clothes, your towel, everything.
00:22:43.900 Yeah.
00:22:44.020 When we lived in Peru, like right, right on the coast, which is kind of like an island
00:22:47.980 environment would be, everything was constantly covered in mold.
00:22:51.100 Everything was degrading.
00:22:52.060 All of our electronics were breaking down.
00:22:53.680 So like, imagine this, like it's a post-collapse world.
00:22:56.520 You can't easily get electronics, supplies, like clothing, fabrics, et cetera.
00:23:00.860 And then it's just everything like you're making the lifespan of all of your supplies
00:23:06.600 artificially, you know, unnecessarily short.
00:23:09.520 It is bonkers.
00:23:11.180 But alternatively, if you are in the far North, everything's lifespan is artificially
00:23:16.420 lengthened, and you can do sorts of processing that may not be possible in other
00:23:22.220 areas, due to the cooling resources.
00:23:24.140 Like, it's just so obviously the right choice if you are optimizing for literally anything
00:23:29.140 other than immediate personal comfort.
00:23:31.640 And what, honestly, what a lot of network states right now are optimizing for without
00:23:35.240 really admitting they're optimizing for is areas with a high level of government instability
00:23:39.340 because that's where they're able to make the deals.
00:23:41.680 And typically far North governments or far South governments are the most stable governments
00:23:46.360 and the governments in warmer regions are the less stable governments.
00:23:50.280 And again, why unstable governments?
00:23:52.460 You know, yeah, they're going to say yeah now, but they're totally not going to say yes
00:23:55.380 later.
00:23:55.700 You know, they'd be taken over by some other group.
00:23:57.380 Anyway, just those are.
00:23:58.280 You're actually better off setting up an illegal settlement in the area of a stable government
00:24:02.520 where you can predict their actions than setting up a legal settlement in an unstable
00:24:06.600 government area.
00:24:07.660 Yeah.
00:24:07.840 But anyway, I'm very excited to go further into this vision for the future.
00:24:13.220 And I think the next of these episodes, it'll be a bit of a two-parter, but also a bit
00:24:17.300 separated.
00:24:18.060 And Simone, I love talking with you about this.
00:24:20.000 Do you have any final thoughts?
00:24:21.340 I love your beautiful face.
00:24:23.720 That's all.
00:24:24.320 That's a sweet thing to say.
00:24:25.740 I love your beautiful face too.
00:24:27.600 We were on Just Pearly Things recently and we were being sweet on each other.
00:24:32.340 And I think we made everyone deeply uncomfortable because this is not, you know.
00:24:35.380 Yeah, the guy was like, should I not be in between you?
00:24:37.680 Because they sat this guy in between us.
00:24:39.420 They're like, we're choosing where everyone sits.
00:24:41.280 So no, if you're wondering why we sat apart on Just Pearly Things, that was because we
00:24:45.140 were directed to by the producer.
00:24:47.060 Yeah, we wanted to sit together.
00:24:48.280 Not because we wanted to.
00:24:49.080 Yeah.
00:24:51.720 Love you, gorgeous.
00:24:52.980 I love you too.