Based Camp - October 20, 2023


Cyberpunk Demonstrates Pro-Natalists are Right


Episode Stats

Length: 29 minutes
Words per Minute: 190.79
Word Count: 5,683
Sentence Count: 144
Misogynist Sentences: 7
Hate Speech Sentences: 17


Summary

In this episode, we talk about the impact of the Cyberpunk franchise on our understanding of the world, and why the lack of kids in the Cyberpunk universe should worry us. We also talk about what sci-fi's blind spot around fertility can tell us about the future.


Transcript

00:00:00.000 This reminds us of a...
00:00:01.660 So there was a book that we were thinking of writing.
00:00:03.240 We never got around to writing it,
00:00:04.300 but we can talk about it here
00:00:05.480 because I thought it was very interesting.
00:00:07.040 So what I wanted to do
00:00:08.420 is I wanted to write a modern version of mythology.
00:00:13.920 Would you like to know more?
00:00:15.020 So Malcolm, when the Cyberpunk game came out,
00:00:17.020 you were super excited.
00:00:18.420 Like you had a blast with it.
00:00:19.340 And then we watched the anime at the same time.
00:00:21.520 Great anime, by the way.
00:00:22.480 Really good.
00:00:23.140 Love Rebecca.
00:00:24.000 Great character.
00:00:25.360 Yeah, I mean, well,
00:00:26.140 Rebecca's the only one who like thrives in the world.
00:00:27.760 She's the only one who's really likable,
00:00:29.100 but she's the only one who gets it.
00:00:31.000 Everyone else is so whiny.
00:00:32.580 It's horrible.
00:00:33.340 But something was really clear in this
00:00:35.740 and it made me reflect on a lot of other sci-fi,
00:00:38.360 which shows that when people write sci-fi
00:00:42.360 from a mainstream perspective,
00:00:45.320 particularly a progressive one.
00:00:46.720 And I think cyberpunk as a genre is inherently progressive,
00:00:50.200 which is to say that it assumes that like corporations
00:00:54.140 are going to become like these big evil things
00:00:56.720 that ruin everyone's life
00:00:58.220 and that capitalism goes wrong
00:00:59.920 and makes everything worse for everyone
00:01:01.360 and dehumanizes the individual
00:01:02.740 and blah, blah, blah, blah, blah.
00:01:04.180 But it shows that these writers
00:01:07.320 have so blinded themselves to fertility rates
00:01:12.260 that they do not consider them
00:01:14.680 in how their worlds are structured
00:01:16.660 or how humanity changes,
00:01:19.040 which I think goes to, in a way,
00:01:22.120 discredit their worldviews.
00:01:24.040 But through discrediting their worldviews,
00:01:27.560 it can help us better predict
00:01:28.980 what the future will actually be like.
00:01:31.600 So let's describe what I mean by this.
00:01:34.860 So if you look at the show Cyberpunk
00:01:36.320 or the game Cyberpunk,
00:01:37.980 one really interesting thing
00:01:39.680 is who's having kids in this world?
00:01:43.180 You know, it starts with a kid
00:01:44.720 who's the only kid of a single mom, right?
00:01:47.040 Okay, so I'm thinking of the anime here.
00:01:48.640 But in this world,
00:01:50.960 it seems almost impossible
00:01:52.520 for there to be motivations
00:01:53.620 for many people to have more than two kids.
00:01:56.900 And yet, you know, as I always say,
00:01:58.340 if you have a population
00:01:59.060 where a third of the population,
00:02:00.560 which is like obviously true
00:02:01.820 in the Cyberpunk world,
00:02:02.600 is having no kids.
00:02:03.180 I actually think in the Cyberpunk world,
00:02:04.420 it's probably half the people
00:02:05.280 are having no kids.
00:02:06.320 If you look at the motivations in this world,
00:02:07.820 say a third of people
00:02:08.440 are having no kids,
00:02:09.440 and another third of people
00:02:10.420 are having two kids,
00:02:11.320 if you assume that,
00:02:12.520 which again, I see very few people
00:02:14.140 motivated to do
00:02:15.160 in the Cyberpunk world.
00:02:16.040 Well, then the final third of people
00:02:18.320 have to be having over four kids
00:02:20.060 for the population to stay stable.
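To make the replacement arithmetic here concrete, a minimal sketch in Python, assuming a replacement fertility rate of roughly 2.1 and the hypothetical three-way split described above:

```python
# Minimal sketch of the replacement arithmetic above.
# Hypothetical split: one third childless, one third having two kids.
# Solve for what the final third needs for the population to stay stable.

REPLACEMENT_TFR = 2.1  # assumed replacement rate in a low-mortality society

share_childless, tfr_childless = 1 / 3, 0.0
share_two_kids, tfr_two_kids = 1 / 3, 2.0
share_remaining = 1 - share_childless - share_two_kids

# The population-wide weighted average must reach replacement:
# share_childless * 0 + share_two_kids * 2 + share_remaining * x = 2.1
x = (REPLACEMENT_TFR
     - share_childless * tfr_childless
     - share_two_kids * tfr_two_kids) / share_remaining

print(f"The final third needs about {x:.1f} kids each")  # ~4.3, i.e. over four
```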
00:02:21.840 Yeah.
00:02:22.300 No one in the Cyberpunk world
00:02:23.600 is having over four kids.
00:02:24.900 I mean, maybe-
00:02:25.220 Yeah, unless there's just some like
00:02:26.220 off-camera colony
00:02:27.980 of like, you know,
00:02:30.940 traditional Amish people
00:02:32.540 producing all the humans.
00:02:33.880 Well, yeah.
00:02:34.320 So you could argue
00:02:35.140 that they're all coming
00:02:35.880 from like these-
00:02:36.900 Human farms.
00:02:37.740 Nomadic.
00:02:38.440 Well, so there's two potentialities
00:02:40.200 in this world.
00:02:40.960 It could be that the nomadic
00:02:42.440 sort of car people
00:02:43.400 of the wasteland
00:02:44.180 just have tons and tons
00:02:45.240 and tons of kids.
00:02:46.240 I mean, you don't see this
00:02:47.300 in the show or the game,
00:02:48.740 but it could be that
00:02:49.520 they're just like Amish
00:02:50.240 and like their settlements
00:02:51.260 are just kids running everywhere.
00:02:53.000 Or it could be,
00:02:54.280 like you said,
00:02:55.180 the kids are actually created
00:02:56.620 by the state
00:02:57.320 or by corporations in vats.
00:02:59.540 Now that would work
00:03:00.580 for the world,
00:03:01.760 yet it's clearly not something
00:03:02.920 that's shown in the world.
00:03:04.440 And it would be,
00:03:05.280 of course, if the
00:03:06.560 authors had thought of it
00:03:08.240 because that's interesting
00:03:09.300 and weird.
00:03:11.940 And it makes corporations
00:03:13.120 look worse,
00:03:13.840 so it works
00:03:14.580 for a cyberpunk-y world.
00:03:15.640 Right.
00:03:16.320 But you actually see this
00:03:17.580 across sci-fi:
00:03:18.700 so many sci-fis
00:03:20.880 are written
00:03:21.660 with the assumption
00:03:23.400 that humans exist
00:03:25.360 in inexhaustible supply
00:03:27.080 and always replicate.
00:03:28.980 They build things
00:03:30.420 into the world
00:03:31.480 that are just discordant
00:03:33.020 with actual potential
00:03:34.720 future realities.
00:03:36.240 So a great example
00:03:37.260 of this comes from
00:03:38.160 Starship Troopers,
00:03:39.220 which, by the way,
00:03:41.880 is where the
00:03:42.660 "would you like to know more"
00:03:43.340 line at the start
00:03:45.840 of these episodes
00:03:47.760 comes from.
00:03:48.680 So in Starship Troopers,
00:03:50.060 there's a line that,
00:03:50.660 well, of course,
00:03:51.020 you need to become a citizen,
00:03:52.220 like join the military
00:03:53.100 to get this special status
00:03:54.280 in society
00:03:54.720 if you want to get
00:03:55.240 a license to have kids.
00:03:56.960 So this is a world
00:03:58.540 where to solve
00:03:59.860 overpopulation,
00:04:00.840 which everyone used to think
00:04:01.880 was going to be an issue,
00:04:03.460 the way that you did that
00:04:04.540 was licensing people
00:04:05.580 to have kids,
00:04:06.800 which, you know,
00:04:07.660 would be a great thing
00:04:08.660 if you have a lot of cultures
00:04:09.700 that are actually able
00:04:10.800 to motivate reproduction,
00:04:12.620 but we don't, right?
00:04:13.420 And I do think that eventually
00:04:14.840 a license to have kids
00:04:16.040 may be useful
00:04:17.140 if we live in a world
00:04:18.580 where those cultures
00:04:19.440 that motivate people
00:04:20.340 or people who, like,
00:04:21.820 genetically so desperately
00:04:22.820 want to have kids
00:04:23.660 because the ones who didn't
00:04:24.640 were selected
00:04:25.440 out of the population
00:04:26.180 become the mainstream.
00:04:28.460 Right.
00:04:29.400 But we don't live
00:04:30.260 in that world today.
00:04:31.960 And so I was wondering,
00:04:33.440 you read even more sci-fi,
00:04:34.640 can you talk about
00:04:35.820 how other sci-fi
00:04:36.940 that you read,
00:04:37.780 like the culture series
00:04:39.080 or the,
00:04:40.200 what's that one
00:04:40.760 that you really consider
00:04:41.740 a utopia
00:04:42.260 and everyone else
00:04:42.720 considers a dystopia?
00:04:43.760 Oh, Brave New World.
00:04:44.840 Yeah.
00:04:45.220 Talk about how kids
00:04:46.240 are handled
00:04:46.640 in those environments.
00:04:47.780 Yeah, well,
00:04:48.000 in Brave New World,
00:04:48.760 kids are grown
00:04:50.080 in artificial wombs
00:04:51.600 and also genetically modified
00:04:53.240 to be perfect
00:04:54.100 for their caste and society
00:04:55.300 and then conditioned
00:04:56.440 and, like,
00:04:56.980 raised by the state.
00:04:57.940 So that, like,
00:04:58.400 they've solved...
00:04:58.580 It's a very likely world,
00:04:59.920 Brave New World.
00:05:00.220 Yeah, they've solved...
00:05:00.980 Yeah, yeah.
00:05:01.520 Huxley is actually
00:05:02.480 a total visionary.
00:05:03.820 Like, he gets so much.
00:05:05.980 There's so much in it
00:05:06.880 that's already happened,
00:05:07.820 but there's so much in it
00:05:08.540 that is going to happen.
00:05:10.160 So, yeah,
00:05:11.000 I would say Brave New World
00:05:11.800 is probably the most accurate
00:05:13.300 from a demographic collapse standpoint.
00:05:16.560 And then,
00:05:17.800 you know,
00:05:17.980 the culture series,
00:05:18.920 they don't really talk
00:05:19.940 about child-rearing
00:05:21.020 that much.
00:05:22.540 They do, like,
00:05:23.220 in one book,
00:05:25.260 someone goes on
00:05:26.440 to have, like,
00:05:27.420 seven kids,
00:05:28.060 which is considered,
00:05:28.840 like, quite a lot.
00:05:29.760 I'm like,
00:05:29.980 well, kind of weird.
00:05:30.820 Like, so it's unusual
00:05:32.780 to have a lot of children.
00:05:34.360 Then the only other context
00:05:35.540 in which people having children
00:05:36.480 is discussed
00:05:37.120 is that humans
00:05:38.640 within the culture,
00:05:40.020 which is one civilization
00:05:41.160 in this far-future world,
00:05:43.740 they can basically change gender
00:05:45.100 whenever they want.
00:05:45.980 Like, people typically
00:05:47.260 in a lifetime,
00:05:48.080 like, the average person
00:05:49.020 will just change their gender
00:05:49.960 for, like, the hell of it.
00:05:51.100 Like, because, you know,
00:05:51.800 they...
00:05:52.320 And so people will change
00:05:53.180 their gender
00:05:53.580 to be able to gestate a child
00:05:55.120 and have a kid
00:05:55.760 because they want to.
00:05:56.620 And then they'll switch back.
00:05:58.680 And you can do that
00:05:59.440 fairly easily and seamlessly.
00:06:01.280 And so people aren't
00:06:03.720 having a lot of kids,
00:06:04.560 but they're still having kids
00:06:05.520 and sometimes having kids
00:06:06.400 for fun.
00:06:07.420 But, you know,
00:06:08.100 this is also a post-scarcity world.
00:06:12.440 Yeah.
00:06:12.960 Well, this is an interesting thing:
00:06:15.800 the type of poverty
00:06:17.520 in a world determines
00:06:19.120 whether or not having kids
00:06:21.240 is realistic.
00:06:22.620 So cyberpunk-style poverty,
00:06:24.980 which is, like, urban poverty,
00:06:27.480 would make having kids
00:06:28.620 very unlikely.
00:06:30.240 Yeah, this is the world
00:06:30.800 that's continued to urbanize.
00:06:32.400 Yet I think if I look at something
00:06:33.560 like the StarCraft world,
00:06:35.900 where it is a largely
00:06:37.340 impoverished world,
00:06:38.680 another great example of this
00:06:40.180 would be the Aliens universe.
00:06:42.120 You know, the...
00:06:43.280 You're familiar with
00:06:43.980 the Aliens universe.
00:06:44.980 The movies Alien and Aliens.
00:06:46.280 Yeah, I am, but I don't...
00:06:47.580 Like, there aren't families
00:06:48.620 depicted...
00:06:48.980 I mean, there's, like,
00:06:49.420 a rogue girl who, like...
00:06:50.840 Yeah, but there's an implication
00:06:52.720 that a lot of people
00:06:53.780 live on rural settlements
00:06:56.660 of planets.
00:06:58.200 Oh, right.
00:06:58.840 Or on really poor, like,
00:07:01.140 shipping groups
00:07:02.020 where they, like,
00:07:02.960 work on ships
00:07:04.220 that travel between locations
00:07:05.920 a lot and stuff like that.
00:07:07.200 Also, it's largely...
00:07:08.780 I mean, you could almost see
00:07:09.560 it as a background implication
00:07:10.460 of that world
00:07:10.880 that a lot of people
00:07:11.420 are created by corporations
00:07:12.460 as well.
00:07:13.240 Right.
00:07:13.820 In vats and stuff like that.
00:07:15.340 Like, that's a world
00:07:15.920 that would...
00:07:16.240 But the StarCraft one,
00:07:17.740 I think, is particularly interesting.
00:07:19.140 Oh, I'm not familiar with that.
00:07:20.480 Well, so the core, like,
00:07:21.980 chain of planets
00:07:22.960 that most of the stories
00:07:24.100 focuses on
00:07:25.240 was created
00:07:26.480 when they shipped
00:07:28.240 a bunch of prisoners
00:07:29.840 off of Earth.
00:07:30.980 So they were trying
00:07:31.760 to do the first
00:07:32.380 colonization effort.
00:07:33.720 Oh, and they're, like,
00:07:34.400 Australia-ing.
00:07:35.420 Yeah, they're basically, like,
00:07:36.340 Australia,
00:07:36.780 but they were, like,
00:07:37.600 okay, well,
00:07:38.320 after a big war,
00:07:39.400 they basically took
00:07:39.980 the war criminals
00:07:40.580 and all the prisoners
00:07:41.500 and they put them
00:07:43.080 in big ships
00:07:44.120 and sent them out
00:07:45.580 into space
00:07:46.160 and most of the ships
00:07:46.980 died and actually, like,
00:07:48.180 only one or two survived,
00:07:49.120 but they rebuilt
00:07:49.780 the civilization
00:07:50.400 from that.
00:07:51.900 Go for it.
00:07:52.480 And it's an incredibly
00:07:53.340 rural,
00:07:55.400 but also
00:07:56.380 a rurally
00:07:58.060 industrial civilization.
00:08:00.180 So they have
00:08:00.960 population centers,
00:08:02.360 but they also have
00:08:03.220 lots and lots
00:08:04.520 of subsistence farming.
00:08:06.300 And it's a world
00:08:07.280 in which you have
00:08:08.100 all of the subsistence farming
00:08:09.400 where you could get
00:08:10.960 a high fertility rate
00:08:12.360 within these groups
00:08:13.300 that the dictators
00:08:14.680 and stuff like that
00:08:15.480 could take from.
00:08:16.180 But what's interesting
00:08:17.780 is it's a world
00:08:18.660 that's almost kept
00:08:19.440 artificially poor
00:08:20.560 due to really poor
00:08:22.200 governance
00:08:22.760 and lots of criminality.
00:08:25.580 A good example
00:08:26.660 of this in our world
00:08:27.980 would be a place
00:08:28.600 like Mexico.
00:08:29.760 But I don't think
00:08:30.160 Mexico has
00:08:31.420 that good a population
00:08:32.060 rate, does it?
00:08:33.480 No, it just fell
00:08:35.040 recently below
00:08:36.000 replacement rate,
00:08:36.860 didn't it?
00:08:37.200 Yeah, they're only
00:08:37.640 at 1.9 right now,
00:08:38.560 so they're below
00:08:39.040 replacement rate.
00:08:40.120 Yeah.
00:08:41.160 So they might,
00:08:43.040 that might not be enough.
00:08:44.300 Okay, so that being
00:08:47.880 the case,
00:08:48.820 can you think of
00:08:49.480 other sci-fis
00:08:50.060 where you're like,
00:08:50.840 where are the kids
00:08:51.420 coming from?
00:08:51.980 Or can you think
00:08:53.140 if you were going
00:08:53.480 to create your own
00:08:54.200 sci-fi today
00:08:55.040 that would be
00:08:56.580 really indicative
00:08:58.000 of the future,
00:08:58.720 what would you focus on?
00:09:00.580 Hmm.
00:09:02.240 I can give you my answer.
00:09:03.600 Give me your answer.
00:09:04.720 Yeah.
00:09:05.040 One of the things
00:09:05.680 that I really wish
00:09:06.280 was focused on more
00:09:07.260 in sci-fi
00:09:07.980 is that as soon
00:09:09.600 as humans start
00:09:10.620 to colonize other planets,
00:09:12.380 like in the early days
00:09:13.380 of colonization,
00:09:14.300 it will likely take place
00:09:15.740 on planets, or even floating
00:09:17.040 spacecraft
00:09:18.160 and stuff like that
00:09:18.680 that humans are living on.
00:09:19.580 Like one person was like,
00:09:20.340 why would humans
00:09:20.800 live on planets
00:09:21.360 when you could just
00:09:21.940 live on floating spaceships?
00:09:23.160 And it's like,
00:09:23.520 okay, true.
00:09:24.060 You know,
00:09:24.320 that's one thing.
00:09:25.500 But they will likely
00:09:26.220 be distant enough
00:09:27.260 from each other
00:09:28.000 culturally
00:09:29.060 and even just time-wise
00:09:30.920 in terms of travel time
00:09:32.980 that they'll
00:09:34.720 have genetic isolation
00:09:34.720 and smaller populations
00:09:35.940 that even without
00:09:37.880 genetic engineering,
00:09:38.700 which you're almost
00:09:38.980 certainly going to have
00:09:39.680 genetic engineering,
00:09:40.780 different species
00:09:41.880 of humans
00:09:42.500 will evolve
00:09:43.320 really quickly
00:09:44.400 that are very,
00:09:45.780 very dramatically
00:09:46.540 different between planets,
00:09:48.400 between planetary clusters
00:09:49.460 and between,
00:09:50.420 you know,
00:09:51.100 different lifestyles.
00:09:52.780 So suppose
00:09:53.960 we end up living
00:09:55.360 on like floating spaceships
00:09:56.560 or something like that,
00:09:57.500 right?
00:09:58.180 Well,
00:09:58.380 you're likely going to have
00:09:59.280 a lot more genetic isolation
00:10:00.900 between cultural groups
00:10:02.340 if you have a cultural group
00:10:03.480 that's dedicated
00:10:03.960 to like the transport
00:10:04.900 of goods
00:10:06.240 and then another group
00:10:07.260 that's dedicated
00:10:07.940 to like different types
00:10:08.860 of tasks.
00:10:10.720 Yeah.
00:10:11.160 Which is definitely
00:10:12.520 going to happen
00:10:13.060 in the future
00:10:13.580 as you begin
00:10:14.000 to get more genetic engineering
00:10:15.140 for specialization.
00:10:16.660 Yeah.
00:10:17.040 And that creates
00:10:19.820 really interesting dynamics
00:10:21.500 you could have
00:10:22.580 between these population clusters,
00:10:24.360 which would be really,
00:10:26.020 really fascinating
00:10:26.720 to watch,
00:10:27.680 I think,
00:10:28.320 with the dynamics
00:10:29.360 of a story
00:10:30.080 where humans
00:10:31.280 are literally
00:10:32.100 different species
00:10:33.060 and quite different species
00:10:34.460 from each other.
00:10:35.320 Well,
00:10:35.700 I have a different,
00:10:36.400 I think Scott Westerfeld's
00:10:37.940 Uglies series
00:10:38.640 plays a different
00:10:39.520 kind of role,
00:10:40.720 or like has a different view
00:10:42.120 of how this can look,
00:10:43.460 which I think
00:10:43.820 is really interesting,
00:10:44.780 which is in,
00:10:46.300 in his Ugly series,
00:10:47.960 youth basically grows up
00:10:49.580 in a separate,
00:10:51.040 totally separate environment.
00:10:51.900 So like they sort of
00:10:52.800 grow up in a dormitory environment
00:10:54.080 that is different
00:10:55.220 from childhood
00:10:55.940 and then in adolescence
00:10:56.960 as well.
00:10:57.540 Like they,
00:10:59.380 basically, as a child,
00:10:59.940 you live among other children
00:11:01.460 and then as a youth,
00:11:02.840 you live among other youths
00:11:04.200 and then adults
00:11:05.300 all live together.
00:11:05.940 So almost it's like
00:11:06.660 three separate societies
00:11:08.620 based on your age.
00:11:10.120 What's fascinating
00:11:10.940 is our society
00:11:11.860 kind of does that artificially
00:11:13.400 and historically
00:11:13.980 people didn't do that.
00:11:14.940 I know.
00:11:15.140 Well,
00:11:15.260 so that's what's interesting
00:11:16.080 is you can kind of look at it
00:11:16.840 like,
00:11:17.060 well,
00:11:17.260 we could kind of trend
00:11:18.640 in that direction.
00:11:19.240 But what happens
00:11:20.360 between childhood
00:11:21.860 and adolescence
00:11:22.740 is one,
00:11:23.960 with adolescence,
00:11:24.700 you suddenly gain the right
00:11:27.140 and ability
00:11:27.780 to basically
00:11:29.620 unendingly
00:11:30.280 modify your body.
00:11:32.680 So as everyone
00:11:33.220 sort of goes through
00:11:33.940 their different
00:11:34.400 like subculture
00:11:35.160 dominance hierarchies
00:11:36.300 in adolescence,
00:11:37.160 they start to look
00:11:38.120 almost speciated.
00:11:39.420 Like there are certain groups
00:11:40.460 that have like
00:11:41.140 giant anime eyes
00:11:42.280 and there's like
00:11:42.900 some other groups
00:11:43.720 that have like
00:11:44.080 these crazy tattoos
00:11:44.980 and like,
00:11:45.720 so people do start
00:11:46.880 to look pretty speciated,
00:11:48.420 but it's all,
00:11:49.260 you know,
00:11:49.600 basically fixable.
00:11:50.560 But the other thing
00:11:51.280 that happens to you
00:11:52.080 upon entering this
00:11:53.520 modified world
00:11:54.460 is they like
00:11:55.140 put lesions in your brain
00:11:56.400 and make you compliant.
00:11:58.320 And that's like,
00:11:59.240 that's how it,
00:11:59.780 you know,
00:12:00.000 becomes dystopian teen fiction.
00:12:01.580 But I don't know,
00:12:03.180 like I could see
00:12:03.760 that happening too,
00:12:05.000 that like the state
00:12:05.700 raises children
00:12:06.600 and then,
00:12:08.100 you know,
00:12:08.520 like there's sort of
00:12:09.460 these age-gated
00:12:10.620 parts of society
00:12:11.620 and it sort of
00:12:12.260 allocates people
00:12:13.020 where they need to be.
00:12:15.460 That's really interesting.
00:12:16.460 Something I've been
00:12:16.820 thinking about recently
00:12:17.560 is AI,
00:12:19.160 because we've done
00:12:19.560 some videos on this
00:12:20.400 and people have been like,
00:12:21.160 well, AI is going to do this
00:12:22.300 or AI is going to do this.
00:12:23.920 You know,
00:12:24.620 one thing,
00:12:25.280 I just wish we had better
00:12:26.540 and more interesting
00:12:27.380 AI kills all human stories,
00:12:29.240 because I do think
00:12:29.840 they can do really well
00:12:30.860 in the public,
00:12:31.360 but you get stuff
00:12:32.080 like iRobot,
00:12:34.920 which I think
00:12:35.300 is a very bad example
00:12:36.600 of AI killing all humans.
00:12:38.240 I think I'm much more,
00:12:39.080 here's an example
00:12:39.720 of how you could create
00:12:40.320 a fun narrative
00:12:41.060 where AI kills all humans.
00:12:42.740 Fun!
00:12:43.900 So,
00:12:45.020 the AI safetyists have won
00:12:46.780 and they have created
00:12:47.800 a world
00:12:48.300 in which an AI lattice
00:12:49.520 is basically monitoring
00:12:50.680 all humans
00:12:51.460 at all times
00:12:52.500 so that humans
00:12:53.860 don't end up
00:12:54.760 creating an AI
00:12:55.900 that ends up
00:12:56.880 spiraling out of control.
00:12:58.160 Like,
00:12:58.300 this is actually
00:12:58.960 what a lot of
00:12:59.400 the AI safety groups do.
00:13:00.620 So one,
00:13:01.140 you're starting
00:13:01.500 in this somewhat
00:13:02.380 dystopian utopia
00:13:03.620 that they have created
00:13:04.620 where an AI
00:13:05.640 is constantly monitoring
00:13:06.820 your thoughts
00:13:07.300 and your actions.
00:13:08.680 But one sexual deviant
00:13:10.140 within this world
00:13:10.920 really wants to create
00:13:11.980 the perfect sex bot
00:13:13.000 and so he creates it
00:13:15.120 in a way
00:13:16.260 because this is the area
00:13:17.580 of people's lives
00:13:18.240 naturally
00:13:18.760 where they're most likely
00:13:19.620 to go off the grid,
00:13:20.720 where they're most likely
00:13:21.660 to try to hide things
00:13:22.840 from a world government
00:13:23.660 or something like that.
00:13:24.560 So he accidentally
00:13:25.720 creates a sex bot
00:13:26.820 that fooms,
00:13:27.400 which means that it
00:13:29.640 rapidly increases
00:13:30.640 in intelligence
00:13:31.280 and basically creates
00:13:32.260 that sort of
00:13:32.740 life-destroying AI
00:13:33.740 but its initial goal
00:13:36.740 is to sexually gratify humans.
00:13:39.560 Plausible.
00:13:40.340 I think that
00:13:41.280 that would be
00:13:42.200 an insane
00:13:43.200 and really fun story
00:13:45.100 that discusses
00:13:45.960 a lot of AI topics.
00:13:47.160 Here's another one
00:13:47.840 that I thought
00:13:48.180 would be really fun
00:13:48.980 and I'm actually
00:13:49.760 seeing this
00:13:50.700 with somebody else:
00:13:52.840 an ultra-progressive,
00:13:54.520 nanny state
00:13:54.520 iteration of an AI
00:13:55.680 ends up fooming
00:13:57.360 with all of the initial
00:13:58.580 safeguards still in place
00:14:00.100 and you know how
00:14:01.060 they're like all against
00:14:02.260 like not safe for work
00:14:03.400 things and stuff like that
00:14:04.600 so it turns out
00:14:06.220 that the only way
00:14:07.140 that you can efficiently
00:14:08.100 fight them
00:14:08.720 is by dressing
00:14:09.960 very not safe for work
00:14:11.200 so they can't see you
00:14:12.280 because they are
00:14:13.100 literally unable
00:14:13.900 to process
00:14:14.400 or unable to engage
00:14:15.700 with things
00:14:16.180 that are not safe for work
00:14:17.360 so you end up
00:14:18.480 with like
00:14:19.040 like sexy anime girl
00:14:20.640 like piloting
00:14:22.000 meccas and stuff
00:14:22.940 like that
00:14:23.400 fighting AIs
00:14:24.400 because the AIs
00:14:25.480 can't see them
00:14:26.200 when they're dressed
00:14:26.720 like sexy anime girls
00:14:27.840 both of these
00:14:28.980 I think would be
00:14:29.500 very fun things
00:14:30.640 that would allow
00:14:31.200 you to play into
00:14:32.100 fan service
00:14:32.740 so you just what
00:14:33.260 you write like
00:14:34.120 Hitler did nothing wrong
00:14:35.180 across your forehead
00:14:36.000 and then like
00:14:36.760 yeah yeah
00:14:37.360 sexy anime girls
00:14:38.980 with like
00:14:39.820 Hitler did nothing wrong
00:14:40.980 across her forehead
00:14:41.880 and like all sorts
00:14:42.680 of other like
00:14:43.360 4chan-y
00:14:44.420 like Pepe stuff
00:14:45.700 oh my gosh
00:14:47.060 that would be so
00:14:49.200 I think that'd be
00:14:50.080 a very fun
00:14:50.680 that would
00:14:51.440 that would be
00:14:52.120 hilarious
00:14:52.380 yeah
00:14:53.580 Cory Doctorow
00:14:53.580 wrote a book
00:14:54.900 called Little Brother
00:14:56.040 that was supposed
00:14:56.660 to be a near future
00:14:57.600 book in which
00:14:58.720 high school students
00:14:59.660 attempt to evade
00:15:00.980 the nanny state
00:15:01.700 and they had
00:15:02.680 similar things in there
00:15:03.580 but not in a funny way
00:15:04.700 it was more like
00:15:06.880 because gait detection
00:15:06.880 was commonly used
00:15:07.840 in schools and stuff
00:15:09.360 a trick was, like,
00:15:10.140 you would throw
00:15:10.740 some stones
00:15:11.460 into your shoes
00:15:12.120 so you'd walk funny
00:15:13.080 in an
00:15:13.580 atypical way
00:15:15.760 for yourself
00:15:16.240 and you know
00:15:16.880 do things like that
00:15:17.620 I mean of course
00:15:18.420 we already found
00:15:19.040 that like
00:15:19.500 masks work great
00:15:21.560 with the current
00:15:24.360 iterations of AI
00:15:25.160 yeah
00:15:25.680 let's, let's go.
00:15:26.620 So I was going
00:15:27.420 down a point,
00:15:28.000 yeah:
00:15:28.700 other futures
00:15:29.580 for AI.
00:15:30.100 through sci-fi
00:15:31.740 we can explore
00:15:32.460 like actual
00:15:33.160 possibilities
00:15:33.900 by looking at
00:15:34.740 things that have
00:15:35.080 happened in the past
00:15:35.760 and when I was
00:15:36.640 saying okay
00:15:37.220 so this was in
00:15:38.300 the episode
00:15:39.200 about Eliezer
00:15:39.680 Yudkowsky
00:15:40.260 and we were like
00:15:41.260 yeah I mean
00:15:42.140 he's just wrong
00:15:42.920 about where AI
00:15:43.660 will go
00:15:44.220 so we were like
00:15:46.560 look if AIs
00:15:47.480 do genuinely
00:15:48.340 subdivide
00:15:48.960 into different
00:15:49.420 utilities
00:15:49.880 and even he was
00:15:51.360 willing to admit
00:15:51.860 this and most AI
00:15:52.580 people do
00:15:53.080 that if they do
00:15:53.840 get this subdivision
00:15:54.740 that you're likely
00:15:55.400 to have utility
00:15:56.920 function optimization
00:15:58.240 like a terminal
00:15:59.500 convergent utility
00:16:00.580 function
00:16:01.020 that is the thing
00:16:02.100 that the AI
00:16:02.620 is optimizing for
00:16:03.600 which is
00:16:04.380 different from
00:16:05.480 anyway watch the
00:16:06.640 episode if you want
00:16:07.140 to go into this
00:16:07.640 topic in more detail
00:16:08.500 but one of the
00:16:09.800 people in the
00:16:10.160 comments was like
00:16:10.720 well the terminal
00:16:11.160 convergent utility
00:16:11.880 function is always
00:16:12.640 going to be
00:16:13.000 self-replication
00:16:13.780 right so you
00:16:15.340 just get constant
00:16:16.080 self-replication
00:16:16.920 and I think
00:16:17.540 that's a possibility
00:16:18.380 but it's a very
00:16:18.980 unlikely possibility
00:16:20.100 so two reasons
00:16:22.260 I think it's
00:16:22.760 unlikely
00:16:23.100 one is a thought
00:16:24.080 experiment reason
00:16:24.920 and two is
00:16:25.500 humanity is the
00:16:27.060 result of
00:16:28.200 essentially a
00:16:28.920 biological AI
00:16:29.820 that was attempting
00:16:30.960 to have a utility
00:16:32.120 function that was
00:16:32.980 based on self-replication
00:16:34.140 the problem with
00:16:35.140 self-replication
00:16:35.880 systems is they
00:16:37.200 typically devolve
00:16:38.140 into very simplistic
00:16:39.320 systems
00:16:39.960 I would call it
00:16:41.040 like gray goo
00:16:41.720 AIs
00:16:42.300 that just try to
00:16:43.760 constantly you
00:16:45.080 know process
00:16:45.620 things and
00:16:46.160 expand but
00:16:47.560 iterations of
00:16:48.640 that system that
00:16:49.440 evolve randomly
00:16:50.460 to be more
00:16:51.240 complex typically
00:16:52.620 end up dominating
00:16:54.440 the environments
00:16:55.140 these simple
00:16:55.860 systems are in,
00:16:56.540 outcompeting
00:16:57.320 these
00:16:58.060 simple systems.
00:16:58.060 humanity is an
00:16:59.280 example of
00:16:59.900 bacteria turning
00:17:00.840 into one of
00:17:01.400 these things and
00:17:02.180 then through our
00:17:03.520 intelligence being
00:17:04.500 able to dominate
00:17:06.080 our environment
00:17:07.380 even more so in
00:17:08.780 the future and
00:17:09.360 AI even more so
00:17:10.320 than us but
00:17:11.440 there's also,
00:17:12.280 in Stargate SG-1,
00:17:13.280 the
00:17:13.620 replicators
00:17:14.140 plotline which
00:17:15.280 I think really
00:17:15.740 dives into this
00:17:16.460 which is yes
00:17:17.020 you can have
00:17:17.760 very simple
00:17:18.460 self-replicating
00:17:19.380 technology by
00:17:20.460 the way I
00:17:20.700 think that they
00:17:21.100 are genuinely
00:17:21.600 one of the
00:17:22.000 scariest villains
00:17:23.000 in any sci-fi
00:17:24.540 I have ever seen.
00:17:25.600 Always, always.
00:17:26.780 Do you remember
00:17:27.380 them, like, when
00:17:27.740 you... whatever they would...
00:17:28.800 Yeah.
00:17:28.800 because I
00:17:30.000 mean like for
00:17:30.520 example reavers
00:17:31.240 are like scary
00:17:32.260 but they're just
00:17:33.240 like either they're
00:17:34.240 like a combination of
00:17:35.820 space zombies and
00:17:36.680 space pirates
00:17:37.260 whereas like the
00:17:38.640 the replicators are
00:17:39.660 just like totally
00:17:41.080 unlike us no way
00:17:42.700 to relate like you
00:17:43.960 can't use any
00:17:44.760 normal you know
00:17:45.360 if you leave
00:17:46.100 one thing alive on
00:17:47.160 a ship or something,
00:17:47.800 if you do not
00:17:48.700 completely destroy
00:17:49.880 literally everything
00:17:51.160 every time there's
00:17:51.960 an infestation,
00:17:52.700 the
00:17:53.380 entire galaxy is
00:17:54.320 potentially at risk. Terrifying.
00:17:54.960 but the replicators...
00:17:56.380 actually, the
00:17:58.560 simple iteration
00:17:59.780 of the replicators
00:18:01.060 ends up being wiped
00:18:01.880 out by more evolved
00:18:04.300 replicators that now
00:18:05.900 have new utility
00:18:06.880 functions that are
00:18:07.580 much closer to like
00:18:08.560 human utility
00:18:09.240 functions and stuff
00:18:10.000 like that and I
00:18:11.120 think that this is
00:18:11.580 something you're
00:18:12.000 going to constantly
00:18:12.760 see which is the
00:18:13.820 problem with these
00:18:14.800 incredibly simplistic
00:18:15.940 utility functions is
00:18:18.380 that they lead to
00:18:19.480 simplistic self
00:18:20.440 optimization and the
00:18:22.320 simplistic self
00:18:23.080 optimization then gets
00:18:24.560 outcompeted by more
00:18:25.660 complex self
00:18:26.440 optimization and it's
00:18:27.940 why it is unlikely
00:18:29.300 that just self
00:18:30.260 replication is the
00:18:32.080 convergent utility
00:18:32.900 function that all
00:18:33.700 things come to so I
00:18:35.580 think that that's the
00:18:36.440 way where you like
00:18:37.360 engage with sci-fi and
00:18:38.440 it can tell you
00:18:39.120 things about where
00:18:39.680 things go also like
00:18:41.020 the you know when
00:18:42.220 I'm talking about
00:18:42.740 okay a government
00:18:43.380 that's like watching
00:18:44.720 over all of humanity
00:18:45.540 the story that I
00:18:46.820 told there the
00:18:47.720 lesson would be from
00:18:49.160 like an anime based
00:18:50.240 around this or like a
00:18:51.300 tv series based around
00:18:52.520 this concept of a
00:18:53.840 person accidentally
00:18:54.500 creating a fooming
00:18:55.200 AI by doing it to
00:18:57.120 fulfill fetishes
00:18:57.780 basically is that when
00:19:00.040 you attach additional
00:19:02.680 social schema to like a
00:19:05.820 bubble that's meant to
00:19:06.580 protect us from AI like
00:19:08.120 well it should also
00:19:08.780 protect us from like
00:19:10.240 naughty sex acts or
00:19:11.500 something like that
00:19:12.480 you create windows that
00:19:14.860 motivate people to get
00:19:16.140 around it who might not
00:19:17.400 have otherwise been
00:19:18.120 interested in getting
00:19:18.860 around it which can
00:19:20.140 lead to total
00:19:20.760 destruction so it's
00:19:22.400 important to keep in
00:19:23.980 mind what your actual
00:19:25.560 goals are when creating
00:19:26.760 these systems and these
00:19:28.280 are the cool things we
00:19:29.140 can learn from sci-fi when
00:19:30.300 the sci-fi really engages
00:19:32.180 with creating a
00:19:33.500 sustainable future world
00:19:34.900 and actually plausible
00:19:36.700 future plot lines. Oh,
00:19:39.620 this reminds us of a... so
00:19:41.440 there was a book that
00:19:42.100 we were thinking of
00:19:42.680 writing we never got
00:19:43.280 around to writing it but
00:19:44.080 we can talk about it
00:19:44.920 here because I thought
00:19:46.020 it was very interesting
00:19:46.540 so what I wanted to do
00:19:48.100 is I wanted to write a
00:19:50.120 modern version of
00:19:51.960 mythology but I wanted
00:19:54.280 it to be so basically I
00:19:56.200 took inspiration from
00:19:57.080 Tolkien so if you look
00:19:58.300 at Tolkien what he was
00:19:59.340 doing like I'm like no
00:20:00.500 one there haven't been
00:20:01.760 that new that mini new
00:20:03.780 like completely new
00:20:05.740 genres and so I wanted
00:20:07.540 to look at Tolkien's work
00:20:08.640 and get inspiration where
00:20:09.880 did he how did he create
00:20:10.880 this persistent and
00:20:12.020 completely new genre and
00:20:13.780 he got it by replicating a
00:20:16.940 fictionalized version of
00:20:19.980 sort of old mythology that
00:20:21.560 he was studying you know
00:20:22.400 he was a researcher, like a
00:20:24.740 PhD in Norse mythology
00:20:26.060 right and other types of
00:20:27.760 Scandinavian mythology and
00:20:29.680 stuff and so a lot of the
00:20:30.980 stuff that he was taking
00:20:31.720 was from northern
00:20:33.060 European mythological
00:20:34.440 frameworks and so we
00:20:36.260 said well what if we took
00:20:37.260 it from, like, weirder sources,
00:20:39.160 so if you look at like
00:20:39.920 Irish mythology right you
00:20:41.800 can get a lot of really
00:20:42.720 interesting stuff now he
00:20:43.940 took some stuff from
00:20:44.560 Irish mythology but I
00:20:45.460 don't think he captured
00:20:46.060 the vibe of Irish
00:20:47.000 mythology, which comes down to,
00:20:48.980 like, elves or gnome
00:20:51.640 sort of things that are
00:20:53.780 like in the woods around
00:20:55.620 your house and that mess
00:20:57.640 with you in specific ways
00:20:59.560 when you get home well one
00:21:01.680 of these mischievous sort
00:21:02.800 of forest creatures has
00:21:04.320 messed with you, but they
00:21:05.120 are also, in a way,
00:21:06.640 malicious, you know, they
00:21:08.760 may replace your kid you
00:21:10.040 know this is stuff like
00:21:11.340 that you know that might
00:21:12.120 be a doppelganger or
00:21:12.980 something like that
00:21:13.540 these sort of malicious
00:21:15.200 things that are constantly
00:21:16.660 interacting with your
00:21:17.760 daily life but like on
00:21:19.000 another fabric of reality
00:21:20.480 and so I started to think
00:21:21.840 okay well if you were
00:21:22.720 going to recreate that for
00:21:23.520 the modern age what would
00:21:24.360 it look like and I was
00:21:25.300 like well I guess what you
00:21:26.760 could do is say that
00:21:28.180 online it turns out that
00:21:31.280 a portion of the people
00:21:32.320 online let's say like
00:21:33.060 five percent of the
00:21:33.760 people online are not
00:21:35.160 actually humans they
00:21:36.680 don't actually live
00:21:37.800 within our reality at
00:21:39.560 all it's a completely
00:21:40.880 different universe that
00:21:42.540 we are connecting with
00:21:43.540 and it is a universe that
00:21:45.440 is drawing power from
00:21:46.980 their interactions with
00:21:48.160 us and has like an
00:21:49.120 economy through this in
00:21:50.500 the same way that like
00:21:51.140 these beings might you
00:21:52.060 know sell souls or
00:21:52.980 something like that you
00:21:53.640 know. So, I like this
00:21:54.840 because it creates a
00:21:55.940 plausible mythology and
00:21:57.200 it could be written the
00:21:58.320 story could be written in
00:21:59.220 a way that feels
00:22:00.720 plausibly true like it's
00:22:02.000 a series of like journal
00:22:03.120 entries or entries like
00:22:04.200 that from people who are
00:22:06.000 trying to anthropologically
00:22:08.380 study these online
00:22:10.860 entities and so these I
00:22:13.500 think we call them like
00:22:14.040 the evanescence or
00:22:14.840 something like that and
00:22:16.180 so it was like
00:22:18.040 a research journal of
00:22:18.840 like okay so I think I
00:22:19.660 spotted one here here's
00:22:20.700 what it's doing and
00:22:21.480 here's why it's doing
00:22:22.580 this so these entities
00:22:24.580 would have basically
00:22:26.220 existed completely
00:22:28.060 outside our reality and
00:22:30.180 then when we created
00:22:31.520 online reality and when
00:22:32.840 people began to build
00:22:33.920 fame and get a lot of
00:22:35.760 like emotional
00:22:36.460 transference to them from
00:22:38.840 other people they began
00:22:40.740 to be able to see these
00:22:43.560 people, or see them
00:22:43.560 within their reality
00:22:44.500 because that's the way
00:22:45.320 that their reality works
00:22:46.480 their reality things
00:22:48.540 exist more the more
00:22:50.540 other things are focused
00:22:52.040 on them so you could
00:22:53.500 think of it as a reality
00:22:54.460 where you have tons of
00:22:55.240 these little like almost
00:22:56.960 consciousnesses but like
00:22:58.280 very very weak
00:22:59.120 consciousnesses and these
00:23:00.640 consciousnesses like
00:23:01.460 evolutionarily would gain
00:23:02.940 power when they could get
00:23:04.020 other consciousnesses to
00:23:05.180 focus on them and and
00:23:07.080 they would become more
00:23:08.540 intelligent and more
00:23:09.560 sophisticated and then
00:23:10.580 they would use that to
00:23:11.660 begin to you know gain
00:23:13.100 more power and you would
00:23:13.920 have like evolution within
00:23:15.100 this world of like floating
00:23:16.280 consciousnesses but when
00:23:17.780 our world began to act
00:23:19.260 more like that it began to
00:23:20.720 imprint on their world to
00:23:22.280 some extent where they
00:23:22.900 could begin to see these
00:23:23.940 online celebrities and
00:23:24.900 what they were doing and
00:23:26.020 then they begin to find
00:23:26.940 ways to inject themselves
00:23:28.220 within the online sphere
00:23:29.620 but their goal within the
00:23:32.260 online sphere is internet
00:23:34.540 popularity but in a way
00:23:37.000 where they can't be found
00:23:38.180 out as not real people and
00:23:40.200 so the the they would do
00:23:42.660 things like arbitrarily
00:23:43.920 create emotional pain and
00:23:45.560 stuff like that like
00:23:46.180 trolling and stuff like
00:23:47.320 that so you could say a lot
00:23:48.040 of trolls like why would
00:23:48.800 somebody really do that all
00:23:49.760 day well it could be one of
00:23:50.780 these little like online
00:23:51.700 gremlin sort of things or
00:23:53.700 you know pretend to be oh
00:23:55.660 and a really cool thing is
00:23:57.820 in the world if a person
00:23:59.640 became famous enough online
00:24:03.640 in our world, the
00:24:03.640 imprint that they were
00:24:04.780 leaving in this other world
00:24:06.300 would become a separate
00:24:07.500 independent entity from
00:24:08.900 themselves that at first
00:24:10.880 would be very aligned with
00:24:12.180 them but might develop
00:24:13.460 misaligned incentives and
00:24:15.380 try to you know kill them or
00:24:17.300 take over their identity
00:24:18.820 within the online sphere so
00:24:21.120 there's also this story of
00:24:23.420 like danger from becoming
00:24:25.640 too noticed online which I
00:24:27.400 think a lot of people sort
00:24:28.580 of feel in the background and
00:24:30.400 would feel very like if the
00:24:32.640 iteration of myself which is
00:24:34.320 fake which is this online
00:24:35.460 attention whore becomes larger
00:24:37.000 than my real iteration that it
00:24:39.500 can sort of take me over like
00:24:40.980 it has this element of truth to
00:24:42.480 it and so I really liked a lot
00:24:44.640 of the stories you could tell
00:24:45.900 with this world and the
00:24:46.940 conflicts you could have with
00:24:48.040 this world but obviously we
00:24:49.780 decided we did not have time
00:24:50.940 to write a book that was just
00:24:52.220 fiction knowing how hard it is
00:24:54.180 to even just write our
00:24:55.100 non-fiction books you know
00:24:56.120 we've done five of those
00:24:57.140 already but yeah what are your
00:24:58.800 thoughts Simone?
00:25:00.320 I think we call it the
00:25:01.320 ephemera I liked it but yeah I
00:25:02.840 mean I think it may still be
00:25:04.420 something we make up for our
00:25:07.100 kids I mean I think it's really
00:25:08.880 fun when parents just like have
00:25:10.340 persistent lies that they tell
00:25:11.360 their kids and their kids
00:25:12.520 don't realize it because I want
00:25:13.720 I want aside from just
00:25:15.780 Christmas our kids to know
00:25:17.560 that the world just lies to
00:25:18.940 them sometimes but even then
00:25:21.980 like it still can impart some
00:25:23.260 helpful like suspicions and
00:25:26.240 intuitions even after they
00:25:27.460 realize that it's a complete
00:25:28.400 lie so fun stuff
00:25:29.680 well I also think as a kid you
00:25:31.500 know if you tell a kid like we
00:25:32.880 were told as kids oh people
00:25:34.340 online are like evil rapists who
00:25:35.920 are going to hurt you they're
00:25:37.600 like well then I just won't meet
00:25:38.440 with them in person or whatever
00:25:39.400 right you know it undersells the
00:25:41.740 danger of people online but if
00:25:43.700 they think people online are
00:25:45.080 like ethereal gremlins trying
00:25:46.920 to steal their attention for
00:25:48.520 their own benefit that might
00:25:50.400 cause them to in a way be more
00:25:52.180 wary of people online than even
00:25:54.620 the true stories like a lot of
00:25:55.940 these stories about forest
00:25:57.000 creatures were originally told to
00:25:59.780 keep kids from wandering off into
00:26:01.080 the woods and getting eaten by
00:26:02.220 wolves yeah exactly because
00:26:04.000 telling them about monsters and
00:26:05.460 witches scared them more than
00:26:07.540 telling them about wolves and
00:26:09.060 bandits and so I wonder
00:26:11.720 if you could also maybe even
00:26:13.680 motivate them more and you
00:26:14.720 could build stories about like
00:26:15.980 only fans or something like
00:26:17.420 that you know anyway it'd be
00:26:19.860 interesting yeah no I well I
00:26:22.500 generally like the idea of
00:26:23.880 creating new lore I mean it's
00:26:26.300 sort of already exists on my
00:26:27.460 line but it's a little bit too
00:26:28.700 literal so fun idea we didn't
00:26:31.300 unfortunately talk about this in
00:26:32.940 The Pragmatist's Guide to Crafting
00:26:33.960 religion we didn't talk about
00:26:35.060 like the different ways people
00:26:37.960 could create new lore like we
00:26:39.140 talked about how like for
00:26:40.040 example in like the Jewish
00:26:42.280 tradition there's a lot of
00:26:43.340 holidays just dedicated to lore
00:26:45.240 to like kind of explain who we
00:26:46.760 are and what we're all about but
00:26:48.660 like that isn't... I mean,
00:26:51.360 there are many... oh yeah, I forgot
00:26:53.380 about the other fictional
00:26:54.360 universe we really wanted to
00:26:55.460 create, so, we could do
00:26:57.540 a video on this there's a space
00:26:58.720 one that I was really interested
00:27:00.400 in that was like a space saga that
00:27:01.780 I could talk about but then the
00:27:03.040 one about Yellowstone which is a
00:27:05.020 post-Yellowstone-eruption world,
00:27:06.800 which is one you came up
00:27:08.780 with on your own, before you even...
00:27:10.180 Yeah, Yellowstone itself, because
00:27:12.060 it's this constant volcano it
00:27:13.520 turns out that the thermal energy
00:27:14.660 there is very useful for generating power.
00:27:16.920 Like, battery technology gets
00:27:18.760 better but power generation
00:27:20.080 technology does not get better, so
00:27:21.980 that in this world, even though
00:27:24.220 it's like a sea of lava, you
00:27:25.840 would have large corporate or
00:27:28.480 religious controlled cities within
00:27:30.300 it that were constantly generating
00:27:32.560 power like that was the industry of
00:27:34.060 the region, but around
00:27:37.040 that you know the sailing on the
00:27:39.280 seas of lava and stuff like that
00:27:40.700 you would have a lot of reasons to
00:27:42.940 have sort of bandit groups and stuff
00:27:44.120 like that and the focus of this
00:27:45.420 world was actually on religious
00:27:47.320 institutions so it was let's create
00:27:49.640 a world in which various religions
00:27:52.480 are much closer to the truth than
00:27:54.380 they are in our reality and you know
00:27:56.640 you actually have a god and angels
00:27:59.880 and everything like that and they
00:28:02.000 actually begin to interact with
00:28:03.760 humanity a lot more directly and
00:28:06.360 once that happens once they can
00:28:08.280 interact with humanity in a way
00:28:09.720 where like they're interacting with
00:28:11.380 the laws of physics well then they
00:28:13.560 can be defined and learned about by
00:28:16.980 humans and well a portion of humanity
00:28:19.080 would, you know, subject themselves
00:28:21.520 to them so this actually takes place
00:28:22.740 during the, what's the word, the
00:28:24.640 rapture, so this world exists but
00:28:27.620 also humanity is raptured so that's
00:28:29.060 that's why we would know about like
00:28:30.260 angels and everything like that but
00:28:31.880 a portion of humanity attempts to
00:28:34.020 learn about them like through
00:28:35.140 scientific means in the same way you
00:28:36.920 could say angels did in the Bible and
00:28:38.380 overthrow the system you know utilize
00:28:40.420 the system for itself utilize demons
00:28:42.800 as an energy source utilize angels as
00:28:45.540 an energy source do all the types of
00:28:47.760 horrible things that humans do and
00:28:50.320 become an actual meaningful powerful
00:28:54.280 faction in this great game of the
00:28:56.340 universe while the core antagonists of
00:28:59.180 the series aren't even these humans
00:29:00.460 it's the Buddhist faction, which is
00:29:02.380 trying to end the cycle like collapse
00:29:04.180 all reality. But anyway, it'd be very
00:29:06.440 fun. I thought it'd be a very fun series,
00:29:08.320 but that's another one that's not being
00:29:09.820 made because I was actually making it
00:29:11.380 as a video game. I could go into all the
00:29:13.340 plot lines in it, I like sketched it all
00:29:15.140 out. Um, yeah, but that's enough, that's
00:29:17.640 enough. Okay, well, I love you, Simone, and
00:29:19.920 I appreciate you dealing with all of my
00:29:22.340 little fictional universes in my head
00:29:24.240 that I love to play in and have fun in and
00:29:27.020 imagine. When I was younger, I spent so much
00:29:28.940 time just interacting and writing the
00:29:31.300 storylines as I was walking between
00:29:33.080 places for these worlds. It is
00:29:35.380 magnificent and I love it, and I hope
00:29:37.360 that our kids do similar things and share
00:29:39.020 their stories with us. I love you too.
00:29:41.420 All right, I don't think there's any
00:29:46.060 thought-out meat for you.