Geopolitics Post AI & Birth Rate Collapse
Episode Stats
Words per Minute
175.13318
Summary
In this episode, we discuss the two biggest changes the world is facing right now, which are the development of AI and Fertility Collapse, and how they impact the future of global geopolitics and economic growth.
Transcript
00:00:00.000
hello i am excited to be here with you today today we are going to be discussing the future
00:00:05.780
of global geopolitics and economics in the face of the two biggest changes the world is facing
00:00:12.820
right now which is the development of ai and fertility collapse so this can be thought of
00:00:18.780
as sort of a fertility collapse update video, and an ai one as well. someone was like, you've been doing
00:00:24.200
so much ai recently and i'm like ai is literally going to change everything about the human
00:00:29.240
condition over our lifetimes anybody who's not thinking about ai as much as they're thinking
00:00:34.340
about rome, that's a very good point actually. yeah, rome happened, you can't
00:00:42.120
change it oh dear no but i mean as you're you know predicting out your career or what education
00:00:50.540
means for you and your kids or really anything what's happening in the space of ai is incredibly
00:00:57.700
important now simone i sent you yesterday some maps and we'll just go over these in turn with
00:01:04.200
our fans because i think that they're very important for getting a grip on how bad things are now and this
00:01:12.400
this first graph is terribly designed the darker blue areas where it gets to like purple and darker blue
00:01:20.720
those are areas of lower fertility the lighter blue areas are areas of higher fertility so the more
00:01:28.120
tan brown orange-ish the higher the fertility the more darker purple the lower the fertility
00:01:35.260
and you see i mean it's kind of obvious because all you have to do in a graph like this to calibrate
00:01:39.880
is look at south korea and then you kind of know what's going on yeah and then the lighter
00:01:44.100
man i didn't know nepal was doing so horribly we'll get to these in a second so the lighter red
00:01:49.980
is for lower fertility but it's higher fertility for the darker red so they're sort
00:01:55.780
of treating white the 2.1 you know replacement tfr as the midpoint and then you move either to red or
00:02:01.900
blue depending on where you are and, what, colorblind? it's purple. what you should immediately
00:02:07.900
see here, or purple, okay, is that the americas are now just completely effed out right like i think a
00:02:16.820
lot of people thought that south america would take longer to get to the place it is right now
00:02:21.260
but we're now at a place where we are depending on how you calculate it the if you do it like per
00:02:29.220
person so if you correct by the population of a country yeah latin america may be below the united
00:02:35.560
states in terms of tfr now yeah she's she's gone guys and if it isn't right now it's going to be soon
00:02:41.780
and this is really big because when we're talking about the future of geopolitics this is true both
00:02:47.140
within the united states and around the world and we'll get to this as we explore these different
00:02:50.540
regional maps because i think that's a better way to do this who's going to own the future
00:02:54.560
it's countries and populations that while being economically and technologically productive
00:03:02.040
can still be high fertility all right a country like somalia which you can see right here i'm pretty
00:03:10.020
sure that's somalia which is super high fertility doesn't effing matter it doesn't matter if somalia
00:03:15.960
has 20 times its current population that's just going to be 20 times where people in desperate
00:03:21.380
suffering it might matter to you know surrounding countries in terms of like refugee crises and stuff
00:03:25.660
like that but it doesn't matter on like the global stage right like they're not going to suddenly become
00:03:30.380
a powerhouse and we imagine if they could choose they would probably rather first improve quality of
00:03:37.220
life before they grow their population, which is going to lower the birth rate. but yeah the place where this can
00:03:42.880
really be seen is in east asia and so here we're going to be talking about the map i have on screen
00:03:49.040
here is east asia more broadly but i'm going to talk about it along with oceania because they're
00:03:53.540
basically a connected economy when people think about fertility rate collapse even though it's a bigger
00:03:58.760
problem in east asia they often think about it in the context of themselves and their neighbors and
00:04:03.860
their trading networks but the biggest disruption that it's going to cause is is in this part of
00:04:08.980
the world and so you've got china with a fertility rate of like 1.2 japan with a fertility rate of 1.3
00:04:16.280
south korea 0.9 and taiwan i've seen all sorts of weird fertility rates for taiwan recently from
00:04:23.460
this one has it at 1.1 but i've seen it down to below south korea at 0.76 yeah what's going on
00:04:29.400
get it straight guys falling very quickly is the general gist of it but the the wider point here
00:04:36.740
is who is declining within this region: everyone who is economically relevant is declining
00:04:43.500
within this region and well i should say not everyone so then the question is okay you're
00:04:49.660
saying there's somebody near this part of the world that isn't in a complete free fall right now
00:04:54.440
who are you talking about well the two countries that are economically relevant in this region and
00:04:59.760
are not in complete free fall are australia with a tfr of like 1.56 and new zealand with fertility rates
00:05:07.480
i've seen of between like 1.6 and 1.85 which is actually very robust i mean for for these days
00:05:14.960
for these days right yeah i mean you still need 2.1 people just to remind you if you if you're new to
00:05:21.440
this podcast and subject if you want your country to maintain a stable population you need to have
00:05:26.420
a total fertility rate of 2.1 or more right and a lot of people can say well you know what about for
00:05:33.200
example india right with its tfr of 2 on average, or slightly below replacement rate
00:05:39.320
and this is again where you you sometimes even need to look sub-regionally to predict the future of
00:05:45.300
a region you know you can't think about these countries as just countries like india as just india
00:05:50.420
you need to see who in india is having lots of children in india it's it's muslim extremists
00:05:58.540
and if india does become a country where, because it's a democracy, muslim extremists end up running
00:06:07.580
the country it's going to almost immediately fracture and completely change its relationship
00:06:13.260
with pakistan because that would change things pretty extremely india's in a very dangerous
00:06:19.240
demographic position right now and it's something that's like very well known on the ground in india
00:06:24.780
it's it's it's sort of a ticking clock in terms of its regional power uh because its its regional power
00:06:30.240
relies on the existing sort of demographic breakup staying largely the same and and note here people
00:06:38.640
can be like oh well wouldn't this be the same thing as like minorities becoming the majority in the
00:06:43.660
united states like suppose like hispanics became the majority in the united states and then they
00:06:47.780
started electing you know a bunch of hispanic presidents or something like that no not not even
00:06:52.540
a little bit like the conflict between the hindus and the muslims in india is very very much
00:07:01.140
hotter than the conflict in america between you know wasps and hispanics there is not a deep
00:07:08.400
like there is you know a little bit of tension occasionally but it's not that deep or bad of
00:07:15.480
tension and what's also interesting about the united states is it's often the regions with the um most
00:07:23.240
history with hispanic populations, i.e. where a lot of hispanic populations are moving, where you have
00:07:29.320
the most positive sort of perceptions of them um and integration of them with locals one of the problems
00:07:35.520
that texas republicans always have is they have a history of being more open to immigration than other
00:07:41.480
republicans which has really hurt them it hurt i forgot one of the guys who's running in the election
00:07:46.140
cycle, i can't remember his name, but he ran in a primary against donald trump a few election cycles ago
00:07:51.520
and it was a big problem for him i'm thinking of rick perry here but george bush was that
00:07:57.360
way too so you you have this problem consistently but it's i think as a problem what it shows is that when
00:08:04.080
these populations live next to each other for a long time they often begin to become quite you know
00:08:09.280
fine with each other that is that is not what we're going to see in india and this also when people
00:08:14.640
frame everything like this in terms of like racism or whatever they're like you are racist to care about
00:08:20.900
this stuff right i'm like you do see that an entire half of the world is going to have the scales
00:08:27.660
dominated in favor of white people people in new zealand and australia and you're just ignoring
00:08:34.960
that because it doesn't fit your narrative now here i need to talk about ai and we're going to get
00:08:40.400
into a number of ai studies after we go around the globe but just to first your people can be like oh
00:08:46.700
well there's still some, you know, economically, what about cambodia, they sort of have a stable
00:08:53.040
population at like 2.2 or vietnam which has a stable population at like 1.9 and don't get us wrong good
00:08:59.660
for them yeah the philippines at 2.7 and it's like well the problem is is this kind of developing economy
00:09:05.760
what type of job are they taking right like what type of job did you have in india most frequently
00:09:11.120
when it was in this period of development it was outsourcing the type of jobs that you go to upwork for
00:09:17.240
the problem is, outsourcing and call centers are the very first things that ai is being applied to
00:09:23.460
at scale you you probably already interacted with ai call bots right that's not awesome for these
00:09:31.900
regions that are hoping to spam it with just lots of people well and combine that with increasing
00:09:38.400
protectionism which people like peter zeihan expect is only going to increase that we're going to see
00:09:44.040
less global trade and exchange so there will be less interest in people even outsourcing
00:09:51.220
non-service professions things like clothing production factory production in general etc so
00:09:58.220
in general the headwinds are not favorable. people are wondering why that would
00:10:03.660
happen it's because the u.s has less motivation to do international deals it has a strong motivation to
00:10:11.560
be protectionist now both of the united states political parties have a protectionist bent like
00:10:15.760
joe biden and kamala harris had a bunch of protectionist policies donald trump had a bunch
00:10:19.560
of protectionist policies and the even more protectionist of the two protectionist parties
00:10:23.880
won, and he's shown himself perfectly willing to implement these protectionist policies like you look at the chip
00:10:28.920
restrictions and everything like that on china that was all under the biden administration that that
00:10:32.740
stuff was started. so the united states, and i think ai might even increase the relevancy of being
00:10:39.840
protectionist going forwards because when you can totally automate a factory like why do we send
00:10:45.520
stuff to china to have it made inexpensively it's because of inexpensive chinese labor. the more automation
00:10:52.660
that's possible the more that can be done in the united states right uh the less of a motivation you have
00:10:58.420
to ship out and in addition if we are going to ship out now it's cheaper like if you're setting up a new
00:11:03.000
factory from scratch to just do it in mexico which is closer to us and kept artificially
00:11:07.960
inexpensive because of the gangs in a really weird economic way um it's a very weird situation like
00:11:13.980
mexico it's like geopolitically like way worse than it should be given its demographics education level
00:11:21.180
uh location next to the united states etc etc by the way did you know mexico has fallen below the u.s
00:11:27.520
in some measures in terms of tfr yeah above the u.s in terms of obesity rates now yeah yeah so mexico's
00:11:35.240
got some poor mexico yeah there's they've got a lot going on right now so next let's go to the
00:11:41.160
middle east right because i think a lot of people they look at the middle east and they're like oh
00:11:44.040
the middle east has great tfrs and i think you can immediately see the economically relevant countries
00:11:50.260
don't. like, look at turkey here, 1.4, right, iran 1.6, right, like those aren't great. no the
00:11:59.480
only place where you see the good ones is like you know tunisia and egypt right no but that's the
00:12:05.660
poverty. we talked about this when we did our middle east birth rates episode and it
00:12:12.680
would dispel basically the rumors of oh well it's muslims and people in the middle east no their
00:12:18.760
fertility rates are not that good and when they are good it's because they haven't developed yet that's
00:12:22.800
it you know it's this classic pattern are they poor okay well that's why their fertility is high okay
00:12:27.880
get over it, stop. and ai, this is the thing, you could be like, well, i mean, at least the
00:12:34.540
rulers in these geographic regions. and one of the things that we've got to keep in mind about the future
00:12:38.180
is the pax romana, the urban monoculture. this is sort of the peace that's being held up by europe
00:12:42.440
right now is going to collapse europe a lot of these countries are like 20 years away from their
00:12:48.200
social security systems collapsing and when europe starts to break apart as a geopolitical force
00:12:54.420
them telling other people to not kill each other or enslave each other over stupid things
00:12:59.260
is not going to be as effective and in the middle east there's a lot of groups that want to kill or
00:13:04.640
enslave their neighbors over silly things and well not necessarily over silly things or even just like
00:13:10.000
economic right you know we see this all over the place where you basically have slavery all across
00:13:15.160
the region already, if not literally slavery in the region. now people can say well what if
00:13:20.240
you know the super wealthy groups that aren't doing good in fertility they just sort of enslave
00:13:26.020
the populations near them that do have this high fertility rate and they can use that to sort of
00:13:30.340
economically cheese their system, in a way that they kind of are already. the problem with this
00:13:36.400
is twofold for this region. one, as soon as we have good ai bots that you can build
00:13:45.140
right they're they're just more efficient than slaves this this is you know if you think about i
00:13:52.020
have an ai scenario i created for rfab.ai, though we haven't implemented the scenario yet,
00:13:57.240
where you you are you know a commander for the terran federation which is a far future earth fascist
00:14:05.260
empire and you've just conquered a planet and you need to justify why the planet's populace shouldn't just
00:14:11.340
be liquidated because you know you could just have ai workers replace them with less resistance and
00:14:17.340
and well less expensively and you can be like well what about the energy cost and it's like oh remember
00:14:22.580
you are in the middle east which means even if you didn't have inexpensive gas for energy you would have
00:14:30.280
endless solar power in some of these regions. well and we're going to end up with much more scalable nuclear
00:14:36.700
energy too which i think is big yeah which means that we may have an opportunity for the most
00:14:42.940
economically productive of these regions to consolidate their power and have even more power
00:14:47.300
but for everyone else they're basically effed and national borders probably won't matter as much
00:14:53.660
once the urban monoculture falls like the idea of national borders is something that's become normalized
00:14:59.560
in a large part because of the pax romana imposed by europe. when countries decide they
00:15:06.800
want to expand their borders going forwards historically that was something people did all
00:15:09.820
the time like now it's something you just don't do that much so going on to europe we've been talking
00:15:15.180
about what a terrible terrible situation europe is in and here if you look at a map you know across
00:15:20.820
europe you're looking at things like in the in the more catholic countries uh you know like
00:15:25.400
italy and spain at 1.2, which as i pointed out means for every 100 italians there's
00:15:31.560
going to be 20 great-grandchildren at this point and this is assuming it doesn't get worse right
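That great-grandchildren figure is just the TFR-to-replacement ratio compounded over three generations. A minimal sketch of the arithmetic, using the 2.1 replacement rate and the 1.2 TFR cited above:

```python
# Each generation, a population at TFR t replaces itself at a ratio of
# t / 2.1, where 2.1 is the replacement-rate TFR mentioned above.
REPLACEMENT_TFR = 2.1

def descendants_per_100(tfr, generations):
    """Descendants per 100 people after some generations at a constant TFR."""
    return 100 * (tfr / REPLACEMENT_TFR) ** generations

# Italy/Spain at a TFR of 1.2: great-grandchildren are three generations out.
print(round(descendants_per_100(1.2, 3)))  # -> 19, roughly the "20 per 100" figure
```

This assumes the TFR stays constant across all three generations, matching the episode's "assuming it doesn't get worse" caveat.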
00:15:35.380
and even if and then you you see you know you've got ones here's 1.2s all over the place but if you're
00:15:42.260
if you're looking at the the better ones and this is another thing where people are like oh you're
00:15:46.020
trying to you're worried about fertility rates because it's your people who are going to suffer
00:15:50.540
and i'm like it's not my people like wherever i look we're pretty fine dominating so i am you know
00:15:58.100
we are of english scottish scandinavian background right you know so if you look at this map and you
00:16:03.580
ask the question what are the places that are better off in fertility rate and still economically
00:16:11.380
productive okay yeah france 1.6, ireland 1.6, england 1.5, scandinavia with norway at
00:16:22.560
1.4, 1.4, germany 1.5, denmark 1.5, switzerland 1.5. it's the places that look like me
00:16:33.060
there are some other places that have slightly higher fertility rates here like the 1.6 in turkey
00:16:38.260
right and and some of these these countries in the balkans that have decent fertility rates but
00:16:42.780
they're not economically particularly relevant which is the the problem here right like we need
00:16:48.280
to be realistic they're not gonna you know turkey isn't going to become an economic powerhouse overnight
00:16:52.900
just because this population is declining slightly slower than these other places it's also going to
00:16:57.460
have to deal with the social security issues as things start collapsing okay yeah yeah yeah even
00:17:02.320
those who are doing okay are still going to face dependency ratio cascades as time goes on
00:17:07.300
a dependency ratio cascade is a phenomenon where, in the united states for example, you have 1.8 people paying
00:17:14.600
into the tax system for every one person who's living off the tax system this number gets worse
00:17:19.800
as more and more people get old because you have more and more people in social security and eventually
00:17:24.300
you reach a point where the majority of the population is relying on the government and then
00:17:29.020
democracies stop working because people never vote themselves less money you know if they're if
00:17:32.820
they're living off the government and so then all the productive taxpayers leave and your country
00:17:37.280
ends up in a terrible situation but if we want to talk about terrible situations look at latin america
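The cascade dynamic described above can be sketched as a toy calculation. The 1.8 starting ratio comes from the discussion; the per-decade decline rate is a purely illustrative assumption, not a demographic projection:

```python
# Toy model of a dependency ratio cascade: start from ~1.8 taxpayers per
# dependent (the figure cited above) and shrink the ratio each decade as
# the population ages. The 10% per decade decline is a made-up illustration.
ratio = 1.8
decline_per_decade = 0.10

for decade in range(1, 7):
    ratio *= 1 - decline_per_decade
    print(f"decade {decade}: {ratio:.2f} taxpayers per dependent")

# Once the ratio falls below 1.0, people drawing from the system outnumber
# those paying into it, which is the tipping point the argument describes.
```

Under these assumptions the ratio crosses below 1.0 around the sixth decade; a faster decline pulls that point sharply closer.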
00:17:42.800
right like not only are they not great off economically but like look at a country like argentina that's
00:17:50.560
1.16, how do you fix that? well i mean they're trying to import some mennonites so
00:17:57.720
that's a start look at chile 1.03 yeah not good brazil 1.47 mexico 1.45
00:18:13.720
venezuela oh we got we got a strong one here you can be like well some of these countries
00:18:18.460
no yeah all you have to do is just create rampant poverty and have blackouts and brownouts and little
00:18:24.940
electricity and food shortages and you know yeah who's okay here venezuela and haiti you think
00:18:30.700
because they're okay with demographics right now they're going to be economic power players in the
00:18:34.400
future everyone with an education in venezuela is leaving right everyone who's economically
00:18:40.580
productive is leaving what they've left already they've left already we've seen this i mean the
00:18:45.100
travel business that we that we managed at the time that we acquired it basically its specialty was
00:18:51.440
working with venezuelan travel agencies that were either specialized in getting venezuelans
00:18:57.240
out of venezuela or literally flying suitcases full of u.s dollars into the country because
00:19:04.640
you know their currency wasn't exactly working anymore by the way simone you know i i often tout the
00:19:11.520
i've been to 50 countries statistic yeah a huge part of that is because you know it's on this map in
00:19:17.240
the island chain that there's a bunch of different numbers listed because there's so many different
00:19:20.240
countries in that chain yeah i took a boat trip through that island scuba trip so that's just
00:19:25.300
one of those like ding ding ding ding ding ding ding ding ding ding ding yeah so that's that's how
00:19:30.300
that explains a lot okay like 10 of those seems like excessive you know it does seem excessive but
00:19:36.620
yeah but i think you can see by looking at this like you you look throughout central america right you
00:19:40.960
usually see you know 1.1s 1.7s in some cases 1.3 you know this is not good
00:19:48.020
this is not a place that's better off, when people are like you're concerned about white people
00:19:52.880
also i'd note in the united states remember i said what matters is the groups that are breeding
00:19:57.980
when they have money right because what's going to happen to a group over time if they're having
00:20:02.260
tons of kids and it's only the ones who don't have money who are having tons of kids is one that
00:20:06.960
population is going to be subject to dysgenic selection which has negative effects but then two
00:20:13.220
even culturally speaking if it's making an anti-education choice if it's making a choice
00:20:20.160
to not be economically productive that leads to that group surviving and proliferating the cultural
00:20:26.260
elements in those groups that enhance that choice are going to proliferate in the future
00:20:29.960
in the u.s this becomes really important because if you're like who's going to dominate the future of
00:20:33.860
the united states: if you are above $50k income in the united states, who has the highest
00:20:41.860
fertility what what demographic can anybody guess it's white people yeah not what people think
00:20:47.640
i'm actually i'll send you on whatsapp the institute for family studies graph that shows this because
00:20:53.680
it's so... when you look at it, it actually looks as though when we get beyond just
00:21:00.080
abject poverty there is actually a slight positive correlation between income and well fertility but
00:21:10.000
only for white people. for all other groups it goes down, which really does
00:21:14.840
it kind of makes what they're arguing for look really bad because they keep arguing like give
00:21:20.780
people more money and they'll have more kids and it's like well that's not really true except for
00:21:24.360
just white people. come on guys, white people do have more kids. you're showing your hand guys this is
00:21:29.720
not a good look for you the thing here is people are like oh what about scary black tfr which isn't even
00:21:36.500
that high by the way and i always point out here but it is very important to note black people in
00:21:41.360
the united states that aren't in like the bottom, i'd say, 15 percent of income for black individuals have
00:21:47.040
the lowest fertility rate of any ethnic group in the united states yeah anybody they are at risk they
00:21:51.860
are at risk they need help yeah and anybody who has wealthy black friends knows this like black people
00:21:56.520
when they get money they have like no kids and there's a number of reasons for this we won't get into
00:22:01.720
it in this episode. it might be worth doing a whole episode on why black people don't have kids. that
00:22:06.280
actually would would be an interesting episode yeah because i think it's similar to people being like
00:22:11.160
oh we'll just use latin america indefinitely to get population numbers up for the united states which
00:22:17.400
is as we've shown super not going to happen the other thing that people constantly say is like
00:22:22.620
well black people and africans are just going to keep having lots of babies so you know just don't
00:22:28.000
worry about it they'll take care of it and it's like well actually assuming you're not in poverty
00:22:32.160
no, they're not. so let's work on that. and so here i want to go over some studies because i have made the
00:22:40.500
assertion many times and i i do deeply believe this that as ai becomes more advanced there will be
00:22:46.460
sections of the population that just can't do anything as well as ai can do it and the question is
00:22:51.640
is, what is their role within the economy and within the geopolitical order in which we live when
00:22:57.860
that happens and the question here is yeah but is that true because if you anecdotally think about
00:23:04.340
it you will have heard of some studies that show that ai actually increases the productivity of
00:23:09.180
less productive workers more and it increases the productivity of more productive workers less
00:23:14.980
so let's go over some of these studies. in a field experiment with nearly 5,200 customer support
00:23:20.580
agents at a fortune 500 software firm, access to a generative ai conversational assistant increased
00:23:27.140
average productivity, issues resolved per hour, by 14 percent. the gains were largest, up 35 percent, for the least
00:23:33.720
skilled and least experienced agents those with under three months tenure while highly skilled agents
00:23:39.560
saw minimal or no improvement.
00:23:46.500
the ai disseminated tacit knowledge from top performers allowing novices to quote unquote move up the
00:23:51.700
experience curve faster with with treated agents at two months tenure matching untreated agents at six
00:23:58.360
months tenure. this also improved customer satisfaction and reduced employee turnover among the newer workers
00:24:03.660
suggesting potential to narrow skill-based inequality. so think about that, a two-month employee is now equal
00:24:09.760
to a six-month employee. that's almost like a prosthetic for dumb people. yes, and it wasn't helping the smart
00:24:19.060
people at all what i think is really happening here and we're going to see this over and over again
00:24:23.840
is midwits because if you're working at a call center even if you are a higher end producer you're probably on the
00:24:29.740
midwit side of the spectrum i mean you'd be running the company if you were on the higher end of the spectrum
00:24:34.320
midwits are over as brian chow once said uh mid is over mid artist mid musicians mid employees mid
00:24:43.000
yeah just the center if you're in the center of the bell curve watch out
00:24:47.940
yikes. next, an online experiment with 453 college-educated professionals on writing tasks,
00:24:55.360
e.g. press releases and reports, showed that chatgpt reduced task time by 40 percent and improved quality
00:25:02.820
of output by 18 percent on average. weaker skilled participants benefited the most, leading to a
00:25:08.180
convergence in performance and reduced inequality between the high and low ability workers participants
00:25:13.340
exposed to chatgpt were twice as likely to adopt it in their real jobs later
00:25:18.700
indicating lasting effect a study involving over 700 boston consulting group consultants on tasks like
00:25:24.360
idea generation and analysis found that gpt-4 improved performance by 40 percent when used within
00:25:30.080
its capabilities. lower skilled consultants saw a 43 percent boost relative to their baseline, compared to 17 percent
00:25:35.980
for higher skilled ones suggesting ai acts as an equalizer imf analytics cites research showing ai
00:25:43.840
helps less experienced workers enhance productivity more quickly with younger workers adopting it faster
00:25:49.120
in advanced economies, ai could complement about half of exposed jobs, boosting output, but this is more
00:25:56.440
pronounced for entry- to high-level roles. now let's go to the other end. the same mit sloan bcg study
00:26:03.640
found that for tasks outside of gpt-4's capabilities, complex problem solving requiring human judgment, ai use
00:26:10.480
decreased performance by 13 to 24 percent as users over-relied on flawed outputs. higher skilled workers were
00:26:17.500
better at applying cognitive effort to integrate the outputs. the point here being, for the type of thing that
00:26:24.940
humans are better than ai ai makes mid worse and dramatically worse when contrasted with experts
00:26:34.180
mm-hmm. so again we're seeing this. but what does this mean economically? it means a slight boosting, and i suspect
00:26:42.100
temporary boosting of the very bottom of the economy and then it's going to be basically washed out when ai can do
00:26:48.320
everything they can do and the reason why ai can't do everything they can do is mostly just problems with
00:26:55.720
the models and the way they're set up right now, because you don't have persistent memory, learning, or
00:26:59.220
an easy way to create them to be more agentic which is a project we're going to work on with our
00:27:04.180
ai company broader analysis suggests ai could widen gaps by displacing low-skilled jobs while
00:27:10.320
complementing high income roles for instance ai may increase demand for skilled labor and knowledge
00:27:15.820
intensive fields, leading to pyramid-type skill structures. autonomous ai agents, e.g. systems that
00:27:21.320
handle routine work independently, tend to benefit knowledgeable individuals more as they can orchestrate
00:27:26.640
and verify ai outputs more effectively. anyway, i'll leave it at that, but yeah, the point
00:27:32.820
here being i think what we're going to see is at first a huge jump to sort of mid-intellect individuals
00:27:42.760
and then as ai sorry the very low skilled individuals like like not at the very bottom like
00:27:48.660
basically tards, but the type of people you would have used to outsource to, and then after that we're going
00:27:56.520
to see them become economically irrelevant as ai can do literally everything they can do especially as
00:28:01.940
ai can do physical tasks more and then the world is essentially going to be dominated by
00:28:08.660
a very narrow class of people who is having kids at a very low rate in most economies right now
00:28:15.320
yeah like maybe what i hear more people saying is that just the stock market in general
00:28:26.140
is going to see a big boom white collar jobs are going to be wiped out so generally like high achieving
00:28:33.220
mba holding you know more highly educated people are going to find themselves high and dry those with
00:28:40.040
money to invest though will be able to make a lot of money during this period those who are in the
00:28:45.620
trades are going to maintain stable jobs for the foreseeable future because it's going to take a long
00:28:51.660
time for us to catch up with robotics and so the really big disruption is going to be
00:28:59.940
just a hollowing out of what many people think of as the middle class or the middle class is just going
00:29:05.620
to change from what historically has been seen as more of a college education holding
00:29:11.380
i don't know like a higher social class kind of group to a high earning but trade degree holding
00:29:21.140
and maybe not seen as like coastal elite culture style person which i think is kind of great and
00:29:31.120
more close to the american ideal yeah that's what i think i think people are just underestimating a few
00:29:41.220
things in a lot of their calculations here like even within the high fertility countries how few of
00:29:46.560
those high fertility people are big economic contributors and you know like the type
00:29:52.660
of person who's going to be inventing the next ai system i mean take a country like israel which i do
00:29:58.160
laud like secular jewish populations conservative religious jewish populations have decent fertility rates
00:30:03.800
but they're still absolutely stomped by ultra-orthodox jews many of whom are not the type of groups that are
00:30:10.340
going to develop cutting-edge ai systems and that you know as our governments take more and more of a
00:30:18.240
burden to care for these populations that's going to hit us more and more going forwards yeah and
00:30:27.320
that's going to change like daily life a lot i mean i think a lot of kids aren't even really planning
00:30:32.340
to enter the economy right now which is such an interesting phenomenon i 100 percent agree and can you
00:30:39.700
blame them i mean i think a lot of people in gen z and or gen alpha have already more or less decided
00:30:47.280
like i guess that's not going to happen to me anymore so we're there this is a reality we occupy now
00:30:55.260
the question is what's going to happen after the shakeout right now the earthquake is happening
00:31:02.460
what happens after the aftershocks have mostly passed and we're in some form of new normal
00:31:09.720
that i don't know but like i said you know trades are good building a public reputation and specialization
00:31:21.100
that is likely going to maintain relevancy in a post-ai age is good if you want to have a job
00:31:28.360
man mechanical engineering that seems like a pretty good place to be because that's one place where i
00:31:34.960
just i know there's going to have to be so much work done in order for us to get to that full
00:31:40.300
on ai age we're just super behind in robotics and you can see but i'd be like i think
00:31:46.700
that the bigger thing that you're missing with all of this is when i'm going over all this and i'm going
00:31:51.300
over the geopolitics of all this okay is going forwards how much which country you happen to be in is going
00:31:57.740
to matter i've lived in an age with very little war like the fact that there is a war in europe at
00:32:03.940
all right now people are acting like is absolutely insane when you know historically wars were very
00:32:09.320
common between regions if you look at the rate of war in the world right now like it's like a super
00:32:13.940
low rate historically speaking i don't think that is going to continue so now are we
00:32:20.480
going to have more quote-unquote traditional wars maybe not maybe not even traditional terrorism
00:32:25.460
because keep in mind older people don't do this stuff as much they're just not as radical as other
00:32:29.480
groups but we've got to ask what does happen once countries begin to become economically unfeasible
00:32:35.220
like what is actually going to happen across latin america and europe when they hit a
00:32:39.760
dependency ratio cascade like i think we don't know or actually i can sort of try to map this
00:32:48.460
out i think that japan and korea are going to figure this out in some way it's going to be hard and it's not
00:32:54.360
going to look nice but they'll figure out some solution not for saving their populations i think
00:33:00.760
that their fertility rates are going to stay pretty bad writing them off yeah i'm just saying they're
00:33:06.480
preventing a complete government collapse due to their social security system not working
00:33:10.860
but i think that then some other countries are going to be like oh well we in europe will be able
00:33:16.580
to do what they did and i'm like no like south koreans are much more disciplined than you i've worked
00:33:20.280
with both of these populations before i lived in south korea i lived in europe something that you
00:33:24.500
implement in south korea isn't necessarily going to work in the uk and so i think that's where
00:33:29.360
we're going to see some really brutal collapses europe and latin america and i actually might even
00:33:38.080
recommend people to seriously think about emigrating from countries that are in a really really bad
00:33:44.740
dependency ratio situation right now so you know maybe we should do a whole episode on this but i
00:33:49.900
feel like many latin american countries not due to some wise form of preparation but perhaps due to a
00:33:59.200
lack of trust and conscientiousness when it comes to filing taxes or things like that or like an
00:34:05.500
expectation of corruption meaning that you've had to make a more trustless society have set up social
00:34:11.060
services in a way that could possibly accommodate demographic collapse and reduce the incidence
00:34:19.380
of dependency ratio cascades for example in the united states right now people paying into social
00:34:25.060
security actively like us as active taxpayers our money is being used by people who are draining the
00:34:32.740
system you know you can log in and see your account but that money ain't
00:34:41.020
sitting there safe whereas in peru where we worked for a while we learned through having to manage all
00:34:48.660
of the account deposits and reporting that citizens have their dedicated bank account in banco nacional de
00:34:59.140
peru that is its own bank with its own branches where employers have to deposit money into your retirement
00:35:07.500
account and into an unemployment account and they're real accounts that you can see and that's your money
00:35:15.060
sitting there so in that sense like whereas we're in the united states going to see social security
00:35:21.780
collapse between 2032 and 2036 probably what's going to happen is they're just going to start
00:35:27.780
devaluing the u.s dollar to a certain extent and you know the purchasing power of a dollar is
00:35:34.380
going to go down as they just kind of make up for it by being like oh don't worry we're just printing
00:35:38.280
more money it's fine whereas in peru at least when it comes to retirement accounts they don't need to
00:35:43.520
like find the money to pay people out if i understand how things are being run correctly so i wonder if
00:35:52.160
maybe based on the way that some societies' social safety nets have been set up it's going to be okay
00:35:58.560
for example like similar thing happens in peru with unemployment the way that unemployment works is that
00:36:03.700
you as an employer have to deposit money into your employees' individual unemployment accounts
00:36:11.620
and we have had employees ask us to fire them so that they can liquidate those accounts and use the
00:36:19.780
money for like other stuff and then like rehire them and we're like we're not doing that
00:36:24.240
we're not going to do that but the money is real whereas what we do in the united states is pay
00:36:30.480
into this like vague state unemployment system and it's not really clear what happens when you lose
00:36:36.780
your job you have to apply for it there's all these other hoops you have to jump through maybe
00:36:40.780
you'll get it maybe you won't it kind of depends on like how you were fired and how long whereas like
00:36:44.940
no it's super clear in peru money gets deposited into that periodically you can log in and see exactly
00:36:51.680
how much money is in that account so like the longer you've worked somewhere the bigger your
00:36:56.080
unemployment account gets and when you lose your job or are fired that money is now
00:37:03.600
yours and you know exactly how much there is and there's no application process there's no
00:37:08.160
you know would you qualify i don't know because it's your freaking money so i do wonder if there are
00:37:13.920
some countries that are counterintuitively perhaps or due to the the nature of their infrastructure
00:37:21.040
not as screwed because they didn't use as much of the ponzi scheme model as other countries what
00:37:28.420
do you think i mean peru has just fantastically designed a lot of parts of its system it's like
00:37:35.280
for example when i pay my local taxes there's a little envelope for putting your credit card
00:37:40.580
information in and they're just like we'll just charge you your taxes and they give you a brochure of
00:37:44.880
everything yeah you can go on autopay it's just like a netflix subscription but they also give
00:37:50.400
you a beautiful booklet that shows you all the nice things that have been used to to improve your
00:37:55.680
local municipality when you pay those taxes which is just so nice i'm like yeah take my money absolutely
00:38:01.200
you're doing something that's great so yeah there's systems that
00:38:10.320
are better and you might be right about this but peru more broadly if we're just going to talk
00:38:13.440
about peru is completely because basically all of peru is lima lima is like a third of the
00:38:19.520
population of peru it's a huge chunk of their economy and lima is i think the largest city in
00:38:25.260
the world in the middle of the desert or one of the largest but it's in a giant desert yeah you
00:38:29.660
really feel like as soon as you step out you're like am i on mars it looks like mars it's this red
00:38:35.340
yeah outside of the city itself but in the city it feels like you're in a jungle oasis of
00:38:48.440
you know really nice stuff along the cliff edge by the ocean but the problem
00:38:48.440
is that the water that the city relies on is going to dry up in something like 20 years not that
00:38:55.440
long it's coming unless they get really good i feel like ai could really help with desalinization
00:38:59.560
technology and then they're fine yeah elon has mentioned that they might be able to do something
00:39:03.300
with desalinization we'll see but yeah i wonder we have to hope we really need nuclear to work we
00:39:10.740
really need desalinization to work a lot of the developing world is propped up by demand from
00:39:17.320
china right like a lot of these countries like like peru a lot of latin american countries a lot
00:39:22.760
of african countries make a lot of their money selling commodities a lot of these commodities are
00:39:26.480
basically building materials which are used to build this giant real estate bubble that happened
00:39:30.420
in china um with china being at a tfr of 1.2 you know if you want to know the
00:39:36.240
future of your country don't just look at your fertility rate look at a breakdown of who's buying your
00:39:40.440
stuff and then look at their fertility rates um because if their fertility rates are bad that's
00:39:45.760
going to hit you really bad as well that's the future of your economy yeah yeah yeah what else is
00:39:54.780
there to say i mean well it's just such a thing when you're thinking about the
00:39:59.120
future of geopolitical control and ai and all that to not get high fertility rates caused by
00:40:05.540
poverty confused with actual high fertility rates as a lot of people do that's fair yeah that's
00:40:11.680
underrated your relevancy anyway i love you simone this is a very interesting conversation and a good
00:40:18.700
sort of catch up for people who haven't heard this stuff in a while and very excited to be giving this
00:40:24.320
speech to some very important people in the near future me too yeah looking forward to our drive
00:40:30.420
time together all right bye i think i figured out why i've been so irritable today and it's that i was
00:40:40.460
deprived of my morning walk with you i'm an addict that's what happens well tomorrow we're going
00:40:48.500
to be in the car to dc and back yeah so that's going to be a good day although we may have calls on the
00:40:56.780
car ride but oh really that's okay maybe we'll have so many hours yeah well i mean we're gonna need
00:41:02.900
to memorize that speech and blah blah blah blah blah yes i appreciate your work on putting together
00:41:10.120
the slide deck make sure you get that for me so i can review it tomorrow morning as i get an episode
00:41:13.560
ready to go yeah after dinner i'm gonna finish it up all right that works for me the comments on the
00:41:21.420
episode today i was so surprised that it did bad the one on ai and like the silicon valley people
00:41:27.140
becoming religious because of ai and i had another one prepped on the same topic because i found a
00:41:32.420
really interesting article on this trend and it did really bad and so i i don't get it i thought i
00:41:37.240
thought our audience would love that i mean you could make it something that we put on
00:41:41.620
patreon for one of our patreon only weekend episodes that's a really good idea yeah that's
00:41:48.640
the type of thing that they would like all right let's do that simone won't let me we created a book
00:41:52.720
out of her personal diaries from the first year we met called into the heart of dorkness which is
00:41:58.080
the best-titled book we've published but she wouldn't let me i was like oh i'll release it on patreon and
00:42:04.420
she's like you absolutely will not our platform is too big now it was going to be our debut book
00:42:10.640
and now i can't even publish it on patreon yeah i mean someone might convince me someday but
00:42:16.160
hey i really trust our friends who are like whatever you do don't publish it no what she said
00:42:26.440
was that it just wasn't as entertaining to read as we thought it was but i think that was the case
00:42:31.960
because she did not have the personal investment in us that somebody who subscribed to our patreon would
00:42:36.940
they would be very interested in what simone's inner thoughts were when she's first meeting malcolm
00:42:42.020
in the first year they're dating when she's promising to break up you know i'll only date you
00:42:46.460
if you break up with me etc like what was she thinking during their breakup this is all stuff i think
00:42:52.280
you know what our podcast fans can weigh in on this and you can tell simone
00:42:57.080
whether or not she needs to publish it to patreon maybe to a special tier
00:43:02.620
and then the comments on the ai religion video were just like not that interesting it was just
00:43:11.300
like oh those dumb atheists taking too long to figure it out some people were sharing specific
00:43:16.240
thoughts on specific elements of religion some people were trying to argue that ai is a false god
00:43:26.400
but you were never saying that ai was a god so i think they weren't really paying attention
00:43:30.380
it was a mix and there were some people who just a lot of people just really enjoyed the
00:43:36.040
subject so i think this is where people are being stupid ai is not a god nor would
00:43:42.680
we argue that it's ever going to be a god but it is an intelligence and it is like us a token
00:43:49.180
predictor and it is the first orthogonal intelligence we have ever encountered which means that it is very
00:43:56.220
useful in attempting to model god because god is a supernatural entity from the perspective of most
00:44:04.840
people certainly an entity that is different from us in some incredibly meaningful way
00:44:11.040
like as different as we are from ai by looking at how ai is different from us we can begin to
00:44:16.400
understand the different ways that thought machines can be different from each other and this is useful
00:44:22.740
when attempting to model out supernatural entities and stuff like that so i just don't think
00:44:28.760
it's something to just throw away and be like hey this is a pointless thing to not engage with
00:44:32.960
anyway i love you simone and i will get started i love you too
00:44:38.120
they are they are yes i buy my new house oh hi this is my new house hey that's my dad wait i knew hey
00:44:54.140
that's our dad because we because see that's our house and you can't hey you break our house
00:45:03.920
what is my name anymore that's a house hey stop that is a house that doesn't like the house looks like
00:45:14.420
that's yeah that's yeah that's what our house looks like and that's
00:45:21.420
look at our house wait a minute that's all right that's in here wait a minute wait a minute wait a minute
00:45:28.420
wait a minute wait a minute wait we can connect that cross tonight through the second room
00:45:35.420
room there was me but that's our daddy in here so that's our house it's not your house our house
00:45:46.420
okay so now i'm neighbors with you i'm neighbors okay yeah neighbors my house is going to your house actually