Wild Speculation: How Will Life Change in 20 Years?
Episode Stats
Words per Minute
191.7
Summary
In this episode of the podcast, I sit down with my good friend and co-founder. We talk about the future of AI, how it will change the way we live and work, and why the next great robotics company may be built on toy manufacturing.
Transcript
00:00:00.000
If you want to found the company that makes you a butt ton of money within our generation,
00:00:05.720
it is simple, large-scale, replicable technology that allows AI to interact with physical reality as we understand it.
00:00:15.600
So what a lot of people are doing in this space, which is why they're failing, is they're basing this on old models of robotics.
00:00:23.000
These are the robotics that we were using in factories.
00:00:25.520
I could imagine early founders co-opting toys, like literally just co-opting toys because they are-
00:00:31.760
No, that's what I think it will be based on: toy manufacturing.
00:00:35.740
These will be the things that are watching our kids, that are making our meals at home and stuff like that, right?
00:00:41.780
And I think that right now when people are looking at doing stuff like this,
00:00:45.480
they are basing them off of these basically hard-coded models in terms of how they walk, how they interact with things.
00:00:51.560
Instead of taking advantage of the leaps that AI has given us in terms of opportunity.
00:01:15.640
I'm not one of those, but I do know a number of memes.
00:01:20.460
If I were transported into the past, I would not be able to speed up the pace of invention,
00:01:37.820
Oh, and I was just about to make fun of you for being on Facebook and therefore being old,
00:01:41.520
but then I recently looked up the demographics of Facebook and the majority of users are below 36 years old,
00:01:54.260
I guess. I really cannot understand how that could be the case,
00:02:00.820
But the internet is full of lies, so what can I say?
00:02:03.600
So speaking of which, actually, we are going to do some just wild speculation
00:02:07.460
as to the future of humanity, 5, 10, 50, 100 years, et cetera.
00:02:12.060
And I do think that we are going to have, more than ever, a crisis of reality online
00:02:16.540
as more and more AI comes online, more and more AI content is created.
00:02:25.400
Okay. As much as recent mishaps with Google's Gemini, for example,
00:02:30.200
show that trying to throttle or otherwise control AI to make it more politically correct
00:02:35.200
or to have it feed into ideas of a more aspirational world can backfire,
00:02:45.000
I mean, ultimately, the urban monoculture is the urban monoculture.
00:02:49.420
It's going to create an internet that has a crisis of reality,
00:02:52.220
meaning that our friends who are creating banks of pre-2020 Wikipedia articles
00:02:58.120
are actually spot on in doing this because I just can't know what will be true
00:03:07.600
You know, one of the things I've heard is that, you know, in 20 years or so,
00:03:10.680
a lot of people, like right now, Gemini, we look at it and we laugh.
00:03:13.600
Ha ha ha, thinks that Washington was a black guy.
00:03:16.660
And that in 20 years, there might actually be genuine crises about this.
00:03:21.040
Yeah, I think people are going to be like, wait, no, but he was black.
00:03:28.360
It's told me that he was black since kindergarten.
00:03:36.080
Actually, the idea of not having historic photos of something might become weird to people.
00:03:40.180
People might sort of blur the lines of when we started getting historic photos
00:03:42.860
because they can so easily be generated from AI.
00:03:46.940
So I agree that this is one thing that is definitely going to change: a crisis of information,
00:03:52.400
where a lot of information that starts being generated from now on is going to be tainted
00:04:00.320
Because when you train AI on training data that comes from other AI, at least the models
00:04:04.140
we've had now, it creates problems and it generally degrades over time.
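A minimal sketch of that degradation, assuming the simplest possible stand-in for a model (fit a Gaussian, then sample from the fit; purely illustrative, not anything from the episode): each generation trains only on the previous generation's synthetic output, and the fit drifts away from the original data.

```python
import numpy as np

# Toy sketch of recursive training on synthetic data (illustrative only):
# generation 0 fits "real" data; every later generation fits only samples
# drawn from the previous generation's fitted model, so estimation noise
# compounds and the learned distribution drifts.
rng = np.random.default_rng(42)
data = rng.normal(loc=0.0, scale=1.0, size=500)  # generation 0: real data

for gen in range(12):
    mu, sigma = data.mean(), data.std()
    print(f"generation {gen:2d}: mean = {mu:+.3f}, std = {sigma:.3f}")
    # The next generation never sees real data, only the model's output.
    data = rng.normal(loc=mu, scale=sigma, size=500)
```

With real data out of the loop, the sampling error of each fit compounds, so the printed mean and standard deviation wander away from the original 0 and 1 over the generations.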
00:04:09.400
And so it's almost like how people who are building Geiger counters and stuff
00:04:15.100
like that need to use low-background metal, like steel salvaged from sunken ships.
00:04:23.020
Because that metal has never been above the surface of the water since we started testing nuclear weapons.
00:04:28.620
Because all metal produced above the water since that period has this level of radioactive contamination,
00:04:33.280
which makes it not useful for certain activities.
00:04:35.360
And so what you're basically saying is it makes a lot of sense to start sort of banking
00:04:41.900
And this even makes sense as a company, like trying to bank as much of the pre-AI internet
00:04:52.100
Even beyond what you see with something like the Internet Archive, like really good banking,
00:05:00.360
Which might have some level of value. Ironically, probably the primary value
00:05:06.360
is to people who are training AIs and stuff like that.
00:05:08.740
But then I think related to this issue is we're also going to see what we've referred
00:05:12.220
to in other podcasts and all over the place as techno-feudalism, where you're going to get
00:05:16.860
a sort of a massive acceleration and expansion of what we're beginning to see on Substack.
00:05:22.300
So as people are now having a crisis of quote unquote fake news and not being able to trust
00:05:27.020
mainstream news media as much as they used to before, what we're seeing is a lot of people,
00:05:31.840
especially including like very smart people, but all sorts of people of pretty much every
00:05:36.760
walk of life and income starting to turn to or continuing to turn to specific people for
00:05:45.700
And they aren't necessarily as engaged with the news as their primary source of information
00:05:53.840
So I think we're going to see that a little bit more accelerated.
00:05:57.580
Like we might actually just see people break off into communities around certain realities
00:06:02.320
that are built around certain people or certain collections of people.
00:06:07.120
Yeah, this is really interesting that you mentioned this because, you know, now that we live in
00:06:11.360
a world where you can build communities around any group that you share a similar culture
00:06:18.400
or ideology to, it's much easier to create these techno-fiefdoms.
00:06:24.420
Well, imagine like the YouTuber houses that sprouted up, you know, when YouTubers started
00:06:30.980
Imagine that, except it also includes an entire community whose members either
00:06:35.780
come together a certain number of times each year or actually live together.
00:06:41.100
I think there's a lot less value in actual in-person communities than people pretend, except when it comes
00:06:48.320
And I think that we have repeatedly seen these communities fail and fail to stay stable.
00:06:54.260
Everybody always says that they want community.
00:06:57.620
I'm not saying living together, but I am saying seeing each other.
00:07:00.200
And there's a big reason in 20 years why this is going to be necessary because...
00:07:06.340
You won't actually know if something that you're following online is real or not, even if they
00:07:11.620
have visible pores and they look kind of weird, which I think is going to be one of the first
00:07:16.600
like early giveaways that someone's still human.
00:07:19.480
But also that like, yeah, I mean, even just seeing them in person once every now and then
00:07:24.540
and having them verify you, because I think a lot of people are also not going to be able
00:07:28.720
to get sponsors or other forms of payment unless they know it's like a real natural,
00:07:37.060
And another thing I've seen on this, really, you know, is something I didn't expect when
00:07:42.380
But if you're talking about the people who are making like changes in the world and like
00:07:46.280
a lot of people are, I think, a little surprised by the access we have to other influencers,
00:07:51.960
like huge influencer-type people or ultra-wealthy people, et cetera.
00:07:56.800
They're like, how do you... You know, they look at our follower count and stuff like
00:08:02.000
And it's because the world right now in terms of communities that interact with each
00:08:08.380
other is sort of naturally sorting based on a few metrics, one of which is intelligence.
00:08:13.060
And like, I don't mean to be arrogant, but I think pretty obviously we're very smart,
00:08:20.740
I don't want to cause any spiciness there, but I'm very smart for a human.
00:08:33.260
But you know, what it has allowed us to do, because of this, is that we've
00:08:37.780
been sort of found by top thinkers within various fields, and, you
00:08:42.100
know, somebody will come to us and they'll be like, oh, you should talk to the guy who
00:08:48.500
And I'm like, I have a weekly call with him. Or somebody recently was like, oh, you
00:08:52.780
should... have you ever heard of this Samo Burja guy?
00:08:56.400
And I'm like, yeah, I'm meeting with him in a couple of weeks and Scott Alexander.
00:09:00.400
And, you know, I would say the number of people who are both smart, but also among the 1% on
00:09:08.740
the internet who actually contribute content is extremely, extremely small.
00:09:12.100
There are many, many, many more smart people who just don't contribute.
00:09:16.100
And that is also a meaningful factor here. But in terms of my predictions, the
00:09:20.460
point I'm making here is actually pretty important,
00:09:25.620
which is that you have sort of fiefdoms forming within reality.
00:09:32.060
One fairly small fiefdom contains most of the world's scientifically, economically, and
00:09:44.340
You say that, but I mean, I also think that there's going to be like, you know, the CCP
00:09:47.900
has its own little fiefdom of like incredibly influential people.
00:09:51.960
And in Europe, there are separate little fiefdoms of different, you know, cultural.
00:09:59.500
People I know who are like memetically interesting coming out of Europe, they are largely connected
00:10:04.940
to our fiefdom and they largely commune with our fiefdom.
00:10:08.780
Like if they're working on like genuinely interesting stuff, that's not just completely
00:10:14.940
If you're talking about the CCP, nothing interesting is coming out of the CCP.
00:10:19.020
This would be one of my predictions that we haven't really gotten to, but I expect the
00:10:25.100
Like people are like, when do you think things begin to start changing in a major way?
00:10:30.960
It's going to start changing in a major way when China collapses.
00:10:39.620
And you think that's going to happen within 20 years?
00:10:44.340
No, I have a bet with the founders of Level Health.
00:10:50.980
And this, I made this bet like a half a year ago or something.
00:10:53.900
The bet was that within 10 years of making it, China's economy on the books, like the economy
00:10:59.320
that they were telling people they had in terms of the size of the economy, would
00:11:02.520
be recognized as smaller than whatever they said it was.
00:11:05.160
I'm not talking about the absolute numbers; the bet was within 10 years.
00:11:08.020
Not just what they were reporting, but that their economy would not only experience
00:11:11.320
no growth from that point, but that it would be generally recognized as smaller.
00:11:15.440
And everyone at the party that I was at thought that this was just an insane bet to make,
00:11:22.280
like that I was really putting my reputation on the line.
00:11:28.380
Because I had argued at the party that I expect China to have a systems collapse and that anyone
00:11:32.580
who's not expecting that is just not familiar with their current economic circumstances.
00:11:36.160
If you want to learn more about this, you can look at our video on the future of East Asia.
00:11:39.660
But I think that that begins to precipitate what happens after this.
00:11:43.620
But continue with what you were saying, Simone, before I interrupted you.
00:11:46.500
That despite the fact that we are going to see fiefdoms that are built around a crisis of
00:11:51.980
reality, a crisis of trustworthiness, but also importantly, a crisis of needing to know that
00:11:57.360
you're a carbon-based life form because I can monetize you and or like feel a different
00:12:03.960
I think people are going to be super bullish on AI and that people will have primarily,
00:12:10.100
in many cases, AI friends and not biological friends.
00:12:21.860
And then, of course, we're going to have even more of a crisis of fertility because
00:12:27.760
Yeah, no, I've heard from some people who are like AI is like a fad or it's oversold at
00:12:34.500
The people who tell you this are stupid people.
00:12:39.960
The people who tell you this are stupid, stupid people.
00:12:43.720
They are much more stupid than the people who said Web3 and crypto were the fad.
00:12:48.500
Like Web3 and crypto, like the way it changed humanity was genuinely fairly limited.
00:12:53.160
I remember one guy who was like, oh, you can't really do anything with AI these days.
00:12:57.140
He's like, everyone who's using AI, it's mostly as like a gimmick or like a fun thing.
00:13:02.760
I'm like, no, you're just describing yourself because you're an idiot who doesn't understand
00:13:09.560
I use AI in my work and genuinely, I use AI almost every day and it's probably sped up
00:13:16.980
Multiple times a day on multiple different things.
00:13:20.000
Yeah, I use it in every single video that you are watching.
00:13:23.300
All of the tags come from AI, all the descriptions come from AI, a lot of the video titles come
00:13:36.500
It's remarkable that somebody could be this stupid and pigheaded to not realize how much
00:13:41.920
actually productive people are using AI in this economy.
00:13:46.020
But I think a lot of people assume that humans are going to be carbon fascists and are going
00:13:54.220
And you even see this in sci-fi, although I understand why, because it's harder to depict
00:13:57.600
human-robot relations like in a very compelling way.
00:14:00.760
Although more movies are doing it now, I should say. If anyone has doubts that people
00:14:06.580
are going to fall for AI girlfriends and boyfriends and AI children and AI everything, counselors,
00:14:12.700
teachers, religious leaders, et cetera, imagine a world in which people were incapable of being catfished.
00:14:20.400
And if we lived in a world like that, okay, fine.
00:14:25.140
But when something or someone who is beautiful and kind and attentive to you tells
00:14:31.140
you what you want to hear, you're going to fall for it.
00:14:36.260
And AI is going to be way better at that than any catfisher ever.
00:14:40.020
And I don't care how old you are or how young you are: if your
00:14:45.900
grandchildren are neglecting you, if your children are neglecting you, if no one at
00:14:49.780
work is listening to you, either you're going to seek out or you're
00:14:54.660
going to succumb to AI companions that are just way more comforting and enjoyable to be
00:15:00.560
around and who care about you way more than humans.
00:15:04.420
So there are going to be many, many very deep human-computer relations, which
00:15:11.700
is going to cause even more of a like pinching point in demographic collapse.
00:15:16.340
And this is also something I see with AI: people are like,
00:15:19.700
oh, we're not at AGI yet, or AIs make a mistake, therefore you can't use it.
00:15:28.920
Like, have you never hired normal humans? Normal humans make mistakes too, at
00:15:35.700
rates commensurate with or higher than AI, when you're hiring a normally distributed workforce.
00:15:42.280
And they're like, well, you know, we're far from AGI.
00:15:48.740
They're like, well, you know, when AI is smarter than humans.
00:15:51.960
And I'm like, I feel like a lot of these people who say these things say them because
00:16:01.360
They are at Stanford or they're in Silicon Valley and everyone they know is a fellow
00:16:05.880
rich person, or they are in a dumb community and everyone they know is dumb and they don't
00:16:12.700
But if you are a smart person and you interact with normal people, as we regularly do, because
00:16:18.320
we hire inexpensive people for a lot of the jobs we have, throughout our
00:16:23.280
other work, like political work... AI right now is much smarter than the average
00:16:34.960
We are so far past the question of what happens when AI is smarter than the average human.
00:16:40.260
And in terms of the bottom 25% of the human population, I think almost anyone would agree
00:16:45.340
that AI can do almost any job better than the bottom 25% of the population.
00:16:53.160
Because AI can't easily do that yet, as we don't have the right form factors.
00:16:58.100
But the reason it can't do that yet is because we don't have the right inexpensive form factors
00:17:05.300
So like in 20 years, do you think this is going to be less of an issue?
00:17:08.000
I mean, there are tons of robotics companies and people making huge financial bets as investors
00:17:12.300
on robotics companies, but as much as we've seen huge breakthroughs in image generation,
00:17:16.860
chat, programming, video, you know, I'm not seeing anything with like physical-
00:17:21.720
I think that is going to be the big Apple, Google, whatever, founded within our generation.
00:17:27.500
If you want to found the company that makes you a butt ton of money within our generation,
00:17:33.260
it is simple, large-scale replicable technology that allows AI to interact with physical reality
00:17:43.500
And so what a lot of people are doing in this space, which is why they're failing, is they're basing this on old models of robotics.
00:17:50.920
These are the robotics that we were using in factories.
00:17:53.720
These are the robotics that you see with something like, um, what's that goofy company that keeps
00:17:59.960
Like, yeah, I mean, it's impressive what they're doing.
00:18:07.340
See, they were doing this in this earlier world where you sort of needed to hard code
00:18:11.640
I think the group that wins, what they're going to do is they're going to be one or two guys
00:18:16.440
who basically put together something fairly simple and inexpensive, almost more like a
00:18:27.980
It will probably be on treads or something like that. They're going to put this together
00:18:31.820
using sort of outsourced factory labor somewhere like China or Vietnam or something like that.
00:18:37.200
They're just going to send their designs over and they're going to be like, can you make this?
00:18:40.480
It's going to cost something like $50 to $100 to make each unit.
00:18:48.120
I think, from our perspective, not like the best, you know, simple metal, maybe like
00:18:57.060
So I expect them to be flimsy, like coat hangers or something like that.
00:19:03.040
Not robust, not especially expensive, not custom-designed.
00:19:05.280
And the way that they interact with things will not be governed by the very expensive and
00:19:11.380
expertly crafted designs of Boston Dynamics engineers or something like that.
00:19:15.500
It will all be determined by a transformer model, organically, in terms of how
00:19:23.340
Actually, I could imagine, like, early founders co-opting toys, like literally just co-opting toys.
00:19:29.900
No, that's what I think it will be based on: toy manufacturing.
00:19:33.880
And these will be the things that are watching our kids, that are making our meals at home and stuff like that, right?
00:19:39.920
And I think that right now when people are looking at doing stuff like this, they are
00:19:43.940
basing them off of these basically hard coded models in terms of how they walk, how they
00:19:48.900
interact with things instead of taking advantage of the leaps that AI has given us in terms
00:19:57.520
Some of our toys that are, like, incredibly affordable, like that all-terrain green RC
00:20:03.040
car that we have, can really get around and don't get stuck very easily.
00:20:07.280
And if you combine that with AI and just allow it to move around and give it some additional
00:20:16.380
And that's what people aren't recognizing, right?
00:20:19.000
It's that they're basing the AIs of today off of the AIs of yesterday, where everything needed to be hard-coded.
00:20:25.980
Yeah, or a humanoid or animaloid looking robot.
00:20:33.520
You know, there are going to be drone swarms and stuff like that, which is quite different
00:20:42.000
And I think economically, a lot of people can be like, well, if you think AIs are going
00:20:47.980
to be able to do so much, you know, why are you worried about fertility collapse in terms
00:20:55.080
And the answer here is, you know, I mean, we try not to oversell that it matters that
00:20:59.920
smart people aren't having kids and that there is a genetic component to human intelligence.
00:21:04.440
But in a world of AI, this becomes uniquely elevated as a problem because a lot of humans,
00:21:12.000
once we get this toyish AI thing that we're talking about, are just going to become irrelevant
00:21:18.280
from the perspective of what they can provide to an economy.
00:21:21.620
They will not be able to at all compete with AI.
00:21:24.640
With these individuals, you know, who knows what we're going to do as a society?
00:21:29.280
But it's going to be tough when this happens economically, because we're going to have
00:21:34.500
And we've talked about this, you know, people are like, oh, they'll get UBI or something.
00:21:41.780
They're just going to live lives of incredible poverty and sadness.
00:21:45.940
And so the question is, well, then why are you encouraging fertility at all?
00:21:49.520
Because we need more fertility of the types of humans who have the ambition and self-agency
00:21:56.940
to work with AI and to build the next iterations of our species.
00:22:01.520
Because those are the people that are being memetically selected out of the population most
00:22:06.960
But it's also, given our belief in the capacity of AI, why we don't really need to reach everyone.
00:22:14.180
We only need a small, self-sustaining, incredibly intelligent community to defend ourselves.
00:22:20.580
People are like, oh, you won't be able to defend yourself.
00:22:22.620
And it's like, I know people already in our community who are building
00:22:34.060
Well, I mean, I guess I'm worried for the outside world, but not really because they
00:22:37.900
don't have to worry about anything if they're not trying to kill us.
00:22:40.460
And I think that this is increasingly what we're going to see.
00:22:43.860
You know, as we've said, I think when you begin to have a large scale economic collapse
00:22:47.920
around the world, what it's going to look like is small communities of economically
00:22:51.360
productive individuals that are fortified against the outside world and outside forces
00:22:56.380
that want to use force to extract value from them.
00:23:00.920
And I think that broadly, like we've talked about in another video that we did recently
00:23:06.380
on utility convergence in AI, AI will see utility in these communities, but it
00:23:14.820
Yeah, or I think broadly that AI is going to be nice to humanity.
00:23:21.260
It's just going to be humane toward those who just want pleasure and who
00:23:26.240
will self-extinguish, and it will let them self-extinguish in a state of extreme
00:23:33.520
So great outcome for them, I guess, as far as you're concerned.
00:23:37.400
I think kind of, I actually think the best description I've seen of how AI is going to
00:23:41.740
treat that iteration of humanity is the South Park future episode where they're married
00:23:47.140
to AIs that are constantly offering them ads and everything like that.
00:23:51.320
I think it will give them hedonic bliss in the same way that, like, YouTube when you
00:23:56.180
haven't paid for a premium subscription does, where it gives you like two ads every 15 seconds.
00:24:03.160
It's going to try to extract the maximum value from them possible.
00:24:10.280
Maybe, I'm more picturing the WALL-E universe of rotund soft humans.
00:24:18.380
Yeah, I don't think that's going to happen as much.
00:24:20.340
I think most of them are going to be genetically selected out of the population pretty quickly.
00:24:26.620
I mean, I also think that there are going to be very technophobic subsets of society.
00:24:30.920
You know, the Amish, but more: more groups like that, that just really, really,
00:24:36.460
really opt out because they see where things are going and that those things do not align
00:24:42.580
So as AI becomes more important within the economy, as it begins to do more things, the
00:24:47.380
groups that are opting out in terms of like the world thought space and economic space
00:24:56.020
You know, the world is multiple spheres of power and as humans matter less in terms of
00:25:00.400
the spheres of power, those groups will matter less.
00:25:06.920
And of course, I mean, in 20 years, do you think that movies will still be made
00:25:14.640
No, it's hard for me to imagine that being economically viable, especially because
00:25:19.600
there's really heavy unionization in the entertainment space.
00:25:26.160
Today, people still pay for paintings that were painted by a human.
00:25:37.720
Why are you paying for a painting when you can get a poster?
00:25:41.000
I think that we will still have some movies that are made as a form of art, where the
00:25:48.300
art and the human labor that went into it is a status symbol and is like an affirmed
00:25:57.120
If we're talking about the pop movies that the general public watches, I think most
00:26:03.240
Not totally, but AI will be a major player in their writing, their public testing,
00:26:11.120
And I think that the main form of entertainment will move away from movies,
00:26:14.740
which might be seen as a little, you know, old, toward fully immersive
00:26:19.500
and iteratively created entertainment environments.
00:26:23.840
So these are games where like you put them on and they create the world around you and respond
00:26:36.920
And then keep in mind, these can be done simply.
00:26:39.180
Like you might be thinking of a fully created world.
00:26:41.300
The early ones might be like those early text adventures.
00:26:44.480
Though I do think that those will be less popular than you imagine, because I still think that
00:26:48.480
people really like, sometimes they watch shows and movies that they hate just so that
00:26:53.840
they can talk about them with other people.
00:26:55.800
And if it's super highly customized and different for every single person every time, people
00:27:04.760
Man, I feel like so much of entertainment is just because people read books because
00:27:10.560
People talk about movies that they have watched for a shared cultural context, not because
00:27:15.660
they want that shared cultural context, but because they happen to have watched them
00:27:20.660
To prove you wrong, what I would say is if you look at the age of the internet era, right,
00:27:26.540
it used to be that there were tons and tons and tons of cultural touchpoint shows that anyone
00:27:32.620
This has gotten less and less and less where we get maybe one of those shows every like
00:27:38.020
three years at this point, which just isn't the way it used to be.
00:27:41.580
There used to be about two or three of those ongoing at any point in time.
00:27:56.060
Where you'd watch it because people are talking about it.
00:27:58.060
No, we were just speaking with Brian Chau the other day.
00:28:05.020
Yeah, because you don't care about social cohesion.
00:28:07.740
You're not trying to connect with people online.
00:28:09.700
And insofar as I watch a lot of pop media, I have heard none of the people I listen to
00:28:14.940
And a lot of people also talk about Succession.
00:28:16.820
And before that, they were talking about things like Deadwood.
00:28:19.280
People were talking about Westworld a lot when it was coming out.
00:28:23.300
Game of Thrones was a huge point of connection for people.
00:28:25.280
The point I'm making is Game of Thrones and Westworld penetrated into my circle.
00:28:29.520
But those were almost half a decade ago at this point.
00:28:32.580
These other shows that you're talking about, I've literally heard of nobody talking about
00:28:38.120
No, I've heard of a ton of people talking about them.
00:28:43.920
What you're not realizing is the communities that you are listening to are the communities
00:28:48.000
that are less technophilic and therefore are not as far along this progression that humanity
00:28:55.540
The point I'm making is that as I have seen these shows enter the communities I engage with
00:29:01.180
less and less and less over time, it will just be a decade or so before the communities
00:29:08.640
Well, we're not talking only about how the super smart elites are going to live in 20 years.
00:29:12.860
We're also just talking about humanity. Which, well, it matters if it turns out the super
00:29:17.840
smart elites are the major economic players and, increasingly, the only economic players.
00:29:22.820
And this is what we're seeing from our VC friends, right?
00:29:24.900
They're like, when I invest in companies now, it is one guy with an outsourced team and a
00:29:31.060
Or maybe, like, five people on the US team and then, like, nobody else.
00:29:34.700
This is really different from the Silicon Valley of 30, 40 years ago.
00:29:38.440
Where, like, even a janitor would, you know, get equity and make a ton of money and end up
00:29:44.400
And we were around when that was happening, when Silicon Valley, when San Francisco
00:29:49.060
was able to maintain its wealth, because they knew that the community that was around the
00:29:53.600
founding of a startup would also get unimaginable amounts of wealth.
00:29:56.640
What we are going to see going forward is an unimaginable concentration of how that wealth
00:30:04.740
So that's another big prediction you're making for humanity 20 years from now: huge
00:30:10.740
Like we think the wealth disparities now are insane, but you're saying they're going to
00:30:19.080
I wanted to ask, like, how do you think people are going to live differently?
00:30:21.900
Are we talking like capsule apartments for single people?
00:30:24.540
No, we're talking, the core difference is going to be that wealthy people will be much more fortified
00:30:31.380
and in much more defended settlements than you have today, similar to South Africa today.
00:30:35.900
So either communities of wealthy people that are in these defended environments or, you
00:30:40.780
know, wealthy people spending much less time in cities and much more time
00:30:53.000
Because that's where the social services that many of them are going to depend on are.
00:30:56.080
No, I think cities are going to begin to rot pretty soon.
00:31:01.520
So New York, LA, you know, like, et cetera, like the big, big, big ones are going to become
00:31:22.620
I don't, what are you, where are you going with this?
00:31:26.080
Oh, like, oh, a political, like killing, actively killing people.
00:31:30.920
Nobody cares about it because economically it doesn't matter.
00:31:34.320
We as humans never care about the people who economically and culturally don't matter.
00:31:39.820
The reason why we don't care about this genocide is because the people involved in it
00:31:49.240
They don't matter from the world's perspective.
00:31:51.860
If you look at it, like, the people in Gaza don't affect people's lives.
00:31:55.560
The reason why people are focused on what's going on in Gaza, in Israel right now is because
00:32:02.800
But if you look at the scale of the killings that are happening in the Gaza war right now,
00:32:07.460
they are genuinely trivial compared to what's going on in the Sudan right now.
00:32:15.420
The Darfur situation has basically reignited.
00:32:22.820
When I was in college, though, many, many, many people I knew were intensely focused on
00:32:34.180
No, you think people, like if I were a college student again today, you don't think that
00:32:46.140
Like, by the way, if people want to learn about this situation, I think it's the real
00:32:51.520
I'll put a video on the screen that you can search on YouTube that has a really great description
00:32:56.900
of what's going on, who's putting money into it, who are the different
00:33:07.680
Basically, this dictator didn't want to lose power.
00:33:10.780
And he thought the way he would prevent himself from losing power was to create two separate
00:33:14.960
military groups that were completely separate from each other.
00:33:18.720
So neither one could coup him without the other one coming to defend him.
00:33:21.700
But then he like died or lost control in some way.
00:33:26.280
And now the two military groups are at war with each other.
00:33:33.080
The point here being that we as humanity have never cared about the people who
00:33:38.920
were not economically or socially relevant to our lives.
00:33:42.800
And when you look at the people whining about Gazans, it's really honestly just an excuse
00:33:47.500
for anti-Semitism, which has always been a major leftist thing.
00:33:57.220
They don't give a shit about the people of Gaza.
00:33:59.220
They probably, you know, couldn't have even pointed to it on a map before this.
00:34:10.840
They care about it because they can use it to besmirch the Jewish people who actually
00:34:14.700
do matter in a historic context, broadly speaking, because Israel is one of the few
00:34:19.220
high fertility, high economic productivity places in the world.
00:34:22.620
And so they want to shit on them as the same way they want to shit on any successful group.
00:34:32.720
I said, well, when I talk about, like, what's going on in Darfur and then, like, the
00:34:35.420
Sudan now, you know, it's like you go to a leftist and you're like, did you know that
00:34:39.040
there is a black group being genocided right now?
00:34:48.100
And you're like, it's actually another black group.
00:35:05.280
They care about how they can use it for their political points.
00:35:09.160
They don't actually care about these communities or these people.
00:35:12.560
Do you think polarization politically is going to be worse, better?
00:35:19.840
People wonder why we are so confidently siding with the right.
00:35:23.280
Like, we're like, oh, you're doing things that really make it hard for you.
00:35:26.240
And it's because, one, I know that they ultimately are going to win this.
00:35:29.760
And two, because in the future, you're going to need a side.
00:35:40.200
And I hope I'm not being too spicy in this episode.
00:35:44.940
I hope that our children create a future that is a little brighter than what we're describing, for the most part.
00:35:52.700
And that they do not succumb to AI girlfriends and boyfriends, as disgusting as humans are.
00:36:00.480
You might need to at least have artificial wombs by then.
00:36:10.480
So, you know, they're going to need to find real breeding partners, which means we're going to need to find and convert real breeding partners into our cultural group, or do an arranged marriage for them within our network of families.
00:36:23.460
Like, you are such a romantic person who is so sweet and loving.
00:36:27.720
I think, hopefully, it will be in their DNA, just, like, genetically inherited, to be romantics who seek out a carbon-based partner, I hope.
00:36:48.800
It'll be a third, and they'll have a polycule with all AIs.
00:36:53.980
It'll be a monogamous polycule, because it will be them and their partner and then another one.
00:36:58.640
I haven't thought about our kids being poly, and I'm going to have to, like, put on a mask of just not cringing.
00:37:11.680
If you want to see a good poly story that I found really interesting recently, look up the wacky Portland polycule.
00:37:17.940
Well, it likely wasn't real, but, like, it felt so real to a lot of people that they were like, oh, this must be true, because I just didn't like this.
00:37:33.320
Yeah, it all unfolded in an Am I the Asshole post.
00:37:39.100
So, yeah, we'll leave you with that, ladies and gentlemen, in case you're too depressed about our future.
00:37:52.920
Yeah, actually, the people who watch this are, like, uniquely high agency, so congrats to you guys.
00:37:58.920
We love you, but I love Malcolm more, so I'm going to start your pork chops.
00:38:11.320
You see, this is the great thing about YouTube, looking up how to cook pork chops.
00:38:14.080
It's a lot like pan-seared steak, is the way I'm going to do it, but with a lot of butter in the cast iron skillet, so.