Based Camp - March 07, 2024


Wild Speculation: How Will Life Change in 20 Years?


Episode Stats

Length

38 minutes

Words per Minute

191.7

Word Count

7,378

Sentence Count

476

Misogynist Sentences

5

Hate Speech Sentences

10


Summary

In this episode, Malcolm and Simone speculate about how life will change over the next 20 years: a crisis of reality online as AI-generated content floods the internet, techno-feudalism and influencer fiefdoms, AI companions and their effect on fertility, robotics built more like cheap toys than factory arms, and an ever-greater concentration of wealth.


Transcript

00:00:00.000 If you want to found the company that makes you a butt ton of money within our generation,
00:00:05.720 it is simple, large-scale, replicable technology that allows AI to interact with physical reality as we understand it.
00:00:15.600 So what a lot of people are doing in this space, which is why they're failing, is they're basing this on old models of robotics.
00:00:23.000 These are the robotics that we were using in factories.
00:00:25.520 I could imagine early founders co-opting toys, like literally just co-opting toys because they are-
00:00:31.760 No, that's what I think it will be based on, is toy manufacturing.
00:00:35.740 These will be the things that are watching our kids, that are making our meals at home and stuff like that, right?
00:00:41.780 And I think that right now when people are looking at doing stuff like this,
00:00:45.480 they are basing them off of these basically hard-coded models in terms of how they walk, how they interact with things.
00:00:51.560 Instead of taking advantage of the leaps that AI has given us in terms of opportunity.
00:00:57.100 Would you like to know more?
00:01:00.540 Are you looking at memes on Facebook?
00:01:02.920 Yeah, you know me.
00:01:06.580 You know me? I love my memes!
00:01:09.180 I love my memes! I'm a meme man.
00:01:13.140 There's men out there who have various skills.
00:01:15.640 I'm not one of those, but I do know a number of memes.
00:01:20.460 If I was transported into the past, I would not be able to speed up the speed of invention,
00:01:27.060 but I'd be able to trade them some sick memes.
00:01:34.000 My brain is a library of memes.
00:01:37.820 Oh, and I was just about to make fun of you for being on Facebook and therefore being old,
00:01:41.520 but then I recently looked up the demographics of Facebook and the majority of users are below 36 years old,
00:01:48.660 which blows my mind.
00:01:50.180 Yeah, huh? What's going on there, huh?
00:01:51.800 No, that's a lie. Mirror world nonsense.
00:01:54.260 I guess. I really cannot understand how that could be the case,
00:01:58.640 but that's what the internet told me.
00:02:00.820 But the internet is full of lies, so what can I say?
00:02:03.600 So speaking of which, actually, we are going to do some just wild speculation
00:02:07.460 as to the future of humanity, 5, 10, 50, 100 years, et cetera.
00:02:12.060 And I do think that we are going to have more than ever a crisis of reality online
00:02:16.540 as more and more AI comes online, more and more AI content is created.
00:02:21.780 And also, I do think that as much as-
00:02:24.300 Let's start with that. Go.
00:02:25.400 Okay. As much as we've had recent mishaps with Google's Gemini, for example,
00:02:30.200 show that trying to throttle or otherwise control AI to make it more politically correct
00:02:35.200 or to have it feed into ideas of a more aspirational world can backfire,
00:02:40.700 I don't think that's going to stop people.
00:02:42.260 I don't think that's going to stop businesses.
00:02:43.700 I don't think that-
00:02:45.000 I mean, ultimately, the urban monoculture is the urban monoculture,
00:02:47.560 and it runs-
00:02:48.320 What's going to happen because of that?
00:02:49.420 It's going to create an internet that has a crisis of reality,
00:02:52.220 meaning that our friends who are creating banks of pre-2020 Wikipedia articles
00:02:58.120 are actually spot on in doing this because I just can't know what will be true
00:03:05.600 and what will not be true on the internet.
00:03:06.840 It's interesting that you point this out.
00:03:07.600 You know, one of the things I've heard is that, you know, in 20 years or so,
00:03:10.680 a lot of people, like right now, Gemini, we look at it and we laugh.
00:03:13.600 Ha ha ha, thinks that Washington was a black guy.
00:03:16.660 And that in 20 years, there might actually be genuine crises about this.
00:03:21.040 Yeah, I think people are going to be like, wait, no, but he was black.
00:03:24.380 I don't think you understand.
00:03:26.440 The AI tells me he's black.
00:03:28.360 It's told me that he was black since kindergarten.
00:03:30.660 I've seen the pictures.
00:03:31.760 Yeah.
00:03:32.480 Yeah, exactly.
00:03:32.880 I've seen the pictures.
00:03:33.780 I've seen all the historic photos, you know?
00:03:36.080 Actually, the idea of not having historic photos of something might become weird to people.
00:03:40.180 People might sort of blur the lines of when we started getting historic photos
00:03:42.860 because they can so easily be generated by AI.
00:03:45.680 Yeah.
00:03:46.000 Okay.
00:03:46.940 So I agree that this is one thing that is definitely going to change: a crisis of information,
00:03:52.400 where a lot of information that starts being generated from now on is going to be tainted
00:03:57.700 by AI and also in terms of training data.
00:04:00.320 Because when you train AI on training data that comes from other AI, at least the models
00:04:04.140 we've had now, it creates problems and it generally degrades over time.
00:04:08.140 Yeah.
00:04:09.400 And so it's almost like, so people who know when they're building Geiger counters and stuff
00:04:15.100 like that, they need to use metal from, I think it's lead or something, from submarines.
00:04:20.200 Oh, right.
00:04:21.400 Or wrecked ships.
00:04:23.020 Because that's never been above the surface of the water since we started testing nuclear
00:04:27.760 bombs.
00:04:28.620 Because all metal above the water since that period has this level of radioactive contamination,
00:04:33.280 which makes it not useful for certain activities.
00:04:35.360 And so what you're basically saying is it makes a lot of sense to start sort of banking
00:04:39.960 internet stuff right now.
00:04:41.900 And this even makes sense as a company, like trying to bank as much of the pre-AI internet
00:04:47.440 stuff as you can right now.
00:04:48.640 Oh, because then you can sell it, sell access.
00:04:50.820 Oh, shit.
00:04:51.720 Yeah.
00:04:52.100 Even beyond what you see with something like the Internet Archive, like really good banking
00:04:58.440 of pure human-created information.
00:05:00.360 Which might have some level of value to people who are, ironically, probably the primary value
00:05:06.360 is people who are training AIs and stuff like that.
00:05:08.740 But then I think related to this issue is we're also going to see what we've referred
00:05:12.220 to in other podcasts and all over the place as techno feudalism, where you're going to get
00:05:16.860 a sort of a massive acceleration and expansion of what we're beginning to see on Substack.
00:05:22.300 So as people are now having a crisis of quote unquote fake news and not being able to trust
00:05:27.020 mainstream news media as much as they used to before, what we're seeing a lot of people,
00:05:31.840 especially including like very smart people, but all sorts of people of pretty much every
00:05:36.760 walk of life and income starting to turn to or continuing to turn to specific people for
00:05:44.220 specific pieces of advice.
00:05:45.700 And they aren't necessarily as engaged with the news for their primary source of information
00:05:50.460 as they are with this fiefdom of information.
00:05:53.840 So I think we're going to see that a little bit more accelerated.
00:05:57.580 Like we might actually just see people break off into communities around certain realities
00:06:02.320 that are built around certain people or certain collections of people.
00:06:06.520 Almost like...
00:06:07.120 Yeah, this is really interesting that you mentioned this because, you know, now that we live in
00:06:11.360 a world where you can build communities around any group that you share a similar culture
00:06:18.400 or ideology to, it's much easier to create these techno-fiefdoms.
00:06:22.120 Sorry, that's what you were saying.
00:06:23.760 I don't want to...
00:06:24.420 Well, imagine like the YouTuber houses that sprouted up, you know, when YouTubers started
00:06:29.940 like getting together.
00:06:30.980 Imagine that, except also it includes an entire community that like also people start to either
00:06:35.780 come to gather together a certain number of times each year or actually live together.
00:06:41.100 I think there's a lot less value in actual in-person communities than people pretend, except when it comes
00:06:47.760 to childbearing.
00:06:48.320 And I think that we have repeatedly seen these communities fail and fail to stay stable.
00:06:54.260 Everybody always says that they want community.
00:06:56.520 What they really mean is they want...
00:06:57.620 I'm not saying living together, but I am saying seeing each other.
00:07:00.200 And there's a big reason in 20 years why this is going to be necessary because...
00:07:04.700 Oh, AI.
00:07:05.820 Uh-huh.
00:07:06.340 You won't actually know if something that you're following online is real or not, even if they
00:07:11.620 have visible pores and they look kind of weird, which I think is going to be one of the first
00:07:16.600 like early giveaways that someone's still human.
00:07:19.480 But also that like, yeah, I mean, even just seeing them in person once every now and then
00:07:24.540 and having them verify you, because I think a lot of people are also not going to be able
00:07:28.720 to get sponsors or other forms of payment unless they know it's like a real natural,
00:07:33.320 actual like meat puppet human.
00:07:35.040 That's really interesting.
00:07:36.020 Yeah.
00:07:36.240 I think you're right about that.
00:07:37.060 And another thing I've seen on this, really, you know, is something I didn't expect when
00:07:41.880 I was younger.
00:07:42.380 But if you're talking about the people who are making, like, changes in the world,
00:07:46.280 a lot of people are, I think, a little surprised by the access we have to other influencers,
00:07:51.960 like huge influencer-type people or ultra-wealthy people, et cetera.
00:07:56.800 They're like, how does a nobody, you know, they look at our follower count and stuff like
00:08:00.340 that, have so much access?
00:08:02.000 Um, and it's because the world right now in terms of communities that interact with each
00:08:08.380 other is sort of naturally sorting based on a few metrics, one of which is intelligence.
00:08:13.060 Um, and like, I don't mean to be arrogant, but I think pretty obviously we're very smart,
00:08:18.440 like you're incredibly smart for a woman.
00:08:20.740 I don't want to cause any spiciness there, but I'm very smart for a human.
00:08:30.820 You're very smart for an exceptional human.
00:08:33.260 But you know, what it has allowed us to do is, is, is because of this, you know, we've
00:08:37.780 been sort of found by sort of top thinkers within various fields, but it allows us, you
00:08:42.100 know, somebody will come to us and they'll be like, Oh, you should talk to the guy who
00:08:46.780 does Whatifalthist, for example.
00:08:48.500 And I'm like, I have a weekly call with him or, or somebody recently was like, Oh, you
00:08:52.780 should, have you ever heard of this Samo Burja guy?
00:08:56.400 And I'm like, yeah, I'm meeting with him in a couple of weeks and Scott Alexander.
00:09:00.400 And, you know, I would say the number of people who are both smart, but also among the 1% on
00:09:08.740 the internet who actually contribute content is extremely, extremely small.
00:09:12.100 There are, there are many, many, many more smart people who just don't contribute.
00:09:16.100 And that's, that is also a meaningful factor here, but I also want to point out in terms
00:09:20.460 of my predictions, what I'm kind of pointing to the point of this is actually pretty important.
00:09:24.940 Okay.
00:09:25.180 Then get to it.
00:09:25.620 Is that what it means is that you are getting, you have sort of fiefdoms forming within reality.
00:09:32.060 One fairly small fiefdom contains most of the world's scientifically, economically, and
00:09:41.760 memetically productive population.
00:09:44.340 You say that, but I mean, I also think that there's going to be like, you know, the CCP
00:09:47.900 has its own little fiefdom of like incredibly influential people.
00:09:51.960 And in Europe, there are separate little fiefdoms of different, you know, cultural.
00:09:58.020 I disagree.
00:09:59.500 People I know who are like memetically interesting coming out of Europe, they are largely connected
00:10:04.940 to our fiefdom and they largely commune with our fiefdom.
00:10:08.780 Like if they're working on like genuinely interesting stuff, that's not just completely
00:10:12.560 subsumed by wokeism.
00:10:13.880 They're in our community.
00:10:14.940 If you're talking about the CCP, nothing interesting is coming out of the CCP.
00:10:19.020 This would be one of my predictions that we haven't really gotten to, but I expect the
00:10:23.340 major change in our society.
00:10:25.100 Like people are like, when do you think things begin to start changing in a major way?
00:10:30.960 It's going to start changing in a major way when China collapses.
00:10:35.320 Okay.
00:10:35.920 So interesting.
00:10:37.660 And the stuff that's happening in the CCP.
00:10:39.620 And you think that's going to happen within 20 years?
00:10:41.380 Because we're talking 20 years.
00:10:42.860 Yes.
00:10:44.340 No, I have a bet with the founders of Level Health.
00:10:48.000 It's a $1 bet, but it's a bet nonetheless.
00:10:50.980 And this, I made this bet like a half a year ago or something.
00:10:53.900 It was that, within 10 years of making the bet, China's economy on the books, like the economy
00:10:59.320 that they were telling people they had in terms of the size of the economy, would
00:11:02.520 be recognized as smaller than whatever they said it was.
00:11:05.160 I'm not saying the absolute numbers was in 10 years.
00:11:08.020 Not that what they were reporting, but like that their economy would not only experience
00:11:11.320 no growth from that point, but that it would be generally recognized as smaller.
00:11:15.440 And everyone at the party that I was at thought that this was just an insane bet to make,
00:11:22.280 like that I was really putting my reputation on the line.
00:11:24.500 But I do not feel I am.
00:11:26.140 I think China will collapse.
00:11:28.380 Because I had argued at the party that I expect China to have a systems collapse and that anyone
00:11:32.580 who's not expecting that is just not familiar with their current economic circumstances.
00:11:36.160 If you want to learn more about this, you can look at our video on the future of East
00:11:39.340 Asia.
00:11:39.660 But I think that that begins to precipitate what happens after this.
00:11:43.620 But continue with what you were saying, Simone, before I interrupted you.
00:11:46.500 That despite the fact that we are going to see fiefdoms that are built around a crisis of
00:11:51.980 reality, a crisis of trustworthiness, but also importantly, a crisis of needing to know that
00:11:57.360 you're a carbon-based life form because I can monetize you and or like feel a different
00:12:02.920 type of relationship with you.
00:12:03.960 I think people are going to be super bullish on AI and that people will have primarily,
00:12:10.100 in many cases, AI friends and not biological friends.
00:12:14.160 They're going to have AI girlfriends.
00:12:15.640 They're going to have AI teachers.
00:12:17.700 And they're just, because it's better.
00:12:19.700 It's going to make them feel better.
00:12:21.860 And then we're going to have, of course, then even more of a crisis of fertility because
00:12:25.420 people are not going to want to meet real-
00:12:27.760 Yeah, no, I've heard from some people who are like AI is like a fad or it's oversold at
00:12:33.300 its current stage.
00:12:34.500 The people who tell you this are stupid people.
00:12:37.500 I mean this like as nice as I can.
00:12:39.960 The people who tell you this are stupid, stupid people.
00:12:43.720 They are much more stupid than the people who said Web3 and crypto were the fad.
00:12:48.500 Like Web3 and crypto, like the way it changed humanity was genuinely fairly limited.
00:12:53.160 I remember I had one guy and he's like, oh, you can't really do anything with AI these
00:12:56.760 days.
00:12:57.140 He's like, everyone who's using AI, it's mostly as like a gimmick or like a fun thing.
00:13:02.760 I'm like, no, you're just describing yourself because you're an idiot who doesn't understand
00:13:08.660 how to use AI.
00:13:09.560 I use AI in my work and genuinely, I use AI almost every day and it's probably improved
00:13:15.200 the speed and quality of our projects.
00:13:16.980 Multiple times a day on multiple different things.
00:13:20.000 Yeah, I use it in every single video that you are watching.
00:13:23.300 All of the tags come from AI, all the descriptions come from AI, a lot of the video titles come
00:13:28.180 from AI.
00:13:29.400 Like what is this?
00:13:30.760 The videos are edited using AI.
00:13:32.880 I use Descript to edit all our videos.
00:13:36.500 It's remarkable that somebody could be this stupid and pigheaded to not realize how much
00:13:41.920 actually productive people are using AI in this economy.
00:13:46.020 But I think a lot of people assume that humans are going to be carbon fascists and are going
00:13:51.360 to continue to care about human relationships.
00:13:54.220 And you even see this in sci-fi, although I understand why, because it's harder to depict
00:13:57.600 human-robot relations like in a very compelling way.
00:14:00.760 But although more movies are doing it now, I should say, if anyone has doubts that people
00:14:06.580 are going to fall for AI girlfriends and boyfriends and AI children and AI everything, counselors,
00:14:12.700 teachers, religious leaders, et cetera, imagine a world in which people were incapable of being
00:14:19.820 catfished.
00:14:20.400 And if we lived in a world like that, okay, fine.
00:14:22.480 But like people are catfished all the time.
00:14:24.360 And why is that?
00:14:25.140 It's because something or someone who is beautiful and kind and attentive to you and who tells
00:14:31.140 you what you want to hear, you're going to fall for it.
00:14:34.820 And this is a really big problem.
00:14:36.260 And AI is going to be way better at that than any catfisher ever.
00:14:40.020 And I don't care how old you are, how young you are, if you're, and they will, if your
00:14:45.900 grandchildren are neglecting you, if your children are neglecting you, if no one at
00:14:49.780 work is listening to you, you're going to find either you're going to seek out or you're
00:14:54.660 going to succumb to AI companions that are just way more comforting and enjoyable to be
00:15:00.560 around and who care about you way more than humans.
00:15:04.420 So there are going to be many, many, many, very, very deep human computer relations, which
00:15:11.700 is going to cause even more of a like pinching point in demographic collapse.
00:15:16.340 And this is also something, you know, that I see with AI, when people are like,
00:15:19.700 oh, we're not at AGI yet, or AIs make a mistake, therefore you can't use it.
00:15:26.120 I'm like, are you an idiot?
00:15:27.660 Like, I know, I'm sorry.
00:15:28.920 Like, do you, have you never hired like normal humans? Normal humans make mistakes too at
00:15:35.700 rates commensurate with or higher than AI when we're hiring like a normally distributed workforce.
00:15:42.280 And they're like, well, you know, we're far from AGI.
00:15:45.520 And I'm like, how the are you defining AGI?
00:15:48.740 They're like, well, you know, when AI is smarter than humans.
00:15:51.960 And I'm like, like, I feel like a lot of these people who say these things say them because
00:15:59.600 they don't interact with normal humans.
00:16:01.360 They are at Stanford or they're in Silicon Valley and everyone they know is a fellow
00:16:05.880 rich person, or they are in a dumb community and everyone they know is dumb and they don't
00:16:10.920 know how to judge intelligence.
00:16:12.700 But if you are a smart person and you interact with normal people, as we regularly do, because
00:16:18.320 we hire inexpensive people for a lot of the jobs we have, people just like throughout our
00:16:23.280 other work, like political work and other work, you see that AI right now is much smarter than the average
00:16:28.800 human at almost any job.
00:16:32.640 Like it's, it's, it's remarkable.
00:16:34.960 We are so far past the question of what happens when AI is smarter than the average human.
00:16:40.260 And in terms of the bottom 25% of the human population, I think almost anyone would agree
00:16:45.340 that AI can do almost any job better than the bottom 25% of the population.
00:16:50.480 That doesn't involve like tactile-ness, right?
00:16:53.160 Because AI can't do that yet easily because we don't have the right form factors.
00:16:58.100 But the reason it can't do that yet is because we don't have the right inexpensive form factors
00:17:02.160 and nothing else.
00:17:04.040 Actually-
00:17:04.300 Yeah, question about that.
00:17:05.300 So like in 20 years, do you think this is going to be less of an issue?
00:17:08.000 I mean, there are tons of robotics companies and people making huge financial bets as investors
00:17:12.300 on robotics companies, but as much as we've seen huge breakthroughs in image generation,
00:17:16.860 chat, programming, video, you know, I'm not seeing anything with like physical-
00:17:21.720 I think that is going to be the, the, the big Apple, Google, whatever founded within our
00:17:26.780 generation.
00:17:27.500 If you want to found the company that makes you a butt ton of money within our generation,
00:17:33.260 it is simple, large-scale replicable technology that allows AI to interact with physical reality
00:17:41.940 as we understand it.
00:17:43.500 And so what a lot of people are doing in this space, which is why they're failing, is they're
00:17:47.860 basing this on old models of robotics.
00:17:50.920 These are the robotics that we were using in factories.
00:17:53.720 These are the robotics that you see with something like, um, what's that goofy company that keeps
00:17:58.500 doing the dog that people kick?
00:17:59.960 Like, uh, yeah, I mean, it's, it's impressive what they're doing.
00:18:04.540 I just don't see it scaling.
00:18:05.920 Right.
00:18:06.320 But this is the problem.
00:18:07.340 See, they were doing this in this earlier world where you sort of needed to hard code
00:18:10.780 everything.
00:18:11.640 I think the group that wins, what they're going to do is they're going to be one or two guys
00:18:16.440 who basically put together something fairly simple and inexpensive, almost more like a
00:18:21.820 toy, like a drone toy.
00:18:23.760 Like the little cheap drones you buy.
00:18:25.200 Yeah.
00:18:25.880 Yeah.
00:18:26.280 But, but it will, it will not be a drone.
00:18:27.980 It will probably be on treads or something like that, and they're going to put this together
00:18:31.820 using sort of outsourced factory labor in somewhere like China or Vietnam or something like that.
00:18:37.200 They're just going to send their designs over and they're going to be like, can you make
00:18:39.960 this?
00:18:40.480 It's going to cost something like $50 to $100 to make each unit.
00:18:44.660 These units are going to be fairly flimsy.
00:18:48.120 I think from our perspective, like not like the best, you know, simple metal, maybe like
00:18:52.980 almost out of coat hangers.
00:18:54.180 You could think of them as being anyway.
00:18:56.660 Yeah.
00:18:57.060 So I expect them to be flimsy, like coat hangers or something like that.
00:19:00.760 Like not, I think what we can imagine.
00:19:03.040 Not robust, especially expensive, custom designed.
00:19:05.280 And the way that they interact with things will not be governed by very expensive and
00:19:11.380 expertly designed Boston Dynamics engineering or something like that.
00:19:15.500 It will all be determined by a transformer model and determined organically in terms of how
00:19:20.340 these things are.
00:19:21.940 So basically the entire.
00:19:23.340 Actually, I could imagine like early founders co-opting toys, like literally just co-opting
00:19:28.980 toys because they are.
00:19:29.900 No, that's what I think it will be based on is toy manufacturing.
00:19:33.880 And these will be the things that are watching our kids that are making our meals at home
00:19:38.500 and stuff like that.
00:19:39.420 Right.
00:19:39.920 And I think that right now when people are looking at doing stuff like this, they are
00:19:43.940 basing them off of these basically hard coded models in terms of how they walk, how they
00:19:48.900 interact with things instead of taking advantage of the leaps that AI has given us in terms
00:19:53.640 of opportunity.
00:19:55.400 Yeah.
00:19:55.700 But also like durability.
00:19:57.520 Some of our toys that are like incredibly affordable, like that all-terrain green RC
00:20:03.040 car that we have can really get around and do not get stuck very easily.
00:20:07.280 And if you combine that with AI and just allowed it to move around and give it some additional
00:20:11.980 functionality, it'd be pretty impressive.
00:20:14.840 Yeah, it could do a great deal.
00:20:16.380 And that's what people aren't recognizing, right?
00:20:19.000 Is they're basing the AIs today off of the AIs of yesterday where everything needed to
00:20:24.500 be sort of hard coded by engineers.
00:20:25.980 Yeah, or a humanoid or animaloid looking robot.
00:20:30.680 Yeah.
00:20:31.160 And that's not the AIs of tomorrow.
00:20:33.520 You know, they're going to be drone swarms and stuff like that, which is quite different
00:20:39.260 from anything we're looking at in terms of AI.
00:20:42.000 And I think economically, a lot of people can be like, well, if you think AIs are going
00:20:47.980 to be able to do so much, you know, why are you worried about fertility collapse in terms
00:20:52.820 of its long-term economic impact?
00:20:55.080 And the answer here is, you know, I mean, we try not to oversell that it matters that
00:20:59.920 smart people aren't having kids and that there is a genetic component to human intelligence.
00:21:04.440 But in a world of AI, this becomes uniquely elevated as a problem because a lot of humans,
00:21:12.000 once we get this toyish AI thing that we're talking about, are just going to become irrelevant
00:21:18.280 from the perspective of what they can provide to an economy.
00:21:21.620 They will not be able to at all compete with AI.
00:21:24.640 With these individuals, you know, who knows what we're going to do as a society?
00:21:28.560 I don't know yet.
00:21:29.280 But it's going to be tough when this happens economically, because we're going to have
00:21:33.540 a huge transition.
00:21:34.500 And we've talked about this, you know, people are like, oh, they'll get UBI or something.
00:21:38.020 I'm like, not really.
00:21:39.800 That's not what we've seen historically.
00:21:41.780 They're just going to live lives of incredible poverty and sadness.
00:21:45.940 And so the question is, well, then why are you encouraging fertility at all?
00:21:49.520 Because we need more fertility of the types of humans who have the ambition and self-agency
00:21:56.940 to work with AI and to build the next iterations of our species.
00:22:01.520 Because those are the people that are being memetically selected out of the population most
00:22:05.400 aggressively.
00:22:06.960 But it's also, given our belief in the capacity of AI, why we don't really need to reach everyone.
00:22:14.180 We only need a small, self-sustaining, incredibly intelligent community to defend ourselves.
00:22:20.580 People are like, oh, you won't be able to defend yourself.
00:22:22.620 And it's like, uh, like I know people who are already in our community who are building
00:22:27.940 like automated AI defense systems.
00:22:29.920 Like I'm really not worried.
00:22:32.460 For them at least.
00:22:34.060 Well, I mean, I guess I'm worried for the outside world, but not really because they
00:22:37.900 don't have to worry about anything if they're not trying to kill us.
00:22:40.460 And I think that this is increasingly what we're going to see.
00:22:43.860 You know, as we've said, I think when you begin to have a large scale economic collapse
00:22:47.920 around the world, what it's going to look like is small communities of economically
00:22:51.360 productive individuals that are fortified against the outside world and outside forces
00:22:56.380 that want to use force to extract value from them.
00:23:00.920 And I think that broadly the AI, like we've talked in another video that we did recently
00:23:06.380 on like utility convergence in AI, that AI will see utility in these communities, but it
00:23:12.760 might not see utility in the rest of humanity.
00:23:14.820 Yeah, or I think broadly that AI is going to be nice to humanity.
00:23:21.260 It's just going to, it's going to be humane toward those who just want pleasure and who
00:23:26.240 will self extinguish and it will give them, they will self extinguish in a state of extreme
00:23:31.200 bliss and hedonic happiness.
00:23:33.520 So great outcome for them, I guess, as far as you're concerned.
00:23:37.400 I think kind of, I actually think the best description I've seen of how AI is going to
00:23:41.740 treat that iteration of humanity is the South Park future episode where they're married
00:23:47.140 to AIs that's constantly offering them ads and everything like that.
00:23:51.320 I think it will give them hedonic bliss in the same way that like YouTube, when you're not
00:23:56.180 paying for a premium subscription, does, where it gives you like two ads every 15 seconds.
00:24:00.920 I think that that's what they're going to get.
00:24:03.160 It's going to try to extract the maximum value from them possible.
00:24:08.420 Yeah, I mean, I hope not.
00:24:10.280 Maybe, I'm more picturing the WALL-E universe of rotund soft humans.
00:24:18.380 Yeah, I don't think that's going to happen as much.
00:24:20.340 I think most of them are going to be genetically selected out of the population pretty quickly.
00:24:25.140 Yeah, maybe, I don't know.
00:24:26.620 I mean, I also think that they're going to be very technophobic subsets of society.
00:24:30.920 You know, the Amish, but more, like more groups like that, that just really, really,
00:24:36.460 really opt out because they see where things are going and that those things do not align
00:24:40.160 with their values.
00:24:40.400 They won't be economically relevant.
00:24:41.940 Yeah, right.
00:24:42.580 So as AI becomes more important within the economy, as it begins to do more things, the
00:24:47.380 groups that are opting out in terms of like the world thought space and economic space
00:24:52.760 become increasingly, increasingly irrelevant.
00:24:55.840 Right.
00:24:56.020 You know, the world is multiple spheres of power and as humans matter less in terms of
00:25:00.400 the spheres of power, those groups will matter less.
00:25:03.280 Hmm.
00:25:03.800 Yeah.
00:25:05.180 Yeah, that's, that's fair.
00:25:06.920 And of course, I mean, in 20 years, can you, do you think that movies will still be made
00:25:11.820 with humans and shows and whatnot?
00:25:14.640 Not, it's hard for me to imagine that being economically viable, viable, especially because
00:25:19.600 there's really heavy unionization in the entertainment space.
00:25:23.020 No, I think you're asking the wrong question.
00:25:26.160 Today, people still pay for paintings that were painted by a human.
00:25:30.540 Right.
00:25:31.040 But because they can own them.
00:25:32.720 Listen, but pictures exist.
00:25:34.800 Okay.
00:25:35.420 Yeah.
00:25:35.740 You can take a picture.
00:25:36.640 You can mass produce it.
00:25:37.720 Why are you paying for a painting when you can get a poster?
00:25:40.100 Right.
00:25:41.000 I think that we will still have some movies that are made as a form of art, where the
00:25:48.300 art and the human labor that went into it is a status symbol and is like an affirmed
00:25:52.700 status symbol.
00:25:53.940 It's kind of like we still have opera today.
00:25:56.320 Yes.
00:25:57.120 If we're talking about the pop movies that the general public listens to, I think most
00:26:00.740 of those will be heavily produced by AI.
00:26:03.240 Not totally, but AI will be a major player in their writing, their, their public testing,
00:26:09.220 et cetera.
00:26:09.940 Yes.
00:26:10.700 Okay.
00:26:11.120 And I think that one of the main forms of entertainment will move away from movies.
00:26:14.740 I think movies might be seen as a little, you know, old. It will be fully immersive
00:26:19.500 and iteratively created entertainment environments.
00:26:23.840 So these are games where like you put them on and they create the world around you and respond
00:26:29.920 to your choices organically using AI.
00:26:34.020 I mean, they might be in a phone.
00:26:35.980 They might be in a computer.
00:26:36.920 And then keep in mind, these can be done simply.
00:26:39.180 Like you might be thinking of a fully created world.
00:26:41.300 The early ones might be like those early text adventures.
00:26:44.280 Yeah.
00:26:44.480 Though I do think that those will be less popular than you imagine, because I still think that
00:26:48.480 people really like, sometimes they watch shows and movies that they hate just so that
00:26:53.840 they can talk about them with other people.
00:26:55.800 And if it's super highly customized and different for every single person at every time, people
00:27:00.560 cannot talk about it the same way.
00:27:02.120 I disagree.
00:27:02.700 I agree.
00:27:03.400 Really?
00:27:04.760 Man, I feel like so much of entertainment is just because people read books because
00:27:07.960 they like the smell.
00:27:09.400 No, they don't.
00:27:10.560 People talk about movies that they have watched for a shared cultural context, not because
00:27:15.660 they want that shared cultural context, but because they happen to have watched them
00:27:18.900 and people are talking about them.
00:27:20.660 To prove you wrong, what I would say is if you look at the age of the internet era, right,
00:27:26.540 it used to be that there were tons and tons and tons of cultural touchpoint shows that anyone
00:27:30.480 could reference and everyone else would know.
00:27:32.620 This has gotten less and less and less where we get maybe one of those shows every like
00:27:38.020 three years at this point, which just isn't the way it used to be.
00:27:41.580 There used to be about two or three of those ongoing at any point in time.
00:27:45.660 And that's just not the world we live in.
00:27:47.420 You have something like Game of Thrones.
00:27:49.560 You have, what do we have today?
00:27:51.200 What would be that show?
00:27:53.720 That a lot of people are talking about?
00:27:55.720 Yeah.
00:27:56.060 Where you'd watch it because people are talking about it.
00:28:01.220 No, we were just speaking with Brian Chau the other day.
00:28:01.220 I mean, like.
00:28:02.360 You said White Lotus?
00:28:03.320 I've literally never even heard of it.
00:28:05.020 Yeah, because you're not, you don't care about social cohesion.
00:28:07.740 You're not trying to connect with people online.
00:28:09.260 I do, Simone.
00:28:09.700 And insofar as I watch a lot of pop media, I have heard none of the people I listen to
00:28:14.360 talking about it.
00:28:14.940 And a lot of people also talk about Succession.
00:28:16.820 And before that, they were talking about things like Deadwood.
00:28:19.280 People were talking about Westworld a lot when it was coming out.
00:28:23.300 Game of Thrones was a huge point of connection for people.
00:28:25.280 The point I'm making is Game of Thrones and Westworld penetrated into my circle.
00:28:29.520 But those were almost half a decade ago at this point.
00:28:32.580 These other shows that you're talking about, I've literally heard of nobody talking about
00:28:36.420 them.
00:28:37.160 I think that increasingly.
00:28:38.120 No, I've heard of a ton of people talking about them.
00:28:40.380 But I'm more socializing.
00:28:41.500 Like you stay in a very small nation.
00:28:43.140 No, no, no, no, no, no.
00:28:43.920 What you're not realizing is the communities that you are listening to are the communities
00:28:48.000 that are less technophilic and therefore are not as far along this progression that humanity
00:28:53.860 is going through.
00:28:54.540 Perhaps.
00:28:55.080 Yeah, perhaps.
00:28:55.540 The point I'm making is that as I have seen these shows enter the communities I engage with
00:29:01.180 less and less and less over time, it will just be a decade or so before the same is true of the communities
00:29:07.400 that you are engaging with.
00:29:08.640 Well, we're not talking only about how the super smart elites are going to live in 20 years.
00:29:12.860 We're also just talking about humanity, which is, well, it matters if it turns out the super
00:29:17.840 smart elites are the major economic players and increasingly the only economic players.
00:29:22.820 And this is what we're seeing from our VC friends, right?
00:29:24.900 They're like, when I invest in companies now, it is one guy was an outsourced team and a
00:29:29.740 number of AIs, right?
00:29:31.060 Or maybe like five people in the US team and then like nobody else.
00:29:34.700 This is really different from the Silicon Valley of 30, 40 years ago.
00:29:38.260 Yeah.
00:29:38.440 Where, like, even a janitor would, you know, get equity and make a ton of money and end up
00:29:42.080 a billionaire.
00:29:42.860 It's true.
00:29:43.920 Right.
00:29:44.400 And we were around when that was happening, when you, when Silicon Valley, when San Francisco
00:29:49.060 was able to maintain its wealth, because they knew that the community that was around the
00:29:53.600 founding of a startup would also get unimaginable amounts of wealth.
00:29:56.640 What we are going to see going forward is an unimaginable concentration of how that wealth
00:30:01.840 is distributed, which is-
00:30:04.700 Okay.
00:30:04.740 So that's another big prediction you're making for humanity 20 years from now is huge.
00:30:10.740 Like we think the wealth disparities now are insane, but you're saying they're going to
00:30:14.760 be off the chain.
00:30:16.340 Especially in cities.
00:30:18.580 Oh yeah.
00:30:19.080 I wanted to ask, like, how do you think people are going to live differently?
00:30:21.900 Are we talking like capsule apartments for single people?
00:30:24.540 No, we're talking, the core difference is going to be wealthy people will be much more fortified
00:30:31.380 and in much more defended settlements than you have today, similar to South Africa today.
00:30:35.900 So either communities of wealthy people that are in these defended environments or, you
00:30:40.780 know, much more wealthy people are spending much less time in cities and much more time
00:30:44.000 on their private defended islands.
00:30:46.180 And what about normal people?
00:30:49.500 Are they going to be more in cities?
00:30:51.020 I'm assuming so.
00:30:53.000 Because that's where the social services that many of them are going to depend on are.
00:30:56.080 No, I think cities are going to begin to rot pretty soon.
00:30:59.080 No, no, no.
00:30:59.500 I agree.
00:31:00.160 But it's going to be some cities.
00:31:01.520 So New York, LA, you know, like, et cetera, like the big, big, big ones are going to become
00:31:06.360 You're asking the wrong question.
00:31:07.420 Okay.
00:31:07.860 So right now there's a genocide going on.
00:31:10.780 Do you know where it's going on?
00:31:11.800 Like an actual big scale genocide.
00:31:13.960 In cities.
00:31:14.400 No, no, no, no.
00:31:15.780 I mean one in the world today.
00:31:17.860 Oh, you mean like an ethnic genocide?
00:31:20.380 Yeah.
00:31:21.520 South Koreans?
00:31:22.620 I don't, what are you, where are you going with this?
00:31:24.360 It's in the Sudan.
00:31:25.680 It's, it's.
00:31:26.080 Oh, like, oh, a political, like killing, actively killing people.
00:31:30.000 Nobody knows about it.
00:31:30.920 Nobody cares about it because economically it doesn't matter.
00:31:34.320 We as humans never care about the people who economically and culturally don't matter.
00:31:39.820 The reason why we don't care about this genocide is because the people involved in it
00:31:44.020 do not matter.
00:31:45.600 I mean, do not affect our lives.
00:31:47.320 They don't.
00:31:47.760 No, no, no, no, no, no, no.
00:31:49.240 They don't matter from the world's perspective.
00:31:51.860 If you look at like the people in Gaza don't affect people's lives.
00:31:55.560 The reason why people are focused on what's going on in Gaza, in Israel right now is because
00:32:00.540 Jews do matter.
00:32:02.540 Okay.
00:32:02.800 But if you look at the scale of the killings that are happening in the Gaza war right now,
00:32:07.460 they are genuinely trivial compared to what's going on in the Sudan right now.
00:32:15.420 It's the Darfur situation is basically reignited.
00:32:18.940 Wow, I didn't know that.
00:32:19.940 That's horrible.
00:32:21.220 Yeah.
00:32:21.820 I know.
00:32:22.320 It's a large scale.
00:32:22.820 When I was in college, though, many, many, many people I knew were intensely focused on
00:32:28.060 trying to do something about that.
00:32:29.480 Apparently that failed.
00:32:30.240 Yeah, but nobody cares anymore.
00:32:32.960 Not anymore.
00:32:34.180 No, you think people, like if I were a college student again today, you don't think that
00:32:37.600 I would find people.
00:32:38.320 You would not know that this is happening.
00:32:40.520 On a college campus.
00:32:41.760 On a college campus.
00:32:42.940 You would not know.
00:32:44.600 And this is important, right?
00:32:46.140 Like, by the way, if people want to learn about this situation, I think it's RealLifeLore
00:32:49.960 or whatever.
00:32:51.520 I'll put a video on a screen that you can search on YouTube that has a really great description
00:32:56.900 of what's going on, who's putting money into it, and who the different
00:33:00.180 sides are.
00:33:01.120 It's actually really interesting.
00:33:07.680 Basically, this dictator didn't want to lose power.
00:33:10.780 And he thought the way he would prevent himself from losing power was to create two separate
00:33:14.960 military groups that were completely separate from each other.
00:33:18.720 So neither one could coup him without the other one coming to defend him.
00:33:21.700 But then he like died or lost control in some way.
00:33:25.700 I forgot.
00:33:26.280 And now the two military groups are at war with each other.
00:33:28.920 And they have.
00:33:29.940 Yeah, it's not great.
00:33:31.540 OK, well, OK.
00:33:33.080 The point here being is that we as humanity have always not cared about the people who
00:33:38.920 were not economically or socially relevant to our lives.
00:33:42.800 And when you look at the people whining about Gazans, it's really honestly just an excuse
00:33:47.500 for anti-Semitism, which has always been a major leftist thing.
00:33:53.020 And simmering was in the left for a long time.
00:33:55.580 But now it's just boiling over.
00:33:57.220 They don't give a shit about the people of Gaza.
00:33:59.220 They probably, you know, couldn't have even pointed to it on a map before this.
00:34:03.700 You know, they wouldn't have gone there.
00:34:06.840 They wouldn't have done aid programs there.
00:34:08.840 And they didn't do aid programs there.
00:34:10.840 They care about it because they can use it to besmirch the Jewish people who actually
00:34:14.700 do matter in a historic context, broadly speaking, because Israel is one of the few
00:34:19.220 high fertility, high economic productivity places in the world.
00:34:22.620 And so they want to shit on them as the same way they want to shit on any successful group.
00:34:27.420 You know.
00:34:28.100 But they don't care.
00:34:32.240 It's so funny.
00:34:32.720 I said, well, when I talk about like what's going on in Darfur and then like the
00:34:35.420 Sudan now, you know, it's like you go to a leftist and you're like, did you know that
00:34:39.040 there is a black group being genocided right now?
00:34:42.500 And they're like, oh, my God, that's horrible.
00:34:45.220 What imperialist is doing it?
00:34:48.100 And you're like, it's actually another black group.
00:34:50.620 And they're like, oh, fuck that, man.
00:34:52.400 I don't care.
00:34:53.820 Like, what are you talking about?
00:34:56.760 What are you talking about?
00:34:58.200 Black people killing black people?
00:34:59.680 They do that here all the time, too.
00:35:01.440 We don't give a shit.
00:35:02.800 That's the way they feel.
00:35:03.920 Right.
00:35:04.220 Because they don't give a shit.
00:35:05.280 They care about how they can use it for their political points.
00:35:09.160 They don't actually care about these communities or these people.
00:35:12.560 Do you think polarization politically is going to be worse, better?
00:35:16.140 Oh, astronomically worse.
00:35:18.300 Astronomically worse.
00:35:19.280 Oh, no.
00:35:19.840 People wonder why we are so confidently siding with the right.
00:35:23.280 Like, we're like, oh, you're doing things that really make it hard for you.
00:35:26.240 And it's because, one, I know that they ultimately are going to win this.
00:35:29.760 And two, because in the future, you're going to need a side.
00:35:35.640 Anyway, I love you to death, Simone.
00:35:37.360 You are a great wife and a very smart person.
00:35:40.200 And I hope I'm not being too spicy in this episode.
00:35:42.420 Oh, I love spicy.
00:35:44.220 You know it.
00:35:44.940 I hope that our children create a future that is a little brighter than what we're describing, for the most part.
00:35:52.700 And that they do not succumb to AI girlfriends and boyfriends, as disgusting as humans are.
00:35:57.500 So it's going to be really difficult.
00:36:00.480 You might need to at least have artificial wombs by then.
00:36:03.960 I don't know, actually.
00:36:05.220 I think it's going to take a while.
00:36:06.400 20 years?
00:36:06.960 It's going to take more than 20 years.
00:36:08.640 I agree.
00:36:09.260 I think it will take more than 20 years.
00:36:10.480 So, you know, they're going to need to find real breeding partners, which means we're going to need to find and convert real breeding partners into our cultural group, or do an arranged marriage for them within our network of families.
00:36:21.340 No, honestly, Malcolm, hold on, though.
00:36:23.460 Like, you are such a romantic person who is so sweet and loving.
00:36:27.720 I think, hopefully, I think, hopefully it will be in their DNA, just, like, genetically inherited to be romantics who seek out a carbon-based partner, I hope.
00:36:40.280 Nothing against AI, because it is awesome.
00:36:42.640 But, yeah, AI is not going to have kids.
00:36:45.640 So, Simone needs grandchildren.
00:36:47.660 Well, not yet, eventually.
00:36:48.800 It'll be a third, and they'll have a polycule with all AIs.
00:36:53.980 It'll be a monogamous polycule, because it will be them and their partner and then another one.
00:36:58.640 I haven't thought about our kids being poly, and I'm going to have to, like, put on a mask of just not cringing.
00:37:09.440 Poly is so cringe, though.
00:37:11.680 If you want to see a good poly story that I found really interesting recently, look up the wacky Portland polycule.
00:37:17.940 Well, it likely wasn't real, but, like, it felt so real to a lot of people that they were like, oh, this must be true, because I just didn't like this.
00:37:24.880 Yeah, Strange Aeons did an episode that you...
00:37:26.980 Did you watch it?
00:37:27.860 I watched it.
00:37:28.680 Of course I was.
00:37:29.060 Was it good?
00:37:30.100 It was good.
00:37:30.860 It was good.
00:37:32.040 Yeah, Strange Aeons did it.
00:37:33.320 Yeah, it all unfolded in an Am I the Asshole post.
00:37:36.600 Sorry, Am I the Asshole post on Reddit.
00:37:39.100 So, yeah, we'll leave you with that, ladies and gentlemen, in case you're too depressed about our future.
00:37:45.220 There's nothing to be depressed about.
00:37:46.840 We win.
00:37:47.940 The efficacious people.
00:37:49.220 That's why we care about you, our watchers.
00:37:52.920 Yeah, actually, the people who watch this are, like, uniquely high agency, so congrats to you guys.
00:37:58.920 We love you, but I love Malcolm more, so I'm going to start your pork chops.
00:38:02.880 Sound good?
00:38:03.960 I am excited.
00:38:06.540 Nice.
00:38:07.100 All right, I'll see you downstairs.
00:38:08.680 Have you looked up how to cook pork chops?
00:38:10.200 Uh-huh.
00:38:11.320 You see, this is the great thing about YouTube, looking up how to cook pork chops.
00:38:14.080 It's a lot like pan-seared steak, is the way I'm going to do it, but with a lot of butter in the cast iron skillet, so.
00:38:19.380 No, you know how to keep me happy.
00:38:21.380 Yeah, I'm ready.
00:38:22.860 I'm ready.
00:38:23.500 I mean, I might fuck it up, but we'll see.
00:38:25.060 We'll see.
00:38:25.400 I love you even if you do.
00:38:27.680 Oh, thank you.