00:52:09.700But no, we've hit a plateau. It's all a bubble.
00:52:14.580Okay, guys, have fun being left in the dust. Enjoy it.
00:52:18.660No. But I see why. Because when people are thinking about a product, they think about it from a consumer
00:52:25.340level. They think about it like an iPod, or, you know, "why isn't this in my pocket," right?
00:52:30.460It can also be hard for people to wrap their heads around it. You know, like when cars started being adopted, it's like, oh, this is just a rich person thing. They break down all the time. It's just better to have a horse. Just keep your horse. People couldn't imagine not having horses on the roads. It's similar to the hallucination arguments. Cars break down. AIs hallucinate. How is that going to transform society?
00:52:52.200Yeah. Buckle up, guys. You don't have to. But yeah, if you go into this not wearing your seatbelt, this is on you.
00:53:01.760Yeah. And I could go into technical things where it looks like parts of AI development have slowed down recently. But in other areas, it looks like it's sped up. That's the problem with a lot of this: you can say, well, it's slowed down here and here. And then, well, it's sped up here, here, and here. Right? And then you'll get some new model, like DeepSeek's new model, and they'll be like, oh, and now we have some giant jump. Right?
00:53:22.180And then we've just been seeing this over and over again. I hope it plateaus. It's going to be scary if it doesn't plateau. But we're not seeing this yet. We're seeing what is kind of a best case scenario, which is steady growth. Steady, fast growth. Not fooming. Okay. But steady, fast growth.
00:53:40.740Yeah. So yeah, multiple people actually requested this discussion in the comments of the video we ran today, which was on how geopolitics will be changed after the rise of AI and more accelerated demographic collapse. So I'm glad that you addressed all this.
00:53:59.200There's a lot more. I mean, we're only just getting started. And a lot of people also chimed in in the comments, like, "well, give me specific dates. I need to know what by when." We can't do that. We can give you dates, but we're going to be wrong. It's really hard to predict how fast things are going to move, and there are so many factors affecting adoption, including regulatory factors and social factors, that it makes it really hard for us to say exactly when things are going to happen. Our heuristic with these things, if you're just trying to figure out, "well, how do I know when to start planning?"
00:54:29.200This is your reality now. Just like accept it as reality and live as though it's true. That's how we live our lives. We live our lives under the assumption that this is the new world order. And we don't invest in things that are part of the old world order in terms of our time or dependence. And we do lean toward things that are part of the new world order, if that makes sense.
00:54:48.440Yeah, no, I absolutely think it makes sense, and I totally understand where people are coming from with this. But my God, it's like hearing "computers will transform society" and only thinking about the computers that you use for recreation, instead of the computers that are used in a manufacturing plant and to keep planes connected, and, you know,
00:55:18.440and even if the development stopped today, the amount that the existing technology would transform societies in ways that haven't yet happened is almost incalculable.
00:55:30.440Like that's the thing that gets me. I don't need to see AI doing something more than it's already done today. I don't need to see something more advanced than Grok 4. Okay.
00:55:43.340Than OpenAI's GPT-5. With these models, I could replace 30% of people in the legal profession. That's a big economic thing. Okay.
00:55:53.440Yep. And I mean, again, we can't say how fast this is going to be impactful, because there are already states in the United States, for example, that are making it illegal for your psychotherapist.
00:56:09.920Even though AI outperforms normal therapists on most benchmarks.
00:56:12.620Well, just to use AI to help themselves. And they're going to cheat anyway, but people are going to try to artificially slow things down in an attempt to protect jobs.
00:56:23.220Or protect industries because they don't trust it. So again, things will be artificially slowed down. Sometimes things will be artificially sped up by countries saying, okay, we're all about this. We need to make it happen.
00:57:05.500This is a fun time to be alive during the AI slash fertility-rate apocalypse, because I get to do the things that I want to do to win.
00:57:14.440Have lots of kids and work with AI to make it better.
00:57:44.040You are a lovely wife. And I love that you cut my hair now. It feels so much more contained.
00:57:49.100The more things we bring into the house, whether it's you making food or cutting my hair, it dramatically improves my quality of life because I don't have to go outside or interact with other people.
00:57:58.880And I really hadn't expected that. And it's, it's pretty awesome.
00:58:02.840Yeah. I get now why for many people, it's a luxury to have everyone come to your house to deliver services, but it's even better if you don't have to talk with someone else and coordinate with someone else and pay someone else and thank someone else.
00:58:16.820And it's not like I'm not appreciative of what other people do and the services they provide, but it's just additional stress. This is a generation of people that can't answer the phone, me included. And so the anxiety that you have to undergo to have a transaction with a human is so high.
00:58:35.720Even if they're doing a great job and they're happy and you're happy, you still have to go through the whole "thank you so much," and "oh, can I have this?" and "well, this isn't quite right, can I have this adjusted?" No, I would rather use my mental processing power to just keep our kids somewhat in order.
00:58:52.700Somewhat in order. That's a tall order. What did people think of the episode today?
00:58:58.460What did they think? They, I think they liked it. I'm trying to think of like, if there was any theme in the comments, a lot of people had small quibbles here or there about birth rates in certain areas. And I think that's because the data is so all over the place. And a lot of people have anchored to old data.
00:59:16.220And then they're really shocked to see how much the birth rates have changed. I haven't gone deep into it, but some people have questioned why you think population growth in certain areas won't matter, due to those places being technologically not yet online and not developed.
00:59:38.220Yeah. This to me, I just find it a comical thing. They think that they're going to get a Wakanda, right? That's not going to happen. When we've seen populations jump in technology and industry levels, it happens because of some new form of contact or some new form of technology being imported to the region, like we saw in East Asia.
01:00:01.580It's very unlikely that you're going to see something like Somalia, which has good fertility rates, just suddenly develop. And we've tried to force it, right? This is fundamentally what the U.S. tried to do with Iraq: we tried to force them to become a modern democracy and a modern economy in the same way we did with South Korea and Japan and Germany. And it just didn't work.
01:00:26.920What do you think about the city-states that like Patrick's working on in Africa? Couldn't you theoretically create Wakandas?
01:00:35.160You could, you could. I think one of his city-states would be most likely to do that, but that's not going to have an impact across the wider region, right?
01:00:43.040Yeah. Basically just those who can get in.