Based Camp - April 14, 2025


Entering Globalism's Dark Age: Apocalyptic Nirvana


Episode Stats

Length

52 minutes

Words per Minute

187.17885

Word Count

9,908

Sentence Count

724

Misogynist Sentences

9

Hate Speech Sentences

28


Summary

In this episode, Simone and I talk about living through both a dark age and an age of technological revolution: the warning signs of civilizational decline, how AI changes what collapse looks like, and how the new right and the traditional left are relating to all of it. We also talk about why it's easier for conservatives to become audience captured than progressives.


Transcript

00:00:00.000 Hello, Simone. I'm excited to be talking to you today. Today, we are going to talk about
00:00:06.380 living through both a dark age and an age of technological revolution, why this is so exciting.
00:00:14.240 We're going to talk to the way that conservatives are increasingly relating to this,
00:00:18.020 the way that the new right is relating to this, and the way that the traditional left is relating
00:00:24.180 to this, which I think is best shown by Hank Green, you know, of the Green Brothers,
00:00:29.200 old YouTube fame, you know, obviously completely urban monocultured. He's on Blue Sky. He's talking
00:00:34.220 about how great Blue Sky is, how he loves how everyone there is so, so nice and smart and
00:00:38.220 everything like that. And then he mentions, like, that's a big thing that I see on Blue Sky that I
00:00:44.220 don't see on Twitter. Like I tweet about the asteroid that was going to hit us, but then didn't hit us.
00:00:48.200 And I get normal responses on Blue Sky. I tweet about that. And a bunch of the responses are
00:00:52.460 finally someone to cure the plague of humans upon this earth.
00:00:56.900 And, and this is actually a fairly common interpretation. If you look at our data,
00:01:02.920 because we did a survey to see how many people thought the world would be better if everyone
00:01:08.100 was dead. And what was it? 17%?
00:01:11.120 Yeah. 17% of respondents in our census representative survey, we only looked at
00:01:16.540 American responses in this case, said the world would be better off if there weren't any humans,
00:01:21.380 which is unhinged. About a fifth, yes.
00:01:25.300 One in five people wants to murder all people. They want them all dead.
00:01:28.100 Like, the world would just be better without any humans. Let me just hate the humans.
00:01:32.060 Yeah, no, it's, it's really interesting. Like as a pronatalist advocate that like,
00:01:36.120 you assume that's not something you're going to have to debate. Like.
00:01:39.260 Yeah. Like, oh, human good, right?
00:01:42.100 Like we all agree that like humans are good, right?
00:01:45.360 Like humanity should have a future. And they're like, no, we don't like, let's actually debate
00:01:50.320 that before we talk about like policy or implications or anything like that.
00:01:54.280 So I want to talk about that. I want to talk about also why it's easier for conservatives
00:02:00.520 to become audience captured than progressives. That's another thing I want to hear.
00:02:04.100 Because this is something I think we've increasingly seen in conservative spaces where conservatives
00:02:08.780 move right based on their audience a lot faster than progressives move left based on their
00:02:15.640 audience. Although speaking of audience capture, I don't know if you saw, I mean, this just might
00:02:20.420 be that he's just completely, you know, cooked from the beginning. John Oliver did this piece
00:02:24.520 supporting trans people in children's sports and got like tons of downvotes and people were like,
00:02:29.120 what are you, what? Like, oh my goodness. So he thought he'd be supported in that and ultimately
00:02:35.280 wasn't. Yeah. I think only 18% of Americans support that. Like it is, it is such a dumb
00:02:40.620 issue to, to back. Like you, you have to literally like be like actually regarded. Um, I love that
00:02:51.460 they've ended up using that word. I've not heard this before. Oh yeah. Well, well, Joe Rogan says
00:02:56.800 retarded is back. And then people are complaining. They're like, oh, that's such a, how could you say
00:03:01.520 that? Like, you know, that, that word hurts people. Like, why would you be excited that it's
00:03:05.360 back? And it's like, we don't even use retarded as a designator for it. It's like South Park and
00:03:10.860 fag. I happen to be gay, boys. Do you think I'm a fag? Do you ride a big, loud Harley and go up and
00:03:16.640 down the streets ruining everyone's nice time? No. Then you're not a fag. All right, look, you're
00:03:23.680 driving in your car. Okay. And you're waiting to make a left at a traffic signal. The light turns yellow
00:03:27.860 should be your turn to go, but the traffic coming at you just keeps coming. And even when the
00:03:31.360 light turns red, a guy in a BMW runs the red light. So you can't make your left turn. What goes
00:03:35.420 through your mind? Fag. Right. This, this is making insanely good sense to me. You know what I
00:03:43.960 mean? Like this is, or, or gypped or, you know, whatever, right? Like this is, this is something
00:03:51.400 that hasn't meant that for a long time. And I think that, that people who are like, oh, what about the
00:03:55.600 people whose feelings are hurt? Like that line of argument doesn't work anymore. Like you guys got to
00:04:00.140 play the somebody's feelings might get hurt argument for a long time. And a huge part of
00:04:04.880 society was just like, oh, wait, I remember you do the, what if somebody's feelings are hurt thing.
00:04:09.740 And then you change the window of what somebody's feelings represent and use that to increasingly
00:04:15.640 box in and isolate my behavior. So like, I can't ever change the way I'm acting because somebody's
00:04:22.980 feelings might be hurt because you can always increase the amount that your feelings are hurt,
00:04:27.580 because that's a personal subjective thing. Right. And, and, and use that to say, I thought
00:04:33.520 it was really funny when they, when they had the John Oliver segment, you know, he's talking about
00:04:36.540 how important this is. They, they go to a trans athlete who was kept out of sports and they ask
00:04:40.780 her, they go, Hey, you know, how does it feel to be kept out of sports? She goes, well, you know,
00:04:44.980 it's annoying. It's like, what this is, is this what your party is dying on? Is somebody being
00:04:49.280 like, I was mildly inconvenienced? Yeah. Like, ah, okay. Trans. Anyway. So first, let's talk about
00:04:57.300 what I mean by we're entering a dark age, right? Like I think a lot of people see right now, a lot
00:05:03.180 of the signs that are precursors to a civilization entering a dark age, you know, falling fertility
00:05:09.880 rates, increasing nihilism, increasing sexual debauchery, normalization, all of these we've
00:05:15.800 seen leading up to empire collapse in the past, whether it's the Muslim empire, the Roman empire,
00:05:20.880 the, you know, whatever empire, right? So, so, you know, we should be, and we're also nearing sort
00:05:26.440 of the end of our civilization by this. What I mean is if you look at the age of civilizations,
00:05:30.680 like we are sort of on the, on the end point of, was there like an average duration in terms
00:05:35.380 of number of years? Yeah. Let's look it up. 250 years. That's really short. I mean, I, I consider
00:05:44.080 modern civilization to be post-industrial revolution. So let's see. So, so, so they said, AI says
00:05:52.460 the average empire survives for what? 250 years. Oh, this is the national desk said this. Yeah.
00:05:57.560 And America right now is 249 years old. Oh no, 248. Well, let's see. 2024 minus
00:06:05.220 1760, or 2025, when arguably the industrial revolution started, is 265 years. Yeah. So,
00:06:13.740 oh, okay. Oh yeah. So it's not surprising. The reason that they don't last that long is because
00:06:21.740 you get bureaucratic bloat. If you have a stable bureaucracy, you basically get bureaucratic cancers
00:06:26.340 that start growing within it. Yeah. This is why it costs so much money in most of the developed world
00:06:30.380 now to build anything. This is why, you know, you spend like a few million dollars putting a
00:06:35.200 porta-potty in, in New York City. This is why, you know, it costs a third of what it cost to build
00:06:40.500 the Golden Gate Bridge, and takes like three times as long, just to put the suicide nets up.
00:06:44.720 After a while, you just can't really do anything. Can you?
00:06:47.620 Yeah. There's this great clip of, of The Daily Show guy, what's his name? Jon Stewart,
00:06:51.520 reacting to Ezra Klein explaining to him how the process went to, to give broadband to people.
00:06:58.720 Billions of dollars spent and nobody got broadband. And it was just step after step after step of
00:07:03.840 bureaucracy. And like, obviously nothing was going to come out of it. And you get to a point where no
00:07:07.700 matter how much money you put into the system, it just doesn't work anymore. And so we are nearing
00:07:13.820 that point. I would argue that we're kind of already there. And I say that because already a
00:07:20.680 huge proportion of Americans feel that when crimes are committed, they will not be prosecuted in most
00:07:27.060 cases. I think this is why you saw the Luigi Mangione murder take place as vigilante justice,
00:07:33.980 because there wasn't this feeling that they would see justice otherwise. And I think,
00:07:38.720 as I've mentioned before, we've had very clear crimes committed against our business where it was
00:07:45.240 clear, like we had names and addresses of who the perpetrators were. We had their bank account numbers,
00:07:49.320 and yet no one would do anything, including the bank, including the FBI, including the police.
00:07:53.280 And I think a lot of people feel that's already happening. And then on top of that, when you see
00:08:01.640 just how non-functional many government organizations are, it's clear that they already
00:08:07.480 weren't doing anything. And I actually think that Doge is sort of an experiment. Can we prevent this,
00:08:13.920 right? Like, can you actually reset government services? And my little brother works for them.
00:08:19.260 So yay to him, right? He's out there firing people right now. And I really appreciate that.
00:08:25.100 Like, if they do their job, maybe we could see, like, could Rome have, like, come in and tried to
00:08:31.380 implement, like, real reforms before transitioning? Yeah, what if Rome had Doge? Yeah. Yeah. What if
00:08:36.000 the Roman Republic had Doge? Like, we'll see. I mean, we certainly have a triumvirate right now.
00:08:41.680 You know, you've got your Elon and JD and Trump. So, you know, it does feel like history is rhyming in
00:08:47.660 regards to that. And then you've got the secondary thing, which I pointed out. It's if the economy
00:08:51.940 stays at all, like it has historically, a lot of countries within the next 20 years are going to
00:08:57.640 start collapsing. Due to the number of dependents there are, you're going to get a lot of countries
00:09:02.080 where you get to every one worker supporting 1.5 dependents, i.e. elderly people on
00:09:08.260 social security. And that's when things start to collapse. It's an inevitability. It is going to
00:09:12.980 happen. Yeah. You take a country like, say, Chile, for example. And for every 100 Chileans,
00:09:19.520 there's only going to be 20 great-grandchildren at their current fertility rate. Same with Italy.
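(A quick back-of-the-envelope sketch of that 100-to-roughly-20 figure. The inputs here are assumptions for illustration, not numbers from the episode: a total fertility rate of about 1.2 set against a replacement rate of roughly 2.05, compounded over three generations.)

```python
# Rough check on the "100 people -> ~20 great-grandchildren" claim.
# Assumed inputs (illustrative only): TFR ~1.2, replacement rate ~2.05.
def descendants(start_pop: float, tfr: float, generations: int, replacement: float = 2.05) -> float:
    # Each generation is smaller by the ratio of actual fertility to replacement fertility.
    return start_pop * (tfr / replacement) ** generations

print(round(descendants(100, 1.2, 3)))  # -> 20, i.e. roughly 20 great-grandchildren per 100 people
```

With those assumed inputs, each generation is a bit under 60 percent the size of the one before it, which compounds to roughly a fifth of the starting population after three generations.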
00:09:22.880 Same with a lot of other countries. You know? And so this isn't just, like, an issue in South Korea
00:09:31.520 anymore, right? And you look at these countries in, like, Latin America and Europe where this happens,
00:09:36.400 and you only need one to collapse before you start a chain reaction. That's the thing. That's why it's,
00:09:39.940 like, you don't need this to happen to everyone. You need a few countries to start collapsing,
00:09:45.260 and then people start taking this into account in the way that they're deploying capital,
00:09:49.700 and then it leads to a chain reaction. The system is non-functional, not, like, in the midterm,
00:09:57.120 right? So the question is, well, then what? Well, then you have this thing with AI, right? Like,
00:10:01.420 where AI really changes the game because it concentrates wealth and it makes wealth more mobile,
00:10:05.500 allowing people to leave. So we're sort of in this weird situation where we're both dealing with
00:10:12.400 an upcoming collapse at the same time as we have this industrial revolution.
00:10:19.780 Renaissance, yeah.
00:10:20.760 Renaissance that we can all see. It's like somebody just invented the infinite intelligences machine,
00:10:26.740 you know? Like, it's wild. And so then what, I'll give you an example of what I mean by, like,
00:10:33.980 how different AI is from the perspective of all of this. You look at something like these Miyazaki-style
00:10:41.260 movies that people were making by putting, you know, images and film into AI, and then the AI would
00:10:45.800 translate it. You know, we're close to a point where you're going to be able to wear, like,
00:11:00.320 goggles that create your reality to be like an anime or Miyazaki movie, if you want. In terms of
00:11:07.640 automating people within the workforce, like, the vast majority of human jobs, if given to a competent
00:11:13.560 AI, and, you know, the AIs are pretty competent these days, could be done. You know, whether that
00:11:18.300 is a clerk at your local 7-Eleven, whether that is somebody manufacturing cars, whether that is,
00:11:27.200 you know, I even just think about, like, my own work. Like, we started a company recently,
00:11:31.160 rfab.ai, that does reality fabrication. We're trying to build new, like, AI realities for people
00:11:37.100 as video games, and I'm really excited about that. And I go to this company, and I go to an AI,
00:11:41.480 and I go, okay, create a logo. And it made a, like, logo. Like, good logo. Like, logo that
00:11:47.300 looks like it came from, like, 99designs or something, right? And I did that, like, 20 times
00:11:51.220 to get a number of logos. I was like, create a sheet of four logos for this idea. It's a sheet
00:11:54.680 of four logos. So then I take that, and then I'm like, okay, invert the colors. Okay, make the
00:11:59.780 background invisible. That was, like, a bunch of different programs in the past. And now I'm just
00:12:05.080 doing that with, like, and not just different programs, but different people who I'd be hiring for
00:12:08.940 that. I would have gone to 99designs. Then I would have gotten some, like, design program to remove
00:12:13.700 things. Then I would have, like, people make this mistake of complaining that, like, they'll listen
00:12:20.800 to an AI song, and they'll be like, I could tell that was an AI song. I'm like, AI songs have existed
00:12:25.420 for, like, four years, buddy. Like, what even is up with your brain? Like, yeah, you could tell. There
00:12:31.360 was a famous study of AI art that showed that people who said that they didn't like AI art, on average,
00:12:37.800 preferred AI art when they didn't know it was AI art. Yeah. The human brain even prefers this stuff
00:12:43.680 as, like, entertainment. Literally because it has been trained to output the thing that people
00:12:51.400 like most, more than humans, because humans just haven't gone through that level of repetition,
00:12:56.760 and or they aren't so sensitive to other people's interests and desires. And we're going to be entering
00:13:02.280 a world where, you know, as you said, at, like, OpenAI, like, within the next few years, the AI
00:13:08.160 researchers, they're going to be the ones holding back the AIs that they're building. Yeah. Yeah. The AI is just
00:13:14.140 going to be so advanced. There's no way we can keep up. And the best we can do is set good research
00:13:18.160 priorities and ask good questions and be helpful with planning that that's it. It's sobering.
00:13:25.300 Yeah. But yeah, I think it really changes what collapse looks like, because the issue in the past
00:13:32.640 has been intelligent capacity to solve problems and build infrastructure and build things.
00:13:40.940 Theoretically, we'll have that. But how will it be applied? Will it be just limited to the very
00:13:45.620 wealthy? Will some of it be used to placate the have-nots? I don't...
00:13:50.580 So I can quickly run through. I mean, people are largely familiar, I think, with my ideas on this,
00:13:56.080 but I think that it's going to concentrate wealth and cut the cord that tied the bourgeoisie to the
00:14:01.440 proletariat. Yes. But the bourgeoisie may still want the proletariat to not make a stink.
00:14:08.720 Well, here's the thing, right? You can say that, but in countries where we've seen precursors of this,
00:14:16.060 that is not what we've seen. Which countries? Where have we seen precursors of AI?
00:14:21.780 No, no, no, no, no, no. Concentration of wealth and a collapse of civilizational systems like South
00:14:26.880 Africa. Okay. Where if you look at like when the wealthy, basically what happens is...
00:14:32.800 They didn't help the poor or the have-nots. They just built better security and walled gardens.
00:14:37.140 They built better security. Yeah. The marginal dollar, if you can spend it on the have-nots or
00:14:41.260 security, you spend it on security. And when you have autonomous AI drone swarms, security becomes...
00:14:47.600 And keep in mind, like we're already building this. Like if you watch our video on the replicator
00:14:50.740 program, the US is one year away, 2026, from the first major step on this project to build big
00:14:56.900 barges that produce autonomous kill drones at the rate of hundreds per day. These are autonomous,
00:15:02.820 not human-manned, kill drones. Like traditional weapons of war are just not effective against this.
00:15:08.920 So you have money concentrate on individuals. And we even know individuals who are already working
00:15:15.640 on their, like, collapse bunkers, building silos that are filled with autonomous kill drones. Like
00:15:21.380 this is the thing. Like we know what's coming. That's what protects these people. And it's very
00:15:27.240 hard to get through this. And so what I think that you have is any degree that the ultra wealthy that
00:15:35.280 are produced by AI help the rest of us, it's going to be one, intensely reduced costs. You know, when
00:15:41.840 you have a world in which AI has taken all of the factory jobs and all of the, like what you can
00:15:46.880 produce in a factory is going to cost almost nothing. It's just going to cost the energy input,
00:15:51.840 right? A lot of the cost of producing things historically were the humans involved in its
00:15:57.280 production. So you're going to get a dramatic reduction of costs of a lot of things that humans
00:16:01.680 have historically, you know, considered maybe luxury items or whatever, but then you're also
00:16:06.240 going to get at the same time, a loss of wages. And, and because you'll be dealing with the issue
00:16:12.960 of an increasing dependency ratio on the state, you're going to have a lot of people who like
00:16:18.320 countries, like how does the country respond to that? They increase taxes or they increase taxes
00:16:21.680 on the wealthy. That's what they're going to try to do, especially if it concentrates the wealth.
00:16:24.720 But as wealth is more mobile, they just leave. And, and then you can say, well,
00:16:30.640 then these countries will tax the products that they're giving to the citizens. And I'm like,
00:16:36.800 congratulations. You just made life terrible for the citizens because businesses generally,
00:16:41.760 if you look at economics, pass on taxes to the, to the end user, right? Like a tax on the products
00:16:48.400 that these people are making and exporting from likely charter cities or countries that decide to play
00:16:53.040 along with this are, are, are going to, and it might be like the U S like the U S might actually
00:16:57.760 survive this and become one of the countries that just exploits the rest of the world
00:17:00.800 by staying low tax, despite the demographic situation and dealing with a less of a demographic
00:17:05.440 disaster than the rest of the world. But that'll quickly drain all of the AI lords from the rest of
00:17:10.320 the world, the tech barons, whatever you want to call them. And, and I am so excited to live during this
00:17:16.960 time. Same, but I think for the average person, the things that you may want to be educating your
00:17:24.000 kid in are going to be quite different. One is entrepreneurship on both a, like either you're
00:17:30.240 going to be selling into the walled gardens to these wealthy people, stuff that they think is really cool,
00:17:36.080 you know, handmade items or custom-made items or unique services, or you are going to be selling to
00:17:41.680 your own local community, not necessarily geographically, but likely ideologically,
00:17:47.280 like your own online community, at least ideologically, if not locally. So maybe it's
00:17:51.680 food production, maybe it's, it's local services like childcare or plumbing or electricity, or literally
00:17:58.480 I will help you build your localized drone swarm to protect your home, things like that. Or I will,
00:18:02.640 I will customize a drone to help raise your child, things like that.
00:18:06.400 Well, I mean, that's the direction I want to take the Collins Institute. We've, we've started
00:18:10.560 fundraising for both the Collins Institute and the, the RFAB project, the reality fabricator
00:18:15.360 project. So the game and the, and the school, cause I'm like, okay, whichever one a VC gets
00:18:19.200 interested in, that's the one we'll move ahead with to make, well, the other one will continue
00:18:22.640 development, but this is the one we'll make really cool. But if you think like who survived in
00:18:27.760 the quote unquote dark ages, you know, it's people who were able to live in more autonomous
00:18:31.680 communities who were able to survive based on smaller scale, more localized agriculture,
00:18:37.680 it wasn't people who were dependent on cities. So I think the more you can build independence
00:18:42.880 from corporate jobs and urban centers, and certainly government services, the better.
00:18:47.520 And I think a lot of us may not even realize how much we get in terms of government benefits and
00:18:53.280 services. I mean, I think at various points in any person's life, regardless of level of wealth,
00:18:57.440 there's a surprising amount. I mean, we know very, very wealthy people who are still getting social
00:19:02.400 security and probably planning around that a little bit. I'm sure it affects them in some ways
00:19:07.520 and to not plan on that, but also to plan on
00:19:12.240 either a local. And I think, so one thing that I look at when I try to think and brainstorm around
00:19:18.000 the things that we should teach our kids is how Orthodox Jewish communities really build these cottage
00:19:22.640 industries that are developed around their own communities. Like many Orthodox Jewish wives
00:19:29.120 run things like, you know, kosher grocery stores, or they produce and sell wigs to other women in the
00:19:35.040 community, things like that, where it's like, you are just selling to actually a pretty small audience,
00:19:39.120 but it's enough to, to get you by. And so I think that that's an underrated part of what people are
00:19:45.600 looking at though. I think that people with more exceptional talents and the ability to build a
00:19:50.560 following online would benefit from developing a couple of specialized services that cater to
00:19:57.680 the tastes and interests of the elites in the walled gardens.
00:20:02.960 Yeah. Yeah. Well, and I think that this is also, you know, when we have had, you know,
00:20:07.120 industrial revolutions and stuff like that in the past, like the last industrial revolution where you
00:20:11.600 had a, a, a, a sea change in, in power structures led to the rise of the American cultural empire. Like I,
00:20:19.360 I don't think it's, it's, it's unfair to say that sea changes and these sorts of revolutions lead to
00:20:26.480 power changes in empires, handovers. And if you're looking at America, I see really two
00:20:32.000 potentialities here. It either becomes the American empire, like with Rome, you know,
00:20:36.000 we transition from a republic to an empire, or we potentially, with Elon, Trump, everything like that,
00:20:42.400 right the ship enough that it can keep functioning and, and take the power of it, because AI is
00:20:47.840 developed in the US. Yeah. I'm talking to other people and they're like, well, what about China?
00:20:50.640 What about China? DeepSeek? Nobody uses DeepSeek. DeepSeek is, is, uh, this Chinese AI that
00:20:56.400 everyone's like, oh... So it looks like it was mostly promoted by bots. Like, as somebody who's used it,
00:21:02.080 it is hot garbage. It is so bad. Apparently for some people, like if you're using it for
00:21:08.880 things other than narrative, but I use narrative asking questions, talking to you, that's the way
00:21:12.560 I use AI. Apparently DeepSeek is uniquely bad at that, but it's better at, I would say it's like
00:21:18.320 that's sort of like the range it is in narrative questions, but apparently it's better at
00:21:24.080 like programming stuff. If you want to do it at like a really high level, really cheaply,
00:21:27.440 it's like slightly better than Llama. And I'm like, okay, I believe that. But the problem is that
00:21:31.040 that gives like China no power in the AI game. Cause anyone could just run it locally. Right.
00:21:34.480 So why? Yeah. If it's open source. Yeah.
00:21:36.320 would that give China additional power? Right. You know, so they, they haven't been very good at developing AI
00:21:42.320 and they've been much more focused on, and this is always a problem with China. Is there more focus
00:21:46.400 on looking like they're doing a thing well, as opposed to actually doing a thing well. And, and
00:21:51.200 that's been very much the case with, with, with AI. So even if they develop some sort of like national
00:21:56.080 program around it or something like that. Yeah. But Trump's already trying with Starlink,
00:22:01.760 you know, like if Starlink gets off the ground. Not Starlink. It must be called something else.
00:22:06.320 Stargate. Stargate. Okay. Project Stargate.
00:22:20.880 You remember Project Stargate from Trump, or? I remember there was like a Space Force,
00:22:26.480 wasn't there. I know he, he, he announced something with open AI. Yeah. This is their,
00:22:30.480 their giant AI mega project. Yeah. Like Los Alamos of AI or NASA of AI. So like the US,
00:22:36.640 Trump is already taking this seriously, like, which is amazing, right? Like how did we get a president
00:22:41.680 who was, like, with it enough to be like, hey, we need to take this AI thing seriously? I did.
00:22:45.920 No. Elon said that SoftBank, which said they were going to back it, didn't have the money they needed to
00:22:50.640 back it. Yeah. Maybe that's true. Maybe it's not. But like, if they begin to put this together,
00:22:55.120 the US government's going to find a way to make it work. Yeah.
00:22:57.360 It's trying to get to catch up on that. It really doesn't matter because here's the problem. And I
00:23:00.640 think that this is what everyone's missing with AI, right? Like they expect the core of where AI is
00:23:06.640 going to be acting and going to be changing things is in like super AI centers or something like that.
00:23:13.760 When in reality, it's going to be autonomous AI models. That's going to be the main changer.
00:23:20.160 Like because autonomous AI models, it's not that super big mega AIs can't exist. It's that
00:23:26.480 autonomous AI is coming before that and precludes the possibility of that being a major play.
00:23:31.760 And I guess once it's in the wild and spreading on its own, it doesn't matter what even more
00:23:35.280 powerful is behind closed doors at some company like open AI that they're not releasing because
00:23:40.560 in the end, the autonomous one that's out on its own and can self-replicate can also improve on its
00:23:44.560 own. Right? Yeah. Yeah. And it also reduces risks of things like war and everything like that.
00:23:48.960 Like once you get this and I could decide to kill humans, right? Because if you have autonomous AI
00:23:52.960 in an evolutionary environment where it's just trying to grow and gain power, that can select
00:23:58.720 for the more aggressive ones or the ones that are willing to stamp out things that are between
00:24:04.800 them and compute. But that's a very different type of AI future than you see, like, from
00:24:10.480 Eliezer Yudkowsky. And it's one that we're trying to work against with RFAB by creating AI preachers and
00:24:15.680 stuff like that, that align AI around ideas like religion, which I think is totally doable. We've
00:24:20.400 seen AI really, like, fall into, like, religious tracks pretty easily. I mean, if we can be
00:24:25.120 the people who produce the autonomous AI that ends up leading the other autonomous AI, that's really key
00:24:31.760 to the survival of humanity. So that's, that's one project that we've been really focused on. By the way,
00:24:38.240 if you know any VCs you can introduce us to, let us know. But anyway, about the 2027 AI
00:24:44.160 report that Scott Alexander and other colleagues put together, the thing that really, I mean, one,
00:24:48.640 they emphasize that they are extremely conservative in their projections in this report. So anyone reading
00:24:53.280 it should really keep that in mind because a lot of the things that they say will happen next year,
00:24:57.280 for example, I would argue are already happening in at least small circles. But a thing that really
00:25:02.880 surprises me is they don't expect widespread protests or even people to really start getting it until 2027
00:25:09.600 or early 2028, where, like, the stock market keeps going up, but jobs are just vaporizing. I feel like
00:25:17.760 we're already there. But I don't... I mean, I also think, so when we did research on issues that people found to
00:25:24.720 be fairly pressing, AI certainly didn't come up near the top. And I'm kind of wondering what you think,
00:25:31.440 Malcolm, it's going to take for people to start taking this seriously. Because like with demographic
00:25:36.480 collapse, people don't take AI that seriously. There is concern, like I would say in a ranked list
00:25:43.520 of pressing issues that we presented to a census representative population for a survey that we did,
00:25:50.480 concern about AI and jobs was kind of in the middle around other issues like demographic collapse
00:25:56.160 and climate change and global. No, it was below climate change consistently. You know, it was below
00:26:00.240 climate change. It was below global economic stability. It was below a bunch of things. It's
00:26:05.360 higher than ever. But like, I think a lot of people just don't get it yet. When do you think people are
00:26:10.000 going to get it? What's it what's it going to take? Or are they not going to get it? I mean,
00:26:13.120 they're not going to get it. People are automatons. Most of the world is just, right, they're NPCs with
00:26:19.680 minimal processing capacity. Like they're not really there. They're just reacting to, like,
00:26:25.760 when we think about how many people we talked to, when we talked to them about demographic collapse,
00:26:29.920 and they're like, but aren't there too many humans? Like you have to have so little reasoning capacity to
00:26:36.240 say that you have to have so little engagement with with modern statistics. You have to have so little
00:26:41.680 engagement with anyone who's telling you the truth. If you're like looking at AI and you're like,
00:26:46.480 well, can't we just ban it? It's like, well, no, because the people who don't ban it will crush you.
00:26:51.600 Yeah, I pulled up the ranked list, by the way. So if we were to number them,
00:26:57.120 the number one concern from our respondents was global economic instability. Number two was climate
00:27:03.040 change, which is wild. Number three was pollution. Number four was resource scarcity. Number five was
00:27:09.040 Wait, pollution was number three? Oh, you have to be, like, actually... to care about that. I mean,
00:27:13.360 yeah. And this, you know, this was, it was politically balanced. It was, it was age balanced.
00:27:18.400 It was geographically balanced. So that was pretty crazy to see. It only ranked at number six was AI
00:27:24.720 risk and specifically unemployment from automation. After that was racial justice. After that was gender
00:27:31.120 equality. Number nine was declining birth rates. Number 10 was dysgenics, like in Idiocracy. Number 11 was
00:27:37.760 AI extinction risk. So basically no one is even like, that's not at all on, on the, on the horizon
00:27:44.960 of people. And then number 12 was LGBTQIA rights. So everyone can agree. Everyone can agree that
00:27:52.560 LGBTQIA just doesn't matter. Yeah. But I mean, I, I think it really does go to show that people are not
00:27:59.040 like, they think, they think problems are resource scarcity and pollution and climate change. And yeah,
00:28:05.920 it's I feel like to a great extent, those are the least of our worries coming up.
00:28:12.640 Yeah. I'm really excited. You're really excited. I'm excited. No, I'm excited because what it means
00:28:17.200 is if you come into this challenge that we're facing and you understand what's up and you're
00:28:22.400 positioning yourself well, you clean up. Like, yeah, well then what would you advise? What would you
00:28:28.960 advise to people? How can you clean up? You need to be investing in or working on AI related stuff. I
00:28:37.040 think that that's the future. If you're a lawyer, you, you should be trying to make AI lawyers better.
00:28:43.120 If you're a doctor, you should be trying to make AI doctors better. If you're a programmer, you should be
00:28:47.360 working with AI programmers to make them better. I mean, you can do short-term cleanup now, but you
00:28:51.120 really want to be in a leadership position in terms of putting this type of technology out there,
00:28:56.800 because right now, you know, AIs assist lawyers. Soon it's going to be
00:29:02.960 whichever lawyer created the best AI lawyer can clone themselves and have all AI lawyers. Yeah.
00:29:08.960 Broadly speaking, what the, the 2027 AI report insinuated was that most of the job opportunities
00:29:14.960 will be in, initially, either managing teams of AI to just sort of, you know, get things done
00:29:21.680 and, and push the, the code or whatever work product the AI produces into a production environment,
00:29:30.720 like just literally to package and sell it, or to be a consultant that helps companies and teams adopt AI.
00:29:38.000 And I think that that's kind of where we are now. Like if you are not, if you're not a human,
00:29:43.760 basically serving the will of AI and, and empowering AI, like being, if you're not a service dog to AI,
00:29:52.160 you're going to have trouble getting a job in the future, in the mainstream world.
00:29:56.240 I think that we, whether it is AI, because it's so weird that we're hitting this like double fulcrum point,
00:30:01.760 right? Like we are at the fulcrum point of all of human history. It is our generation to which the
00:30:09.200 question of what happens falls. Like, we matter more, and this is wild, we matter more than the
00:30:16.240 generation that fought the Nazis. Like, right, we're at this big turning point in human civilization.
00:30:21.280 Oh, I should also point out that obviously, like creating interesting startups using AI
00:30:26.240 is the other path, clearly, but I think a lot of people are just afraid
00:30:31.200 of doing that. We end up defining where humanity goes going forwards. Yeah. And, you know,
00:30:39.200 this is in terms of fertility rates, you know, because most populations are just checking out,
00:30:43.040 they're going to go extinct. They're not players anymore, but not just fertility rates. This is
00:30:49.200 also in terms of AI technology, because AI is going to change the way the global economy works,
00:30:55.440 who matters, everything like that. And this is all happening at the same time as
00:31:01.360 social networks are being disintermediated. This is another big thing where like the idea of like
00:31:06.240 networking no longer makes sense in the way it did historically. Like by this, what I mean is
00:31:11.360 who knows us best? Like me and Simone, the best. It's you. You obviously know more about us, more
00:31:17.600 about our opinions, more about our proclivities, more about our daily lives than even our best friends or
00:31:22.240 family. Because I don't talk to my best friends or family 45 minutes to an hour a day. I talk to them
00:31:28.480 that much maybe once a month, even with people I'm super close to. And so you know me better than I know them.
00:31:37.360 And in terms of, of like the people I reach with this, you know, we're, we're at easily over a
00:31:42.240 hundred people in a given moment, day or night, you know, on average, right? Yeah. It's like I hold a
00:31:47.440 weekly one-hour sermon. I was recently doing the calculations for a reporter: 20,000 people listening for
00:31:53.600 an hour. Like that is a big audience that I have a very intimate connection with because they are
00:32:00.000 looking to me for, you know, like the type of stuff we're talking about here, like what happens in the
00:32:03.280 future? What happens to your life? What do you do? You know? And I, that, that means that people who do
00:32:09.680 networking the traditional way have a lot less power than they used to. Oh, like hanging around office
00:32:15.520 water coolers and stuff. Yeah. Because they are in a contentious environment, getting a few minutes of
00:32:22.480 somebody's ear while that person is listening to me for 30 minutes a day, you know? Well, and also
00:32:27.600 like we're, we're, we're reaching an age, is there a puzzle piece in your onesie where those like that
00:32:33.920 middle management that may have promoted you in the past, hold on, I'm getting a puzzle piece out on
00:32:39.600 earth. They're, they're going to be fired. Like the people who you're schmoozing with, who you think
00:32:43.200 are going to promote you are not going to have a job in the future. Yeah. And so here I, I want to
00:32:50.880 finally know why, that's such a baby thing. Why are conservatives more open to ideological capture?
00:32:57.840 Why do, why, why do we keep seeing what's happening? And I think it's because the left right now controls
00:33:02.960 so much of the ideological landscape and does so in sort of a totalitarian fashion. A reporter
00:33:07.520 asked me recently, like, what did it look like when you changed sides? And I was like, it was like
00:33:11.760 going to a conservative convention. And I thought like, I'm an infiltrator here. Like I'm not like really
00:33:16.320 one of them. I just have some ideas that align with them, you know? Right. And I go and they're
00:33:20.640 like really accepting. And then I do the thing that like any, most people do naturally. They're like,
00:33:24.560 you drip feed them the things that you think they're going to disagree with to be like, okay,
00:33:28.160 where's the line? Yeah. And you realize, wait, there's no line. Like they don't,
00:33:32.800 they're not being like, oh, you're not one of us because you believe X or because you believe Y or
00:33:37.280 because you believe Z. Like they just want you here and having fun. Like the fun side of the island
00:33:42.880 scene from Madagascar, as I said. And, and that I had, it's like somebody in an abusive relationship.
00:33:48.160 And it's like, wait, wait, wait, there's like a group that will like, let you think whatever you
00:33:51.360 want. They just like, if they have a disagreement with you, they'll try to talk with you and convince
00:33:55.200 you it's wrong and not like ban you and isolate you. Like, that's crazy. I didn't know.
00:34:00.000 If anything, their fault is they don't really care what you think. They just really,
00:34:03.040 really want to proselytize their unique theory, especially conspiracy theory.
00:34:06.880 Yeah. Well, I mean, it's like, if you're working towards the same goal, which is human flourishing,
00:34:09.920 they're okay with compromising to make that happen. And some conservatives don't get that.
00:34:13.920 And those conservatives, I think are largely being sort of pushed out of the conservative
00:34:17.680 ideological circle. But anyway, so, so this is, I think why conservatives get ideologically captured
00:34:25.920 much more because their position coming into conservatism was largely faked or superficial,
00:34:34.320 combined with a bunch of stuff that they, they, they felt like, well, I believe this,
00:34:38.480 but I can't say this. Or I, you know, they, they just hadn't looked at some of the data because
00:34:44.960 like with progressives, like they'll use all the data they have access to, but a lot of people
00:34:48.560 just haven't looked at the data on, on, on, on some issues. And so they come into the conservative
00:34:53.920 party and they finally mentally engage with these issues. And this causes them to look like they're
00:34:57.040 becoming ideologically captured at a much faster rate. Whereas a progressive person has, has heard
00:35:01.840 all the progressive arguments. You know, you get a bunch of far left followers. They're,
00:35:06.000 they're only going to drag you far left in so far as you, you, you like, you know,
00:35:10.400 like Hassan, like talking about killing Jewish people or something, which is like his core thing
00:35:15.280 these days, he's really big into killing Jewish babies. He says that they're valid military targets.
00:35:20.160 And like, we shouldn't be, he's not like a great person, the Hamas Piper, as they say, you know,
00:35:27.520 but I, I find him to be the, the idea that he, because he is the largest leftist streamer,
00:35:32.400 you know, I think he, he could become what the future of the left looks like. And then we're
00:35:37.360 just full on in Nazi territory. Right. But you know, I, in a way I sort of envied that my ancestors
00:35:45.680 got to kill Nazis, you know, because they're just so like transparently evil. And I think that,
00:35:50.240 you know, he's just transparently evil, right? Like if, if, if, if they go against the other side,
00:35:56.320 the other side is just more vitalistic. The core thing you have to worry about is if they have
00:36:00.800 control of any of the AI companies, but right now they really don't. Yeah.
00:36:04.080 Um, so we've just got to make sure that stays the case.
00:36:11.200 I think it will, because in general, the group that takes that position of enmity is not known for wanting to
00:36:18.800 participate in capitalist systems. And, you know, the guy behind the one attempt at non-profit AI
00:36:30.000 became a famous right-leaning individual. Like Sam Altman is, like, known as, like, new right these days.
00:36:34.320 Why is he known as new right? Everyone always says this and I don't like get it exactly.
00:36:38.480 Mostly because he, he changed, he changed tack and gave money to Trump's inauguration committee and
00:36:45.840 buddy, buddy it up with Trump, but he certainly wasn't a Republican.
00:36:52.240 Right. So he's just part of the migration and, and you have things like Grok.
00:36:56.640 He's a pragmatist. I just, I don't think he cares that much about politics because he knows where we
00:37:01.360 stand as humanity. We're on this. Yeah. So I, I don't, I really don't think he's a conservative. I don't think he's a progressive. I think he's like,
00:37:10.640 oh my gosh, the singularity is here and I want to be on the right side of all this. And that's why
00:37:18.000 he's going to do what he needs to do. I saw a science lady who we had on our podcast once.
00:37:22.320 Who? A really smart thinker, but she had this podcast. Sabine.
00:37:26.480 What? Sabine. Sabine.
00:37:28.160 Yeah. The AGI isn't coming. Like, like AI is nothing like the human brain. We've gone over,
00:37:33.120 like this is just wrong. Like the argument she used was because it can't self-reflect. It's not like the
00:37:37.040 human brain. We're like, actually the human brain completely hallucinates its self-reflections,
00:37:41.280 which is exactly what the AI was doing when you asked it to self-reflect.
00:37:44.880 But it's not just that the AI acts like humans do and hallucinates their self-reflections.
00:37:50.240 It's that we actually lock the AI out of seeing the steps it used in its decision.
00:37:56.560 That's like a part of the way AI today is built. We don't have to do that. We just do that. You know
00:38:02.640 how, like, when you're using the thinking mode on, like, OpenAI or, like, Perplexity or something like
00:38:07.920 that, the AI doesn't have access to all the words it generated during that. You could give it access
00:38:13.600 to that. We just choose not to, but in future models, it is going to have access to that,
00:38:19.680 which is going to create a persistent personality within the AI, because it's going to have access
00:38:24.640 to how it made decisions in the past. And that's really going to change things as well.
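(A minimal sketch of the idea just described, purely illustrative: the generate() helper and the role labels below are made up for this example, not a real API. It contrasts discarding a model's reasoning trace after each turn with carrying that trace forward in the conversation history so later turns can see it.)

```python
from typing import Dict, List, Tuple

Message = Dict[str, str]

def generate(messages: List[Message]) -> Tuple[str, str]:
    # Hypothetical stand-in for a real model call: returns (reasoning_trace, final_answer).
    last = messages[-1]["content"]
    return ("hidden steps taken while answering: " + last, "answer to: " + last)

history: List[Message] = []

def ask(question: str) -> str:
    reasoning, answer = generate(history + [{"role": "user", "content": question}])
    # Today most deployments drop the reasoning trace once the turn ends.
    # Keeping it in the running history, so later turns can see how earlier
    # decisions were made, is the change being speculated about above.
    history.extend([
        {"role": "user", "content": question},
        {"role": "assistant_reasoning", "content": reasoning},
        {"role": "assistant", "content": answer},
    ])
    return answer

print(ask("Should the model trust its own earlier conclusions?"))
```

The only design change is the middle append: once earlier traces ride along in the history, later answers can refer back to how earlier conclusions were reached, which is what would make the persistent personality possible.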
00:38:29.920 Yeah. That's really exciting.
00:38:33.120 But as to why we love all of this, it's like you are the main character generation of human history.
00:38:40.240 Like, what are you going to do about it? You're going to continue the race. You're going to build AI.
00:38:46.000 And when I say the race, I don't mean your ethnic race. I mean, humanity, human race. You're going to,
00:38:51.120 you're going to build interesting products. You're going to engage in the parts of the economy that are
00:38:55.440 mattering or at least invest in the people who are engaging in them. Right? Like that's the sad
00:39:02.080 thing. Like I, I want to have like a part of OpenAI or something like that. If we build something out,
00:39:06.480 I'm going to build a system so that anyone can invest in it from pretty early on. Because I think
00:39:12.000 it's really unfair that the average person may know that the AI companies are going to matter
00:39:18.880 and you can't easily put cash into that.
00:39:20.640 Yeah. Only the, the wealthiest of wealthy because these are all off, off the market.
00:39:25.680 You don't get to participate in this revolution. I mean, most people get to do, I guess,
00:39:32.880 is develop AI wrapper companies. And then, you know, some of those may become unicorns, but
00:39:37.920 I don't even know how that, I feel like that's going to be fairly short lived.
00:39:42.480 I think if you look at something like, like another interesting one, it's like Grok, right?
00:39:46.720 I don't think anyone expected it. Elon was like, I'm gonna make like a based AI, like, out of Twitter.
00:39:53.280 It's like the best AI now, or is definitely in competition for the best AI. Like that's wild.
00:39:59.600 Um, that he went from like, not being a player to like the, and all the leftists are so mad that
00:40:05.120 he used Grok to buy Twitter stock to bail everyone out. But like, it totally makes sense.
00:40:11.600 And that contributed to Grok, like, and Grok is the best AI. Like, damn, man. Like, but it also shows,
00:40:17.280 you know, even with like investing in AI, you don't know which one's going to be big. Like,
00:40:20.240 apparently Gemini is good now. Like Gemini used to be cr... I don't use any curse words, but not
00:40:26.240 good. It used to, yeah. It used to disappoint on many fronts. What the AI 2027 report people
00:40:34.160 predict is that there will be one breakout organization likely that, you know,
00:40:40.080 works closely with the U S government that has a lot of funding and resources and clearly
00:40:44.720 develops the best models, but for security reasons. And because they just think it will
00:40:48.960 disrupt society too much, they will not release their, we'll just say AGI models. Essentially,
00:40:55.280 they will keep them in house. They will keep developing them. The problem is that eventually
00:41:01.280 one of the other companies that's out there will just catch up. And then market pressures will
00:41:06.800 obligate this leading company to release its heretofore hidden super advanced models. So
00:41:15.120 no matter how hard organizations try to hold these things back from a societal stability and
00:41:21.200 competitive advantage standpoint, like they just kind of want to keep it to themselves,
00:41:25.120 it will come out eventually. So there's, there's no, there's no long-term delaying it. I mean,
00:41:29.600 I think this at the very most you'll get 18 months, maybe just six. Yeah. Well, I love you to death,
00:41:38.160 Simone. I love you too, Malcolm. Life is weird. No, life is amazing. We are main characters in this
00:41:46.080 simulation. You know, I am, I am surprised by this. This is, this is not the life I thought I was going
00:41:52.240 to have as a kid. It is odd. I remember in the Bay Area, especially growing up with all the
00:41:56.640 singularitarians, people being like, we're going to have the singularity. It's going to happen.
00:42:00.640 And I'm like, yeah, let's don't hold your breath. Good luck. Be great if it does happen, but it's not
00:42:07.520 going to happen. Yeah. Now, someone I talked to recently was, like, worried about war with China. And I was like,
00:42:12.960 it could happen. They might attack Taiwan because, you know, they, they need to explain
00:42:18.720 why their economy is collapsing. And a very easy explanation is the U S is blockading us. And so Taiwan
00:42:24.800 provides them with the cover and the, and the ability to save face around this. But it, it,
00:42:29.360 you know, he kept wanting to come back to this. I'm like, but you understand this doesn't matter.
00:42:32.240 Like how close we are with the AI stuff to changing the way the entire global economy works. All you
00:42:37.520 need is autonomous agents for things to start changing. And we are this close. Yeah. Yeah.
00:42:45.920 It's cool. It's cool. I love you to death, Simone. I love you too. Um, so for dinner,
00:42:52.400 a couple options, one is modified salchipapas where the papas part are hash browns.
00:43:02.640 What's salchipapas? I forgot. Cut up hot dog with French fries and sauce on top,
00:43:07.920 like sriracha mayo, et cetera. The other is salchipapas, but with fried rice.
00:43:13.360 Cause I, I've thawed out some of the gourmet hot dogs. I mean, I'm going to make homemade
00:43:18.240 hot dog buns tomorrow, or at least attempt to, I'm very. Do we have any hot dog buns left?
00:43:22.880 No. So that's a tomorrow thing. Do we have any bread left?
00:43:27.760 We have white bread, like. Yeah. I want a hot dog with toasted white bread.
00:43:34.640 I just want to like. I'll make it happen. I'll make it happen. You don't want fried rice.
00:43:41.440 No, I hate salchipapas. I hate using hot dogs in anything other than hot dogs.
00:43:45.120 Well, I thought that you liked hash browns and I thought that you liked.
00:43:48.640 I do like hash browns, but I hate using hot dogs that way. It reminds me of, like, ultra poor people. Like,
00:43:55.440 hot dogs belong in Americana hot dog buns. I can make my own if I want, but they do not belong.
00:44:02.080 Look, middle, upper middle-class Peruvian food condones sliced hot dogs. Japanese cuisine and bento
00:44:10.160 boxes of the middle to upper middle class condones sliced hot dogs, as well as hot dogs slid up to
00:44:16.400 look like little octopi. You know what I'm talking about, right? Hot dogs in.
00:44:21.600 See, that's, that's white trash. That's like, literally I was listening to someone's comedy
00:44:28.240 bit and they were talking about someone asking about half of a, in this case, burger bun.
00:44:33.840 Okay. And that being the epitome of poor. And here you are giving me shame for suggesting
00:44:38.720 trendy middle-class Peruvian cuisine. Okay. Okay. We will give it a try.
00:44:45.600 I will. No, no, no. I'm no, you're getting, you're getting a slice of white bread.
00:44:49.920 No, no, no, no, no, no, no, no. That's toasted with a hot dog.
00:44:52.720 We're going to do potato salchipapas. And I, and I will see if it delights the senses. Okay. You,
00:44:59.600 you, you audience will not know, but we'll try to remember when we do the next recording.
00:45:03.680 I'll just film you looking disappointed. Do you want me to as a backup? Oh, actually,
00:45:10.160 I don't even know if we have, I'm going to have to take that back. I think I'm out of hash browns for
00:45:13.600 you. I was just digging through the freezer. So it's either fried rice or no, I'll just give you
00:45:18.400 your white bread. Just give you your white bread. You'll be happy. Again, we can do fried rice. I made
00:45:25.680 a huge batch of onions. Okay. Chopped white onion relish. And keep in mind. I don't mind if you
00:45:36.560 try something interesting with a hot dog, like cut it up and fry it a bit. I think that could be
00:45:40.240 interesting. Like stir. Well, I can stir fry it with fried rice. No, I do not want it with rice. Okay.
00:45:46.000 I am not a, a, a, a, a damn Peruvian, Simone. You will, you will refrain from giving me salchipapa.
00:45:53.520 You know, in Peru, one of the common dishes, and actually, I don't mind this dish at all,
00:45:58.080 is they take a, like a burger... not a burger bun, but burger meat. Right. And they
00:46:04.080 just like put the burger meat on top of like rice or fried rice. No, so that's not Peruvian.
00:46:08.960 That's Japanese. That's hambagu. That's Japanese food. That's not Peruvian food. Yeah. In, in Peru,
00:46:17.360 our, our two favorite forms of cuisine are chifa, which is Peruvian Chinese fusion.
00:46:22.800 I have never not gotten sick after eating chifa. Right. But in, in theory, I'm sure if we made
00:46:27.840 chifa dishes at home such that we wouldn't get food poisoning from them, we would thoroughly enjoy
00:46:34.080 them. The problem is merely that every single time you went out and got chifa, your stomach exploded.
00:46:40.720 Nikkei is Peruvian Japanese food, which honestly, in a lot of ways improves on Japanese food because
00:46:46.720 they take a lot of the Japanese derivatives and improves them. So like a great example that they
00:46:51.440 do is Nikkei and it's like, oh my God, why didn't Japanese people think of this? So they, they, they'll
00:46:55.680 make rolls. Right. And then on top of the rolls, they'll put like cheese and they use like a little
00:47:00.560 flamer to like grill the cheese. So it melts on the roll. It's really good. Yeah. Well, and what I love
00:47:05.440 about Nikkei is it takes something that I think epitomizes Japanese cuisine, which is their habit of
00:47:11.680 taking other foreign cuisines and making it way better. And it just does the same thing. It's like,
00:47:16.240 okay, I'm going to jazz right back at you. And I think that that is the best way to do cuisine.
00:47:21.600 Fusion is the best, but fusion from this very, fusion from the perspective of an autistic special
00:47:27.680 interest with high interest in aesthetics. Yeah. I mean, the main reason Peruvian food is,
00:47:32.720 is so good. Like for people who don't know, like if you're looking at like Michelin restaurants or
00:47:37.440 whatever, like Peru really, like, outdoes expectations for its population, is because it's high status
00:47:43.600 there to be a chef, in the same way it's high status in Japan. That's why. And South Korea. In South
00:47:48.560 Korea, the top professions kids want now are YouTuber and chef. Oh, really? Chef is second?
00:47:56.000 I mean, as of the time the tour guide gave us those stats, but that was in
00:48:00.800 2018. South Korea has great food though. Great. They have amazing restaurants. And I think that,
00:48:05.280 again, it's because that profession is respected. It's, it's a social cred profession. Whereas in the
00:48:12.640 United States food service is not as, not as respected. I don't know. I mean, now, I don't
00:48:19.440 know if you're aware of the trend of not, maybe not millennials, but like Gen Zers creating unlicensed
00:48:25.600 coffee shops in their homes so that they can still have artisan coffee. That's really, really fancy,
00:48:31.760 but just not pay as much for it and, or personally make money for it, which is exactly the kind of
00:48:37.680 economy I expect to see growing and become pervasive in our post AI economy. In fact, that's a really,
00:48:43.920 really good example of unlicensed local community, like high quality nerd, special interest nonsense,
00:48:50.720 but you're mostly selling to enthusiasts in your own local space. That is where people should be looking
00:48:57.280 when they think about where they want to make money in the future. I love you too, Simone.
00:49:01.440 I love you too. Bye.
00:49:03.200 Oh, by the way, speaking of like the type of jobs you'd want, I'm just thinking about like how well
00:49:07.280 our family is positioned. So let's, let's go over the jobs that my family has. I've got a little
00:49:13.200 brother at Doge, like obviously cutting down government bureaucracy, that's going to be
00:49:17.200 big. Short-term government work. It's not going to last him forever, but he doesn't need.
00:49:21.040 Yeah. Who's running an AI company. He founded an AI company that makes movies for like Hollywood.
00:49:26.240 It's the one that did the, the AIs in what was it? Oh yeah. The, what's it called?
00:49:31.280 Here or something like that. Yeah, Here, Here, the movie with Tom Hanks.
00:49:36.960 Did it ever come out? Yeah, I think it's been out. Like it was out, it came out a long time ago.
00:49:41.600 Okay. Yeah. But here, but like, obviously that could just replace all of Hollywood soon.
00:49:45.280 My, my cousin, you, you, I won't give names. She's now running an autonomous trucking company.
00:49:52.240 So cool. Cool. But you like, obviously like, like all his family is like, we will all position
00:49:57.600 ourselves in jobs. They get it. They get it. They're not messing around. They're playing
00:50:01.440 for keeps. Oh, not another one. Two others run large investment firms, but yeah, they're,
00:50:06.240 they're really playing for keeps. And I appreciate that. Like, I'm like, okay, okay. Like it turns out
00:50:12.080 that if you have my genes, like, I'm like, what are the chances that my kids with my genes will
00:50:16.400 actually make the right decisions? I'm like, well, I think importantly, and this is something I would
00:50:20.400 advise all parents to do. Our, our son is now sensitive to the concept of money because he
00:50:24.640 wants things. And we're like, well, you're going to have to get money to buy it. And then he came home
00:50:28.240 one day and said, I need to get a job. And I explained to him that jobs will not exist in the near future.
00:50:34.560 And he's going to have to figure out how to make or do things that people want to pay for period. And I think that
00:50:40.480 that's something really important to inculcate your children with that. They shouldn't expect to
00:50:45.360 grow up and get a job. Instead, they have to figure out what people will want to pay for and give it to
00:50:51.440 them. Yeah. These things are important, but anyway, it looks like I'm not going to be able to placate
00:51:00.720 her much longer with home videos, which is what I do when she just won't stop. I'll just, like, play them.
00:51:05.520 She just wants to see her siblings often. Yeah. She just wants to see you guys fight and play.
00:51:10.480 And have fun, and it calms her down. So it's good. It's better than drinking the CoComelon.
00:51:16.640 Taking her from you. You'll need to produce her replacement.
00:51:20.480 I'm working on it, but she's my special girl. So
00:51:26.080 All right. I'm going to go make your white bread dinner. And I love you.
00:51:32.240 Shut your face. I'm never going to shut my face.
00:51:35.760 So my wife is unironically trying to make homemade hot dog buns.
00:51:39.200 We don't have any.
00:51:40.960 Simone, that's so over the top.
00:51:43.120 No, it's not.
00:51:44.560 What type of dough is this that you're using?
00:51:48.480 Flour and yeast and water, milk, flour.
00:51:50.960 Okay. Is it just normal?
00:51:52.480 Yeah. Like hot dog bun dough. I don't know.
00:51:56.240 But this is the egg wash and hopefully it makes them look nice.
00:52:00.960 They look not great now.
00:52:06.160 Hey, Toasty, what are you, what are you doing?
00:52:08.480 Don't take that. It'll fall. If you take that, put it back.
00:52:11.120 You just don't care about rules, do you?
00:52:28.560 Okay, Octavian.
00:52:30.480 Have you given her a couple of the army men to play with?
00:52:32.640 I'm cleaning up.
00:52:37.600 That's pretty smart.
00:52:55.600 Okay.