AI Killed Job Security: How to Survive in a Post-AI Workforce
Episode Stats
Words per Minute
194
Summary
In this episode, Simone and I talk about the end of the 9-to-5 job, why today's kids need to be raised for a post-job economy, and what parents can do now to set their kids up to be financially stable adults in it.
Transcript
00:00:00.000
Hello, Simone. I'm excited to be here with you today. Today, we are going to be discussing the
00:00:04.640
fact that if you are young today, or you are raising kids today, you need to be raising them
00:00:10.260
for a post-job economy. It means making very different decisions about the way that you help
00:00:18.260
them prep for, well, being financially stable as an adult. And it also matters for your own
00:00:25.380
decisions; I think for a lot of us, we will be in a post-job world. What do you think, 25 years?
00:00:32.720
Yeah. Well, the future is here. It's just not evenly distributed. I think there are many people
00:00:37.460
now who are losing their jobs and will never get one again, period. And here, when I say job,
00:00:43.280
I mean traditional nine to five, like recurring revenue. Simone, do you want to go into what is
00:00:50.640
causing the end of the concept of a job? And I'll note here first, for a lot of people who think
00:00:55.200
this is an insane thing to say, jobs are a fairly recent invention. They were really only invented
00:01:00.020
as a mass thing in the 1920s. And this concept of having a lifelong job and getting a gold
00:01:07.200
watch and having a pension was such a short-lived phenomenon. It's hard for me to imagine
00:01:13.240
how we ever got the impression that it was going to be that way forever.
00:01:20.400
Yeah. So I think already for a long period of time, we've seen a very slow easing into this,
00:01:26.040
and it hasn't just been about AI. And I think that also the way that job reports are coming in,
00:01:31.360
there's a lot of underreporting because obviously they're not reporting people who've stopped looking
00:01:36.320
for a job entirely. And the number of people who are living now by gig work and piecing together a
00:01:41.060
lot of jobs, including short-term jobs, is just so high now. And I don't count those as jobs,
00:01:47.720
and I don't think most people count them as like long-term jobs either because they're not
00:01:51.600
sustainable, they get laid off all the time, or they change jobs all the time.
00:01:56.040
So, I want to elevate something you said here, because I think that a lot of people might miss
00:01:59.960
this. And it's this: what was the economic pressure that led to the concept of a nine-to-five job,
00:02:06.340
and that stopped existing long before AI? The reason you had quote-unquote jobs, and the skyscrapers
00:02:14.380
that housed these people back in the day, was largely the difficulty of long-distance
00:02:20.160
communication. So if I was a company and I wanted somebody who was competent at X task, and I
00:02:28.820
wanted to be sure I had somebody who was competent at X task this year, and I had somebody who was
00:02:32.860
competent at X task in 10 years or five years or next year, the most cost-efficient way to get one of
00:02:40.600
those people was to source a competent sort of blank slate and train them in that task. Maybe
00:02:48.380
somebody who had a bit of training in that task to begin with. And so the way I would do this is I
00:02:53.200
would go to the most elite universities or the universities in my area as sort of an authentication
00:02:57.140
mechanism. I would find individuals who are graduating and I would say, okay,
00:03:03.060
how smart are you generally? And I can get that from their GPA.
00:03:06.420
Okay. And then I would hire them. And then I would train them up as much as I could for a specific
00:03:11.520
task. And you saw a lot of rotation programs start for this, for things like management jobs.
00:03:15.320
But that was the idea. It was not that these companies cared about people.
00:03:20.280
It was that it was efficient. And this is why you had the gold watch.
00:03:24.980
Well, and more than that, it was worth it for them to invest in training, which is another
00:03:29.840
clear sign, I think, to many people who already intuit it, that permanent, stable jobs are gone
00:03:35.600
forever. No one trains anyone anymore. This idea of apprenticeship and investing in someone's
00:03:40.220
training is just non-existent. People expect you to come out of the box, completely ready to go on
00:03:44.600
day one with absolutely zero support. This isn't totally true. In slow-moving organizations,
00:03:48.920
rotational programs still exist and stuff like that, but they're being phased out.
00:03:52.320
They're totally being phased out. And I feel like they didn't really have high efficacy.
00:03:57.420
No, they didn't. So this worked because if you were trained in this stuff, in most cities
00:04:06.260
there really wasn't that much mobility. You might be able to go work for one other
00:04:11.200
competitor or something like that, but there was a huge reason to stay with the company you were with
00:04:15.440
because of that, but also because of the pensions and the benefits and stuff like that.
00:04:20.260
They worked it into their systems so that a lot of the benefits of working for them required working
00:04:26.440
for them a long time. So they were aligning the interests here. They invest in you, you invest in
00:04:32.840
them. And it was worth it for this company to invest in this person because they could reasonably expect
00:04:37.560
this person to stick around for the duration of their career. Otherwise it didn't make any sense to
00:04:42.860
train someone, which, again, is not the company's fault, by the way, that they're not
00:04:47.280
training people. People are so flaky. It is an active risk to even think about training
00:04:54.040
someone. And then something happened, and it wasn't AI. This is where you have the charts, and you can
00:04:59.720
watch our video on this, where we go into it in a lot more detail. What actually happened, I think it
00:05:02.660
was 1976, is that you see all the charts begin to diverge in terms of income data and
00:05:07.740
salary growth and everything. And what it was, was the big rise of outsourcing that
00:05:12.560
started. Phones became common and the internet became common; those were the two things that happened.
00:05:19.400
And that meant that if I am a company and I want somebody who is skilled in, let's say accounting.
00:05:25.800
Okay. And I want to, as all companies intrinsically do, pay the least I can to get somebody who is
00:05:31.400
broadly competent in this. Given that living standards in Africa or India or the Philippines are
00:05:37.900
dramatically less than they are in the United States, I can hire somebody who might do a slightly
00:05:43.520
worse job, might have slightly fewer connections in the office or be slightly less able to connect
00:05:47.820
with people, but I'm paying one-tenth of what I'm paying somebody in the U.S. Why would I pay
00:05:54.380
somebody in the U.S. when I can pay one-tenth of that? Yeah. And this is what
00:06:00.540
led to the collapse of global poverty. This wasn't all bad. If you look at poverty rates worldwide,
00:06:05.060
severe poverty rates begin to collapse at the same time as U.S. salaries begin to stagnate.
00:06:10.820
And this happened for a long, long time. But there were still roles that
00:06:16.980
made sense. There was still bureaucratic inertia within the United States and some other countries
00:06:23.020
in the developed world. AI changes that entirely. Yeah. Well, I think first there's a couple
00:06:29.200
of things that happened. Even before AI started happening, just general tech and automation
00:06:33.800
have been progressively changing this over time. It's not like there's a before-AI and an
00:06:39.180
after-AI period. There's a before email and after email, a before and after all these different
00:06:43.400
types of software and CRMs and automations that enabled people to do the work of many other people.
00:06:50.860
And that's also why there's this trend that started, I think, beginning in the nineties
00:06:55.440
and then onward, where you heard more and more people being like, I'm now expected to do the job of
00:06:59.880
three people, this is so stressful. But not really, because they had software to support them.
00:07:05.260
It's kind of like how housewives got like dishwashers and refrigerators and clothing washers,
00:07:12.420
and they were able to basically do all their chores in very little time.
00:07:16.340
All of this happened with the introduction of Zoom as well. And that was the other big thing: Zoom
00:07:21.120
plus COVID even disintermediated the concept of a corporate environment.
00:07:25.980
But then on top of that, and this really, I think, started to shift with Twitter, so only very recently,
00:07:33.720
there was the realization that, oh my gosh, my giant corporation probably has an excess of 60 to 80%
00:07:43.320
of employees. Like, I could thoughtfully eliminate 60 to 80% of them and be just as
00:07:51.260
functional, if not a little bit more functional.
00:07:53.540
Yeah, what she's talking about is Twitter slash X, you know, when it axed, what was it, 98%?
00:07:59.760
80% of its staff. There were all these doomsayers who said, oh, this is going to crash the corporation,
00:08:04.780
but it didn't. It functions as well as it ever did now.
00:08:08.240
No, I mean better. Look at Grok, everyone's favorite AI right now.
00:08:11.420
Yeah, you're right. It is just strictly better than it used to be in every
00:08:15.740
conceivable way. And a lot of the corporate world took that lesson away from this. And
00:08:20.580
we've seen massive firings since then across the corporate world. What were you
00:08:27.280
going to say, Simone? Well, yeah. And I think Twitter was the one that proved it to people.
00:08:31.920
And that's why the firings are taking place in very subtle ways, I think,
00:08:36.240
to sort of reduce alarmism, and news organizations are struggling. So they're doing
00:08:41.340
things like ordering people to come back into the office and making new requirements.
00:08:45.780
Yeah, but that's not because corporate America makes sense anymore. Everybody knows corporate
00:08:49.780
America is pointless. They ask you to come back in so they can fire the ones who don't.
00:08:52.940
Yeah. It's not like people think that office work is exceptional, with, I guess,
00:08:59.240
the exception of some people like Elon Musk. But I mean, I can get it with his work,
00:09:04.020
like with Tesla and SpaceX, where you have people in factories; you need people there.
00:09:07.460
So I get it. Same with hospitals, you can't have everyone remote working.
00:09:11.580
But anyway, all of these things have already happened. I would already, as a parent, be telling
00:09:16.840
my kids, which is what happens now. So we have this son named Octavian who's old enough to understand
00:09:21.040
money and who wants it. He wants it bad. He's like, I need money. Where can I get money? And
00:09:26.600
he knows that we work to earn money and therefore pay for his things. So he's like, okay, well, you work,
00:09:32.140
I need a job. Where am I going to get a job? I need to go to work. And we're like, no, you're never
00:09:38.000
going to get a job. You have to build your own company. You have to make your own money. You can't
00:09:42.620
go somewhere and get a job and get paid. That is over now. You may see people doing that, but that
00:09:47.820
is the end of an institution that you will never get to participate in, which isn't 100% true.
00:09:52.140
Obviously there are going to be salaried jobs for big organizations always, but I just don't.
00:09:57.560
I think you might be, and I think a lot of people might be shocked by how rare they become.
00:10:02.260
Well, not just rare, but, and it sounds wrong to say this,
00:10:07.020
kind of a form of slavery, where you're just living paycheck to paycheck. You're
00:10:11.360
in debt. You don't have stable income. You're being jerked around.
00:10:16.780
You're not happy. You're stressed out. I think it's just going to get progressively worse
00:10:20.500
in most cases. This is like the old factory towns where, you know, if people remember, this was
00:10:25.480
during the time of slavery in the South and stuff like that; in the North,
00:10:28.280
you had the factories where you'd have big lines of people waiting to get a job. And if you didn't
00:10:32.440
accept the abuse, if you didn't accept the poor standards, they were just like, here's the line,
00:10:36.740
and they would keep it there so you could see that one of these people would take your job for less,
00:10:40.580
because look, they're all starving. And when you have access to anyone in the world to
00:10:44.420
replace people, that was originally the thing. But with AI, what really needs to be emphasized
00:10:49.220
is look at how advanced it is. The people I know who complain about AI and are like, it's really not
00:10:55.420
that good at thinking, it's really not that good at this or that, these are people who have not interacted with
00:11:00.060
AI recently. They're generally like, I tried it when it came out two years ago. And I'm
00:11:04.940
like, then try it again. Try Grok again. Try Claude again. The paid models, the good models.
00:11:11.420
And what are these people even entering? I imagine they're just doing things like,
00:11:15.960
make me a sandwich or something. And the AI is like, well, I can't do that.
00:11:20.440
Yeah. They're not like, here's my taxes. Do them for me.
00:11:26.360
Well, no, it's not far from a lot of that stuff. You know, we use it for that all the time.
00:11:30.720
Yeah. I think: hey, we need to write an executive order for the president, can you draft one for me
00:11:35.860
on this topic? Or: hey, I need help with my taxes, all the time. It's
00:11:41.460
actually incredible with anything sort of tax or government or statute related,
00:11:45.720
because it can translate governmentese into human, which is really helpful.
00:11:51.160
Or I can go to Grok, and I went to Grok today and I was like,
00:11:54.380
what episodes would listeners of Based Camp with Malcolm and Simone likely like to have an
00:11:58.820
answer on? And it'll give me great suggestions. Or Perplexity,
00:12:05.920
which we use a ton with our shows; instead of hiring a researcher, I'm just like,
00:12:09.340
give me links to everything on this particular subject. Or the art that we use in our title
00:12:14.720
cards. That's something we might have outsourced in the past. And yet every one of them is done
00:12:19.000
with an AI, sometimes multiple AIs. So to use AI well, you really need
00:12:24.840
to know all the systems for a specific task. So it should go without saying that basically
00:12:28.400
AGI is well on its way. And though historically, for at least the past 20 years, we have been seeing
00:12:35.740
the downfall of the job as technology has made more and more human action obsolete.
00:12:41.100
Now, even the most elite human action is going to swiftly become obsolete. And if you ask AI about
00:12:47.060
it, AI is even going to vastly understate how much AGI is going to take over human jobs. For example,
00:12:54.640
if you ask, and I did: I asked ChatGPT, I asked Claude, I asked Grok, and I asked Perplexity,
00:12:59.920
all about what they thought the only remaining human jobs would be, or the remaining
00:13:05.340
areas where humans could actually have successful companies and do successful work vis-a-vis AI.
00:13:11.860
And they're like, oh, well, I think that things like counseling and therapy and the creative arts
00:13:17.760
are things that really only humans can do. Bullshit. You know, the most popular chat companion on most
00:13:26.340
of the AI chat sites is a therapist. Yeah. And everyone loves it a lot more than human therapists.
00:13:32.600
And what was the first thing that AI really nailed? It was art.
00:13:40.300
And I love this complaint when people are like, AI just gives the most average
00:13:45.720
answer possible because they're token predictors. And I'm like, is that not what you want? Like
00:13:50.260
from a therapist, do I not want the average of what all therapists might do?
00:13:54.000
Well, they actually give the most pleasing answer. And I don't know if it was Scott Alexander or someone
00:13:57.680
else, but I think it was Scott Alexander who either had a test, or
00:14:01.540
offered a test, or took a test that compared famous historical works with basically AI
00:14:08.400
equivalents of those works. And a lot of people basically just weren't able to tell
00:14:12.200
the difference. I was instantly able to tell the difference. And all I had to do was look
00:14:16.260
at the two paintings and think of which one was just a little bit more perfect.
00:14:20.640
Well, hold on. I have another one: there was a great study done on people who said they
00:14:24.740
hated AI art. And it turned out that people who say that they hate AI art, when presented
00:14:29.940
with two pieces of art, if they don't know which one was created with AI, persistently...
00:14:36.800
Of course. Because when you compare, you know, a piece side by side.
00:14:39.640
People are just willfully ignorant is really the point. They have this belief that the world
00:14:44.640
isn't changing that much or that things can't really change that much or that technology can't
00:14:49.480
be that good. And it's just weird. It's almost like the opposite of
00:14:54.280
a Luddite perspective; it's an I-don't-engage-with-the-world perspective. And it's
00:14:59.920
bizarre to me. But those of us who have kids really do need to think about our
00:15:04.080
kids and how we're raising them. We need to accept that this is the reality for them
00:15:08.640
and that AIs are going to be better than humans at most tasks.
00:15:13.720
And so as they think about, what does it mean to educate myself? The very last thing
00:15:22.040
you want to be as a human today is somebody who is outputting the average answer.
00:15:26.400
Yeah. Being mediocre and just generally good at things is really bad.
00:15:32.780
Yeah. Actually, I think I took notes here. Grok put it best. Let me pull up what
00:15:37.640
Grok said as a general overview. Humans will still hold an edge in areas rooted in emotional
00:15:45.380
connection, physical presence, cultural nuance, and subjective experience. It thinks, basically,
00:15:51.520
that in a lot of areas... No, it's not. It's not. Grok wrote: niche is king. AI will dominate mass
00:15:59.900
markets. So focus on small, passionate audiences willing to pay for human expertise or connection;
00:16:04.800
use AIs, obviously, the whole time to do this; build relationships; and start early.
00:16:09.460
So the general conclusion that I came to after talking with the four major AIs and
00:16:15.620
getting their insights on where they think humans really matter is, well, here's what I think
00:16:20.100
we should do with our kids. And you can critique what I think we should do. So one, our kids need
00:16:24.880
to grow up AI native. Their friends should be AI. They need to use AI to solve every problem that's out
00:16:29.360
there. Like everything they do leans on AI. They grew up with AI. They are integrated with AI.
00:16:33.980
Right. The other thing is, and this is sort of on the other end of that spectrum,
00:16:38.740
I really want them to apprentice with a contractor or fixer or plumber or electrician or pest control
00:16:46.600
person, to learn physical problem solving and get used to that, because that is going
00:16:52.680
to be a really important component later on in their careers, whether or not they just want to fall back
00:16:57.340
on being a pest control person or an electrician. Because, to the point of AI, that
00:17:02.380
is going to be an area where there are still going to be jobs forever, where you can start
00:17:07.100
your own business to do that. So that's really important: find that person to apprentice with
00:17:10.740
and help them out, like, for free. And that's a key part of your education. Then, from an early age,
00:17:16.120
and I'm so glad our kids are so thirsty for an online presence, you need to build a strong public
00:17:22.000
online presence and a clear character that could plausibly jump into and make sense in a lot of
00:17:28.720
different niches. So you should be online. You should be well-documented. You should be
00:17:35.080
exchanging content, doing appearances with other people on lots of different platforms,
00:17:40.580
and then build a strong personal network with other high-agency people who are doing the same thing,
00:17:46.060
because this is the key to any entrepreneurial thing you do taking off. Because in the end,
00:17:52.320
people are only going to work with humans because those humans have cachet, because those humans have
00:17:59.740
recommendations, because there's something special about them. AI is going to do everything better.
00:18:04.220
So the only reason they're working with you is because you're a celebrity in your niche. And so you have
00:18:07.880
to be lumpy and weird. And then also be the one to start the parties in your networks,
00:18:12.980
so be the one to convene vacations and events and things like that. Because one of the areas where AI
00:18:18.560
did point out humans could do fairly well is in human gatherings and events and like sort of
00:18:24.700
cultural things. So you could just be an event organizer. You could be a retreat organizer. You could
00:18:29.100
become someone who organizes these communities that people still pay money to be a part of because
00:18:35.380
they need to feel that connection. And again, it could be like really nerdy stuff. Like I watched this
00:18:41.280
one YouTuber who just reviews conventions and there are so many niche conventions, like for
00:18:47.720
some specific subgenre of fantasy romance where they have balls and workshops,
00:18:55.540
and people make money off this, right? So you can own that niche, but you have to build up
00:18:59.760
a reputation to do so. And then I think the most important thing, and this will probably manifest in
00:19:04.620
a lot of different businesses that our kids will create throughout their careers as entrepreneurs,
00:19:08.340
so they're basically going to create micro-company after micro-company after micro-company,
00:19:12.100
is to learn how to physically build something that people might want made for them by a human.
00:19:18.200
So it could be, I think Grok gave the example of vegan leather shoemakers for cyclists,
00:19:23.220
as one example. But I was also thinking custom drones, custom home security systems,
00:19:29.740
bespoke AI companions, like giving the right prompts and then literally crafting...
00:19:38.740
Wait, should they make the AI girlfriends that replace everybody?
00:19:41.940
Yeah. Like, literally make AI girlfriends, make AI caretakers, being someone who custom...
00:19:48.460
Well, no, I think that's going to be a thing: people who make AIs, in the
00:19:53.140
same way you have VTuber creators now who are good VTuber designers.
00:19:57.160
They're going to be good designers of AI celebrities.
00:19:59.740
Well, yeah. And maybe everyone's going to have an AI slap drone, but some
00:20:04.500
people are going to have a pink AI slap drone that holds a parasol for them and speaks in
00:20:08.940
a British accent. Like, who's going to do that? Sorry. People will come to that person. Or,
00:20:15.300
I think there's also going to be a lot of interest in... girl, girl, what are you doing?
00:20:22.180
I think there's also going to be a lot of interest in, and the AI agreed with me on this,
00:20:26.260
in historical ways of building houses and furniture and clothing and even fabrics,
00:20:33.980
like artisan fabric making, even now. So the people who are most excited about
00:20:39.160
Loro Piana, for example, which is sort of the if-you-know-you-know fancy person brand,
00:20:44.460
they're not buying Louis Vuitton or Chanel or anything. It's Loro
00:20:48.220
Piana. And it's because they're kind of obsessed with this artisan fiber material
00:20:53.120
sourcing and weaving. And it's like, oh, well, the material and the artisan.
00:20:57.340
And I think in the future, it's going to become the niche version of that: you are having
00:21:04.160
a niche celebrity do it. So a niche celebrity tattoo artist, a niche celebrity whatever. And so,
00:21:09.960
with our kids, that's why it's really important that they both get really good with AI and build a
00:21:14.420
strong public persona and network, but also really learn physical problem solving early on
00:21:21.020
and are really comfortable with robotics, with electronics, with things like that.
00:21:27.820
We do, but our kids aren't yet old enough to apprentice, for example, with John,
00:21:32.060
who I think it'd be amazing for them to apprentice with, because John,
00:21:35.740
John is our neighbor and also basically family to us. And he has a business.
00:21:41.080
Yeah. He has a business that does all sorts of things from like landscaping to hardscaping to
00:21:45.520
tree management and like fixing things in homes and all sorts of things like that. And oh my gosh,
00:21:49.640
to get our kids exposed to that would be so helpful because you have to learn those skills.
00:21:54.560
There's one other element of this that I think will be for some of our kids' lumpy interests,
00:21:58.760
which is STEM. And I think it was Claude that pointed out to me that one thing AGI will be unleashing
00:22:06.540
is whole new scientific fields and new discoveries that smart and entrepreneurial people can act on
00:22:15.360
in ways that AI, just kind of in isolation, maybe can't necessarily, because you really need to connect
00:22:21.620
breakthrough technology with high agency people who make it a thing, right? Like if there's some-
00:22:27.820
We changed the Collins Institute to break out all human knowledge. And instead of what
00:22:31.860
we used to have, multiple-choice tests to pass a skill, now it's a Socratic AI tutor.
00:22:36.300
That you can use in Socratic mode or to test you.
00:22:39.380
So let's say that, you know, AGI discovers some amazing new therapeutic. Well, if our kid
00:22:45.620
also then is really well connected with, you know, great investors and other scientists and things
00:22:51.160
like that, they can start to market and sell customized versions of this therapeutic to other
00:22:58.080
people within the network. I don't care at this point if it's regulated or not, because that's
00:23:01.880
not going to matter in the future. You know, like, what, catch me? Because you're going to have,
00:23:06.000
basically, because we also know that AGI is going to create a very, very, very wealthy
00:23:13.640
0.0001%, and then everyone else. And we need to be popular within that wealthy
00:23:19.380
network. That's the other thing: being that type of person. If what you're selling to the 0.0001%
00:23:25.120
is illegal, because they're richer than Croesus, it just doesn't matter. They have ways of hiding
00:23:30.420
that, and so will you, using AI. Sure. What am I missing?
00:23:36.300
No, I largely agree with everything you're saying there, to be honest. I think, you know,
00:23:40.980
building community that people want to be affiliated with is really, really important in where we're
00:23:46.820
going, and so is focusing on community in terms of how you're selling and how you're thinking.
00:23:52.440
With our kids, what's going to be important for them is this: in the previous generation,
00:23:58.420
you could just go to an elite school, like you and I did with Stanford and Cambridge, and then there
00:24:02.440
were certain networks that you were like automatically in. That's not going to be true
00:24:06.660
for this next generation. That means these people who are investing in these Harvard degrees
00:24:11.800
and stuff now, they're not going to matter as much as they did historically, at all. They might even be
00:24:17.640
seen as a negative in a lot of communities, because they're seen as, you know, trying to split the
00:24:22.080
difference, and as potentially invalidating of an individual. And I think that's something that
00:24:26.200
is not being accounted for as much in people who are now like,
00:24:32.060
yeah, but it's better to get the Harvard degree to be safe. I'm like, you don't know if in 20 years
00:24:38.260
that will be seen as a strict negative within positions of power within our society. There's
00:24:42.800
already a lot of suspicion around stuff like that. I would not be surprised. And I'm talking about the
00:24:48.280
next generation now, not this generation. I think on the margin, if you've gotten in,
00:24:52.660
it's probably worth it, but for the next generation, I'd say my best guess is it's probably going to
00:24:58.860
hurt your career prospects even if you could have gotten in, and that you're much better off
00:25:03.180
focusing on building communities within online environments or social networks.
00:25:07.060
I think it might be more of a liability, because the people scrutinizing you are going to ask,
00:25:14.600
didn't you have something better to do? Like, with AGI, and even with AI as it is today,
00:25:20.880
this idea that you would slow down your life by three to four years just to get a passport stamped.
00:25:30.500
Yeah. It's suspicious, and it implies a lack of drive, a lack of ingenuity,
00:25:37.460
a lack of being willing to take initiative and try to build something yourself. And for that
00:25:44.120
reason, I would view it with suspicion in the near future. I get why people do it now, because we
00:25:50.720
still live in this very brackish world where some people are already fully steeped in the future,
00:25:55.780
but many people still are living by the old rules, and you still, in those communities and those...
00:26:05.320
Well, somebody will say something when they look at the Collins Institute; they'll be like, well,
00:26:09.420
okay, it may be in every conceivable way better than a traditional education at this point.
00:26:13.200
I love the people who are like, why aren't your kids using it yet? They can't read yet, okay?
00:26:15.940
We're building a Collins Institute version 2.0 that'll have a part of it for kids who can't
00:26:20.360
read yet, that'll do voice with AI and everything like that. That'll be there eventually. It's not
00:26:23.480
there yet. But my gosh, is it a school for kids who can't read good, but want to?
00:26:28.460
Yes. But it's the version of better in every conceivable way, except people will say, well,
00:26:34.220
what if the kid doesn't have self-motivation, you know, doesn't have the ability to derive
00:26:39.160
income from the coursework? Well, then they can just sit back and rot with their AI girlfriend forever.
00:26:44.780
Yeah. I was like, that's the core thing you are responsible for teaching and imparting to them
00:26:49.260
as the parents. No, and the traditional school system isn't going to impart that. If their
00:26:55.240
motivation is just, oh, well, I'll get in trouble if I don't do this, that's not going to exist when
00:27:00.400
they graduate. Nope. Because you'll just take your UBI and go, you know, die in the woods.
00:27:08.680
Yeah. I'm really concerned about UBI. I really am.
00:27:13.080
Well, I want to do an episode on this particular topic because I've looked at how UBI has destroyed
00:27:17.840
communities that have gotten it intergenerationally. And we have examples of this in things like
00:27:21.200
Native American communities, and wherever it's happened intergenerationally, the community
00:27:25.820
has just rotted. We're talking, you know, only one in four people is employed.
00:27:31.100
I mean, hell, look at trust fund kids, the percentage of them that
00:27:35.440
wash out. We didn't even need to look at the disadvantaged communities, because everyone's
00:27:40.180
always going to argue, well, oh, but they were facing systemic disadvantages. No, look at
00:27:44.820
the people who descended from arguably the best privilege, every advantage, all the money,
00:27:52.280
all the connections. And I have a lot of friends in this group; a lot of them ended up nobodies.
00:27:55.820
Like the majority ended up nobodies, because they expected that it would continue and not...
00:28:01.820
Well, no, no, no. I think especially those who had no forcing function
00:28:06.520
to try. I would argue that those who were not given the luxury of living off a trust fund...
00:28:19.140
It's the ones who had the trust fund that washed out.
00:28:21.560
The ones, yeah. Or the ones whose parents, like, fucked up at some point.
00:28:26.960
That's why I loved your family's basically unspoken tradition of,
00:28:31.080
oh, are you not doing well? Well, don't come home.
00:28:33.680
Yeah. Like, we don't help the ones who don't do well. No, but it actually is really
00:28:38.280
important. And we see this among our network. If I look at what's typically the profile
00:28:42.600
of someone who's actually super successful, they're typically from a family that was super
00:28:47.240
successful but then lost all their money or something like that, or for whatever reason
00:28:51.800
disowned them or something like that. That's the general pattern. Like Elon, right? Elon's dad was
00:28:56.760
fairly successful, but he never really got any benefits from that because his dad is such a skeezy
00:29:01.400
person. Am I allowed to say that his dad is a skeezy person? I think that's generally accepted.
00:29:06.220
Marrying your stepdaughter puts you into skeezy territory. Yeah. Yeah.
00:29:12.120
But yeah. And I think that, hugely, if his dad had actually owned emerald
00:29:17.220
mines that he could have milked, he probably wouldn't have ended up anywhere near as successful
00:29:21.620
as he is. And... He didn't have as much motivation. Yeah.
00:29:25.300
Yeah. And this isn't just a US thing. I'm thinking about the friends we have who are
00:29:29.040
successful in Latin America, the friends we have who are successful in Asia. It's a very similar
00:29:33.280
pattern in this existing economic system. And I think that this pattern will reinforce itself in the
00:29:38.320
next economic system. There is nothing worse that you can give someone than a safety net.
00:29:46.300
I mean, yeah, there are exceptions. You know, I think that sometimes there are
00:29:50.820
unforeseen events and acts of God that can throw you back. And I think it's, you know,
00:29:58.020
very case by case in terms of a family deciding to step in or not. And there are absolutely legs up
00:30:04.300
that can make a world of difference for someone. And those types of advantages really
00:30:09.500
vary, so it depends on the time and the context. So I do think that occasional
00:30:15.280
hands at the back, you know, like spotting hands, are good. I just think that a guaranteed spotting hand...
00:30:25.560
Kind of like when you're rock climbing, you shouldn't feel any slack. Sorry, I mean, you should feel
00:30:31.320
nothing but slack. And if you fall, you're going to fall for a while before the rope catches you.
00:30:40.320
Well, I really hated that when rock climbing. I'd be like, pull up the slack, pull up. Like, I basically
00:30:44.020
wanted to feel like someone was literally pulling me up the rock wall I was climbing, because
00:30:49.000
that's how it feels comfortable. And that's true. Trust fund kids are just being
00:30:52.720
hoisted up. You're just hanging in midair, just being hoisted up.
00:30:56.080
Well, I mean, that's what happened to people in the DEI networks. That's what was happening...
00:31:01.700
Yeah. So who's going to try? You're not going to climb if someone's hoisting you; that would
00:31:05.780
be stupid. People are behaving logically. It's sad.
00:31:11.480
Well, and how do you, when UBI happens, do you tell our kids not to take it?
00:31:16.840
I mean, they'll take it, but they'll invest it in their businesses.
00:31:21.020
The idea of using that to support yourself is sad, but yeah, we're going to need
00:31:25.680
to invest. We need to make a maker studio. We could maybe put up a little barn.
00:31:32.240
Our kids need to get used to building artisanal physical objects really early on and
00:31:40.060
find weird enthusiast communities that get excited about them.
00:31:43.620
If you're watching this, by the way, and you have kids around our kids' age, or even, you
00:31:48.120
know, you're planning to have kids in the near future, and you want to be on our email
00:31:50.320
list for parents to do get-togethers and stuff like that, because we're going to
00:31:53.620
eventually do yearly summer camps and stuff to build, you know, collective culture
00:31:58.760
networks and stuff like that, let us know. And we can help you on those fronts.
00:32:05.720
I love you to death, Simone. What am I doing for dinner tonight?
00:32:10.240
Per your request, we are doing more of the chicken curry with garlic naan, but what type
00:32:18.460
Laal maas. Hold on. See, it was a lamb dish, but then we made it with chicken.
00:32:26.040
Oh, well, let's throw in more of the, yeah, that super red spice.
00:32:37.080
Laal maas. Okay. Well, I'd put in some chili oil.
00:32:39.960
It's a Rajasthani mutton curry known for its fiery red color and rich flavors.
00:32:46.100
Some chili oil. Okay. Do you know where the chili oil is?
00:32:49.840
I thought that you, that gave you digestive discomfort.
00:32:55.880
Chili oil and we'll try. No, and I note, I didn't.
00:32:58.640
So not the red chili powder. You want chili oil instead.
00:33:01.100
I didn't say the, the spicy sauce, the tampon sauce or whatever it's called, but chili oil,
00:33:06.380
just labeled chili oil. It's like a red oil. So a little bit.
00:33:15.060
That's a specific flavor that I think is what's causing the digestive problems.
00:33:25.680
You sure you want me to just put in the red powder?
00:33:28.760
Chili oil and a bit of the red powder, but not a ton of the red powder.
00:33:38.920
Storming the castle. What are you talking about?
00:33:41.220
Our kids, by the way, are rolling down the hill over and over again right now.
00:33:51.360
Because they go fast. We've got a hill by our house. It's like paved.
00:33:54.580
And I remember seeing them the first time. I was like, they're going to die.
00:34:03.080
Yeah, I remember the first time I learned not to play it too fast and loose on my bike because I was leaning into turns.
00:34:11.180
Oh, I don't want to hear about this, but I will tell you this, Simone.
00:34:14.120
They won't learn. You think an injury is going to teach these kids anything?
00:34:17.360
I don't know. When you scrape off half of your skin.
00:34:23.560
You'll learn. You'll learn. I need to get better bandages for that. I need to prepare.
00:34:28.900
We had this like, you want that plastic kind that goes over the open wound.
00:34:35.720
Are we going to, when do I need to start taking them into the woods more?
00:34:39.200
They already found it very formative. Remember, Octavian?
00:34:42.920
After the swamp marigolds come out, I would say.
00:34:46.520
Oh yeah, that's a great time. We can actually go to like the deep swamp area and I could let them out one night for like a witch day.
00:34:53.180
Where is that hiking trail that has the witch swamp?
00:34:56.620
When we were walking, oh no, it's right by Pauling's farm.
00:35:04.680
Should I, should I have them go give them like high boots and stuff and go explore the witch swamp?
00:35:10.500
Octavian, remember, I took him in the woods and he made drawings of it for a while afterward. Remember, he was like, oh.
00:35:15.680
Oh yeah, this is me going to the woods with daddy.
00:35:19.340
But then there were those instances of Octavian deciding he was finished before the family was.
00:35:30.540
Like Octavian comes home and I'm like, where's Torsten? Where's Titan? Where's daddy?
00:35:33.880
He ran home while I was in the woods and I was panicking trying to find him.
00:35:36.660
Yeah, you're out there like looking for him and he's home just chilling, helping me make dinner.
00:35:43.820
Like daddy said it's okay. I did not say it's okay.
00:35:53.220
Clearly, this is the thing. We get in trouble for barely beating our children.
00:35:57.140
And I think the problem is, I just don't beat them severely enough.
00:36:11.440
None of them have gotten seriously injured even once. It's kind of surprising.
00:36:14.480
It's a huge relief. I really don't want them to get hurt.
00:36:16.600
Yeah, I don't want to deal with that. But, you know, I don't want to raise pussies.
00:36:24.800
At any rate, I will go start your dinner and I love you very much as it happens.
00:36:33.180
God, Reddit's become such a cesspit. Every time.
00:36:36.200
I had tried to prep a video. This is actually really interesting.
00:36:39.280
I wanted to learn about, like, people who are doing, like, VR dating and everything like that.
00:36:45.480
Like, with their AI boyfriends and girlfriends?
00:36:48.100
Well, no, no. Real people, but through virtual reality.
00:36:53.580
Because, you know, there's these, like, virtual reality, like, chat rooms and stuff now that remind me a lot of...
00:37:04.100
But, like, the communities in them seem to be dramatically less interesting or diverse than the ones that existed within Second Life.
00:37:11.340
And the stories of the marriages that come out of them are just, like, super boring.
00:37:31.760
Which, their illegal probationary firing was ended.
00:37:34.700
Thousands of people were rehired due to a judge.
00:37:49.900
Having learned that the Clinton administration also had a DOGE-like effort.
00:37:55.460
And that they just went through the proper channels.
00:38:00.800
Part of me is a little regretful of the timeline that DOGE has.
00:38:06.180
You know, they did not give themselves a lot of time.
00:38:39.180
So, your kids can watch, like, the video of Deja and Mikey.
00:39:18.180
Like, I like and subscribe to somebody's channel, too.
00:39:26.540
Do you think that they'll get happy if they like and subscribe to our channel?