Based Camp - September 11, 2025
How Europe's Relation With AI Is Erasing It From History (How AI Changes The Collective Consciousness)
Episode Stats
Words per Minute
180.22025
Summary
In this episode, Simone and I talk about the impact of artificial intelligence (AI) on society, and how it can change the strategies we as humans use as we relate to the world around us, and ensure our own impact on the future.
Transcript
00:00:00.000
Hello, Simone. I am excited to be here with you today. Today, we are going to be talking about
00:00:05.380
how Europe is erasing itself and its civilization from history within the context of AI. And we're
00:00:16.580
also going to talk about how AI changes the strategies that we as humans use as we relate
00:00:24.420
to the world around us and ensure our own impact on the future. We need to talk. What is that?
00:00:32.400
This is a flying robot I just shot out of the sky after it delivered a package to my house.
00:00:37.860
So I destroyed the robot. No one is safe from these bastards. So to start with this second topic here,
00:00:46.880
you were at one point the director of marketing at the 45th most trafficked website in
00:00:51.540
the United States. It's a website where people go and write. It was HubPages, is what
00:00:56.240
it was called. Yeah. And then it was, I think it sort of became a different brand over time, but
00:01:01.380
yeah. What was the other brand that bought it? Squidoo? Yeah. Well, no, it acquired Squidoo, I think.
00:01:06.660
And then it became something that now calls itself something else. At the time, it was really big,
00:01:11.460
right? Yeah. This meant that you were at that point in your career, one of the world experts in
00:01:17.940
what is called SEO. This is search engine optimization. This is how you ensured that when
00:01:24.300
people search for things, they saw your takes and not somebody else's takes. I mean, often it's
00:01:30.100
involved in marketing or whatever, but it matters a lot. Like if we were running the pronatalist movement
00:01:35.680
back then, everything we would be focused on is how do I ensure it ranks well within SEO?
00:01:42.560
How do I ensure? And these systems, you know, companies lived and died on SEO. Like Google
00:01:47.840
would roll over in its sleep and all of a sudden it's a whole new ball game. Oh, it would. Yeah.
00:01:52.320
All hands on deck, emergency situation at the business, 100%. Very stressful. But we are moving
00:02:00.020
into a world where SEO is almost irrelevant given the way AI works. And what AI picks up in terms of ideas
00:02:10.940
is very different than what would have been picked up in an SEO environment. And this really changes a
00:02:17.500
lot of the online game. And I've noticed a lot of people haven't realized this or aren't relating to
00:02:23.200
this in sort of a sane way yet. There's Brian Chau, whose article we're going to be talking
00:02:29.320
about today. He wrote an article called Public Intellectual Privilege about how you as a public
00:02:34.540
intellectual get to have a totally different relationship with AIs, which is true. I, for example,
00:02:38.740
can go to an AI and say, you know, what would Simone Collins like for her birthday? And it'll give me
00:02:43.320
like great recommendations. What is it telling you I want for my birthday? Oh, you can ask it. You can
00:02:50.060
ask it while, while I'm talking. What should Malcolm get Simone for her birthday? Right. Okay. And what
00:02:55.940
you'll see is AI actually knows a great deal about you and me. Now there are people from the last
00:03:03.480
generation. Like I recently had my dad over and I asked AI questions about him, and it could find almost
00:03:10.380
nothing about him despite him being, within his time, a fairly famous public individual. And yet he expressed
00:03:17.840
this as a good thing. He's like, well, I don't like having things about me online. Right. It reminds me a bit
00:03:23.740
of, and one of the things we're going to be talking about a lot in this is the, you know, self-defeating
00:03:30.280
obsession with privacy that some individuals developed during the age of big data, where we
00:03:37.360
saw privacy as a way to protect ourselves from large companies. But I mean, also like, what are
00:03:45.120
you even protecting yourself from? Like getting better Amazon recommendations? Like I, I understand
00:03:51.940
you can be like, well, what if my data is leaked, but like that's data that can be leaked regardless.
00:03:56.800
Right. The thing is that between the era of deep tech personal protection advocacy and the era
00:04:04.700
of AI, the benefits and downsides of AI knowing about you, or companies knowing about you,
00:04:12.560
have done a complete 180. Whereas it used to be almost strictly a net negative to have
00:04:19.260
a lot of information about you online. Now it's almost strictly a net positive because it means when you go
00:04:25.760
and you ask AI questions. So first I'm just going to dive into what is SEO, but for AI, how is it
00:04:31.580
different? So first you don't really care as much because Brian was telling me, like, why do you still
00:04:37.140
care about getting media? And I'm like, well, because AI references media to some extent, but to a large
00:04:42.100
extent, like just being in media a lot, just doesn't matter that much anymore because media does
00:04:47.760
not alter what AI is going to do or say on a particular piece, what does alter what AI says or does about a thing
00:04:56.640
is the volume of work that is out there. Volume now matters much more than how much the public is engaging
00:05:05.900
with a thing. The number of clicks a YouTube video has, or the number of clicks a site or document has, or link juice,
00:05:14.940
as it was called historically, the number of credible sources that are backlinking to content
00:05:19.100
does not very heavily determine whether an AI is going to cite that document. What determines if an AI
00:05:27.320
is going to cite a document is how well thought through it is, how long it is, and how novel it is.
00:05:35.640
And I have seen this because AI regularly cites Based Camp episodes in responses when I am asking it about
00:05:43.740
just like normal day-to-day things. Totally unrelated to you? Totally unrelated to me.
00:05:49.060
What's an example of a thing that you asked about? I was asking about something related to like trans stuff
00:05:55.560
and an AI ended up referencing some of our episodes. I was asking about a thing related to antinatalist
00:06:01.060
philosophy and it referenced some of our stuff. So it's like tangential to things that we talk about.
00:06:06.240
I mean, obviously, but the person could be like, wait, how are you getting your content referenced
00:06:11.480
by AI when the AI doesn't know that you're the one asking the questions? And the answer is that
00:06:18.460
we release our content long form every day and multiplicatively. That means that whenever we're
00:06:29.500
releasing an episode, you're going to see a YouTube episode of it, which can be scraped by AI,
00:06:34.520
but then you're also going to have a Substack article, which some AIs can scrape. And then
00:06:39.620
that Substack article is also going to be broadcast to a number of podcasting websites, which some AIs
00:06:45.580
also scrape. So it's getting the information from multiple sources. And then with the most important
00:06:52.380
episodes, like the tracts or the book series that we wrote, we actually have full plain text versions
00:06:58.560
of both all of the tracts and all of the books, hosted as individual webpages marked for AIs to use
00:07:06.480
on multiple websites that we host. This is why when people are like, why does AI give such good
00:07:12.680
responses to questions about techno-puritanism and your religious beliefs? And it's because we structured
00:07:19.380
and aggregated those religious beliefs in a way that makes it very easy for AI to grab,
00:07:24.880
which is not the way traditional SEO companies think, or the way a lot of traditional people
00:07:30.820
think when they're engaging with this stuff. Thoughts, Simone?
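[Editorial aside: for readers who want a concrete picture of the workflow Malcolm describes above, here is a minimal, hypothetical sketch, not the Collinses' actual tooling, of how episode transcripts could be exported as plain-text pages with a simple index so that AI crawlers can easily ingest them. The folder names and paths are made up for illustration.]

# Hypothetical sketch: export episode transcripts as plain-text pages
# plus a plain-text index that crawlers and AI scrapers can fetch easily.
# SOURCE_DIR and OUTPUT_DIR are illustrative, not a real setup.
from pathlib import Path

SOURCE_DIR = Path("transcripts")   # one .txt transcript per episode (assumed)
OUTPUT_DIR = Path("public/plain")  # served as static pages

def publish_plain_text_pages() -> None:
    OUTPUT_DIR.mkdir(parents=True, exist_ok=True)
    index_lines = []
    for transcript in sorted(SOURCE_DIR.glob("*.txt")):
        text = transcript.read_text(encoding="utf-8")
        # Write each episode as its own plain-text page.
        (OUTPUT_DIR / transcript.name).write_text(text, encoding="utf-8")
        # The first line of the transcript is assumed to be the episode title.
        lines = text.splitlines()
        title = lines[0] if lines else transcript.stem
        index_lines.append(f"{title}: /plain/{transcript.name}")
    # A simple index listing every episode page in one place.
    (OUTPUT_DIR / "index.txt").write_text("\n".join(index_lines), encoding="utf-8")

if __name__ == "__main__":
    publish_plain_text_pages()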
00:07:35.760
I agree. I've done a lot less of this than you have, but I also intuitively just really have a lot
00:07:43.500
of gripes with people and their obsession with online privacy.
00:07:49.860
What did Grok say about you? Like, what did it?
00:07:52.540
I wouldn't appreciate any of the gifts it recommended. I mean, some of it would,
00:07:59.860
Come on. I have literally given you that before and you've appreciated it and it wouldn't have known
00:08:08.140
Oh, you wouldn't appreciate if I got it for you?
00:08:09.900
No, I don't want it. I, cause I can make it like, it's so easy for me. Why would I want that?
00:08:16.820
Popcorn is a big one. There's no recommendation for popcorn here.
00:08:21.600
Someone to clean our house so that I don't have to for once, you know, like just like,
00:08:28.380
you know, Hey, you know, I'm going to get you like three house cleaning sessions.
00:08:33.340
Why don't you show me how to clean in the way that you like?
00:08:36.380
Cause you're like, just like physically and mentally incapable of it, dude.
00:08:40.580
And it's just like, keep in mind though. I think it's, it's part of a lifestyle thing. Like I grew
00:08:46.420
up when I spent time with my mom, she would clean the house and I would hang out with her. And she'd
00:08:54.160
be like, this is Borax. This is what we use to scrub the tub and the sink. And like, this is like,
00:08:59.680
she would just walk me through like, this is cleaning materials. Do you want to try it? And then
00:09:03.880
like, okay, no, you did it wrong. Okay. No, you did it wrong. Okay. You did it wrong. And like,
00:09:07.060
that's like, I don't have a hundred hours to give to you to train you in cleaning. I'm trying to do
00:09:13.740
that for our kids over time. Oh, no. Are you going to train them into little worker bees to
00:09:18.580
handle it? I'm going to train them how to handle their own housekeeping needs for sure. And like
00:09:22.980
laundry, but like, I don't think, you know, your parents also didn't clean their own house. They
00:09:27.180
had people do it. And those people certainly wouldn't be willing to like also tutor you in house
00:09:31.980
cleaning while also trying to complete their job as quickly as possible. So this is not on you.
00:09:36.020
So it is just that like, I appreciate that you could pretend to be grateful.
00:09:42.060
But yeah, no, it also says that I should get high-end travel gear for my global lifestyle when
00:09:46.500
we've like decided to quit traveling. So like, I guess I can understand that like in the past you
00:09:50.940
and I were very jet set, but it doesn't know that like now things have changed. The one thing it does
00:09:54.820
suggest is customized educational tools for our children, but I'm buying them the like STEM kits that
00:10:02.260
it's describing here all the time. Like I just bought one of those Montessori circuit.
00:10:09.240
Yeah, no, that was, that was what Catherine Boyle from Andreessen Horowitz was like suggesting in her
00:10:15.980
Twitter, her X feed. Cause I I've seen them on Amazon all the time. These little, like they're
00:10:21.580
these Montessori circuit board, busy board things for like young kids. And I always thought this
00:10:26.380
would be so cool. And also my kid is never going to use this, but she said, if you know, you know,
00:10:32.340
hours of endless fun. And I'm like, really? Like, is this occupying your four-year-old? Because if it is,
00:10:36.980
I'm going to get it, like I've wanted to get it, but I thought no one would actually use it. So
00:10:41.660
we will actually see. We will see. I'll tell you what they do use is a little Lego man that we bought
00:10:46.100
a collection. Yeah. They're like a total choking hazard for you. Well, and the arms just come
00:10:49.580
off constantly, but Toasty last night spent like hours being like, I will fix it, which is really
00:10:54.300
cute. So. All right. So now to the second part of this, you know, we're starting by talking about
00:10:59.240
the move from an SEO world to an AI world, which is a total transformation in the way you think
00:11:05.960
about your online and public persona. And it also makes me nervous because the SEO world,
00:11:10.340
I mean, what I know about it was that it was exploited in many ways. It was a constant game
00:11:15.460
of cat and mouse that people would, after they understood how it worked, manipulate it and
00:11:20.020
manipulate search results, which is one of the reasons why even today search results are not that
00:11:24.480
great. And that's because, no matter how hard any search engine tries, and we're not just talking
00:11:28.740
about Google here, people will quickly figure out how the algorithm has
00:11:33.860
changed and they will game it. It's a lot harder to manipulate AI in the way that you're thinking
00:11:38.720
of. Really? I hope. Yeah. Put me at ease here, but that is my one concern. One of the things that AI
00:11:43.140
looks for in articles is how unique the ideas expressed in the articles are and how well-structured
00:11:49.980
they are. AI has the ability to tell between low-quality, low-brain takes and high-quality, high-brain
00:11:57.620
takes. I can't tell you the number of times that an AI, like Perplexity or Grok, when it's referencing a
00:12:03.320
source references a blog that I have never seen or heard of before, but is fairly well-written and
00:12:08.900
well-researched. It seems to have a good ability to tell between sources. The real advantage that
00:12:16.660
you're going to get is to not focus so much on things like link juice and focus more on the volume
00:12:21.820
of the work you're outputting. Hmm. But to continue here, what Europe has done, which is really
00:12:32.040
interesting with AI, is it put into place something called the GDPR, which is a system that restricts
00:12:39.900
how much data companies can collect from people within Europe. In terms of what it changes: explicit consent
00:12:46.980
is required. Companies can't use pre-checked boxes or bundled consent. They need clear, explicit
00:12:52.860
permission for every single thing. Yeah. So every time you go on a website and you have to accept
00:12:58.480
cookies and read a stupid notice, you can thank the GDPR for that. Yes. Data minimization. Companies
00:13:05.340
can only collect data that is absolutely necessary for their stated purposes, and no more collecting
00:13:10.200
just-in-case stuff. Purpose limitation. Data can only be used for the specific purposes disclosed
00:13:17.540
when collected. Companies can't repurpose data without new consent. And then usage restrictions.
00:13:24.320
The companies cannot refuse service if you don't want to give them data. Easy withdrawal. Users must
00:13:29.400
be able to withdraw consent as easily as they gave it. Storage limits. Companies must delete data that's
00:13:35.260
no longer needed for the original purpose. And then they have cross-border transfer restrictions.
00:13:40.040
And in terms of fines for this, you can be fined up to 4% of the company's global annual revenue
00:13:46.560
if you violate this. So really strict fines for this. Yeah. Well, also this, we're just talking tip
00:13:54.140
of the iceberg with this regulation. I remember the collective brick that companies shat when this came
00:14:02.240
out because of just how complicated it was going to be to comply with these rules. It's just from a
00:14:09.840
regulatory and efficiency standpoint. It makes me so angry. It's so bad.
00:14:16.360
But it's created a situation in Europe where, one, Europe basically cannot develop high-quality
00:14:25.360
AI models. European companies can't. And then, two, the content that is being produced
00:14:33.520
within the European internet is not accessible to AI. If you are a European individual who is engaged
00:14:41.880
actively with the internet and you're not using a VPN pretending you're in the United States,
00:14:45.620
you will not be remembered by AI. You will not be recognized by AI. And this is a very, very big deal
00:14:54.160
when you consider the turning point of human history that we happen to be at right now. The future of what
00:15:02.080
humanity becomes as we take our manifest destiny among the stars, as AI dramatically changes the
00:15:08.000
direction of human evolution, what it means to be human, what it means to interact in this world,
00:15:12.180
as it will, Europe and, cumulatively, European culture, except for how it's expressed out of the
00:15:21.780
United States and Latin America and Canada, is going to be erased from history. And it's going to be
00:15:31.140
replaced by American culture and Chinese culture. What's interesting is that China actually has
00:15:38.380
more restrictive systems in place and more restrictive rules in place around what AIs can
00:15:45.760
take, but the government doesn't enforce it because it doesn't care because that's the way
00:15:49.840
China works, which is, you know, good for Chinese development. So what are your thoughts on privacy
00:15:57.800
before I go into, because I looked up, I was like, what are the best arguments? Like actually
00:16:01.600
steel man this for me because I do not get why people would care so much about privacy.
00:16:05.140
Yeah, I did this. I did this because I found it so offensive that people would be so, so intent
00:16:09.320
on protecting their privacy. The argument that convinced me that there is any argument to be
00:16:15.180
made in favor of privacy is that, okay, sure. You may have absolutely nothing to hide now. You may
00:16:21.280
benefit from people knowing more about what you think and all that. And then someday there's a regime
00:16:27.780
change, a sea change in culture and government and leadership and laws. And suddenly you are liable
00:16:36.700
for your views. You are arrested. You are taken away because they have the dirt on you. And that is,
00:16:42.340
that is real. I mean, in the age of cancellation, that's what did happen.
00:16:46.240
Take responsibility. And I saw this argument as well. There were also arguments
00:16:51.220
that, that were compelling there, like freedom to change your, your perspective, freedom to rebrand
00:16:56.160
yourself, freedom to. Yeah. But I mean, I think people understand that rebranding is possible
00:17:03.340
with a public record. JD Vance rebranded from an anti-Trumper to... We did. We were not
00:17:09.740
pro-Trump the first time he ran. Yeah. We rebranded from like progressive pro-natalist tech, right?
00:17:16.520
Yeah. When we got even into the pro-natalist movement, we were still left-leaning. Like we started.
00:17:21.760
It was all about like, hey, we want to make it easier for gay couples to have kids, and we still do,
00:17:25.720
to be clear. But like, suddenly now we're, yeah, we're super right. Whatever.
00:17:32.540
We're far right activists, as they would say. Oh yes. Extremists. Yes. Right-wing extremists.
00:17:37.340
Right-wing, yes. As Hillary Clinton would brand us. But yeah, no, I mean, this is the thing
00:17:44.160
though. I think when we live in a world where you know that your history's online,
00:17:50.160
one, you need to take responsibility for that history, right? Like-
00:17:56.380
Before everyone was aware that everything would be online, I can get that like, okay,
00:18:00.820
like for example, I did stuff like in my twenties that I would never do today, but it wasn't
00:18:05.520
because I, you know, wouldn't do them if I knew what I knew about the internet today. It was because
00:18:13.660
I thought that there was this privacy that is not expected today. And that's what I mean by all of
00:18:20.820
this is the illusion of privacy. The illusion that this stuff isn't going on a permanent record,
00:18:28.760
that everything you do isn't going on a permanent record, leads people to take worse and more dangerous
00:18:34.760
positions from the perspective of political shifts and everything.
00:18:38.400
100%. Well, yeah. People who believe that they're private more often than not have a false sense of
00:18:42.520
security because your devices are recording. Everything is listening there. I mean, even if
00:18:48.740
you don't have any like smart speakers or other devices in your home and your phone is off, there
00:18:56.380
are ways that people can use your own home wifi to see which room of your house you're in.
00:19:02.200
And they're much more likely to do vile things, to do the death threats we receive all the time or
00:19:07.960
the really bad trolling where their entire goal becomes ruining somebody else's life through an
00:19:14.980
anonymous identity. But here's where people can get to us. They can be like, well, what about
00:19:19.480
pseudonymity? Pseudonymity and people ending up getting doxxed has been a big part of the
00:19:25.100
new right movement. Whether we're talking about, you know, Raw Egg Nationalist or Bronze Age Pervert or any
00:19:31.460
number of these people. And I would say, well, as far as I know, there isn't a single well-known
00:19:38.800
pseudonymous right-leaning thinker who wasn't eventually doxxed. The problem with pseudonymity:
00:19:43.960
a friend of ours, the guy who runs NatalCon, Kevin Dolan, was pseudonymous.
00:19:50.460
This is the problem, right? When they're like, oh, what if you have views that go against the
00:19:53.620
establishment? If you have views that go against the establishment and you feel safe sharing them
00:19:58.320
under a pseudonymous identity, you will eventually be doxxed within the existing paradigm.
00:20:04.780
Pseudonymity in itself is a curse because it leads you to say and do things in a way that you
00:20:12.240
wouldn't if you knew you had to be publicly accountable.
00:20:17.380
It's a false sense of security that ends up putting you in a much worse position. And then you can be
00:20:22.340
like, well, what about laws around the right to be pseudonymous? That doesn't work because then you have
00:20:26.460
to give the government the right to hunt down anyone who's like, hey, this is this person
00:20:30.160
actually. And that's just like too hard to do without putting enormous restrictions on the way
00:20:35.180
companies work, which is really frustrating. Like I would like it if there was like a good
00:20:42.500
pseudonymous system out there that I thought worked and I could say, okay, I am for that,
00:20:47.920
but there isn't. And then people can say, well, people may not, you know, put out their ideas
00:20:54.300
or try to push the Overton window. And it's like, well, they, they will though. I mean,
00:20:59.240
in terms of how we actually got the Overton window pushed back to where it was from the extremists
00:21:05.680
controlling much of the narrative, it required people like you and me being out there in public
00:21:11.120
and in the news regularly. You know, if it had just been a bunch of provocateur accounts online
00:21:17.480
behind pseudonymous identities, that would never have happened because one of the things about
00:21:23.960
pseudonymous identities and their lack of ability to effectively shift Overton windows
00:21:29.540
is that they do not care about mainstream judgment. They are much more open about flouting
00:21:36.660
mainstream cultural norms in a way that marginalizes your movement. So if I was operating
00:21:44.480
under a pseudonymous identity and trying to promote the pronatalist cause, for example,
00:21:49.060
the pronatalist cause would likely be far more extreme. There's many things I might say under a
00:21:53.540
pseudonymous identity that I wouldn't say attached to my regular identity. Right. And it's like,
00:21:59.260
but I'm glad that I'm not doing that because my goal with this movement is not to say everything
00:22:03.980
that I think is true. It is to try to move the Overton window so that we can have a higher chance
00:22:11.880
of human flourishing in the future, of building that intergalactic human empire.
00:22:16.160
And I see this online: a lot of pseudonymous or anonymous, like, nobodies will be like, why don't
00:22:22.940
you say this offensive thing? Why don't you say that offensive thing? And it's like, because that
00:22:29.060
would really damage the movement. Like, like if, if I said that, then somebody's boss would go to them
00:22:35.940
and say, hey, you called yourself a pronatalist the other day, or you went to NatalCon. And this guy who
00:22:42.660
has these thoughts is associated with that. Like, are you really comfortable? Like, I, I don't want
00:22:48.860
to say anything on here that an educated person working at McKinsey couldn't go to their
00:22:55.400
boss and say: no, I really am against the gender transition of children. No, I really, you know, am
00:23:01.700
for, you know, the belief that humans have genes and those affect our behavior. Like every scientific
00:23:07.680
paper argues for that. You know, you want us to take the extreme positions because
00:23:13.880
extreme positions, and this is one of the problems with pseudonymous content networks and environments
00:23:20.180
is that they create environments where the most extremist positions are the most rewarded
00:23:25.660
with re-engagement. Take an environment like 4chan or something like that, where the shock positions
00:23:31.460
are the positions that are most rewarded. Now you don't want to be like a Reddit, right? Where it's the
00:23:37.140
most normie positions that are the most rewarded. But by the same token, you don't want to be in an
00:23:41.960
environment where it's only the most shocking positions that are the most rewarded, if your
00:23:47.800
goal as a movement is to move mainstream society and politics. Yeah, absolutely.
00:23:56.640
Another one that they talked about that I thought was interesting is, well, people can cite,
00:24:00.300
like physical safety, like people threaten us all the time, right? Yeah.
00:24:04.300
The problem is that that information always eventually leaks. One of the funny things about
00:24:10.100
having a bunch of friends who are billionaires, something most people don't know, is that all the billionaires,
00:24:13.540
they have like one house that's like their billionaire mansion house, but they don't live in it because
00:24:19.140
its location was leaked long ago. And then they have like other unassuming houses where they actually
00:24:24.940
like party and invite their friends because those houses haven't been leaked. And so the mansion is
00:24:30.440
always empty. And then the houses where they live are always like, actually not that particularly
00:24:36.800
grandiose, which is another one of these things that we've talked about. It's like, it's not that
00:24:40.860
useful to be a billionaire in today's world. You know, although, to be fair, some just have very
00:24:45.260
well-defended compounds with very tall trees or whatever. But it's hard. Big fences. Yeah.
00:24:52.180
You know, keep in mind, we live in a world of drones and everything now, right? Like if somebody
00:24:56.160
wants to fly by your compound with cameras that can see very, very far, if you live in one of these
00:25:02.320
compounds, there's almost always somebody looking through every one of your windows.
00:25:09.960
No, it is creepy, but it's the reality of the situation these days.
00:25:13.100
Yeah, you have to expect it. And it's better to be prepared than to try to avoid something that you
00:25:19.100
really can't control. That's to me a lot more scary. Like with cancer, I'm just assuming I'm going to
00:25:26.040
get cancer and I'm going to screen early for it and hopefully catch it early. Being like, wow,
00:25:31.100
I just hope I never get cancer, but never screening for it seems like a death wish.
00:25:36.920
Another one you hear is, well, companies are going to try to manipulate me, right? You know,
00:25:41.060
and it's like, are you really like that mentally weak? Like, yeah, I like that Amazon knows a lot
00:25:46.540
about me so it can give me good recommendations and I don't waste my time.
00:25:49.540
Oh, I want it to know more about me. It, yeah, it could be better.
00:25:53.040
Same with Steam. I get annoyed that Steam's game recommendations aren't good enough. I'm like,
00:25:57.400
you should have a lot of information on what games I play. Why do you not know that I want more
00:26:02.260
card-based and turn-based tactical games? Like, come on, Steam. I'm expecting this of you. I want
00:26:09.840
these indie recommendations, right? You know, and yeah. I think the most interesting thing though is
00:26:18.200
just the extent to which in the future, in a post-AI world, you will matter because people
00:26:27.560
will know about you and want to work with you or they'll have heard about this thing that you build
00:26:32.260
or this service that you provide and they will want to buy it from you. You can't just get by
00:26:38.260
anonymously with, with no searchable public profile and expect to find work or to sell the product
00:26:48.040
you're trying to sell. They're like, oh, you know, work with me, and I search them and I don't
00:26:52.320
find anything. I'm like, why would we believe you? Yeah. Where's, what's your reputation? Where can I
00:26:57.260
see your work? They could be like a reporter or something, like an undercover reporter, right? Like
00:27:01.980
that's a huge red flag that you have no really deep online persona. And I just won't work with
00:27:10.040
people like that anymore. But even more importantly, it's
00:27:14.700
already starting. I think the CEO of LinkedIn wrote a New York Times op-ed recently about how
00:27:20.480
already the bottom rung of the job market is collapsing. People in entry-level jobs across
00:27:26.340
industries, whether it's customer service or coding or legal or something else, they're not getting
00:27:31.820
jobs anymore because those entry-level jobs are being eaten up by AI. No. And then what's worse
00:27:37.800
is people try to get around this. Like the reason why you could get away with this in the past is you
00:27:43.020
could have proxies for competence, which would be, I went to Harvard. I went to Stanford. For example,
00:27:50.000
I got my graduate degree at Stanford. Simone got her graduate degree at Cambridge, right? Like that
00:27:54.120
used to be the proxy. I used to be able to stay quiet about who I was and then rely on the reputation
00:28:00.800
of the institution to buoy my reputation. But now we know that these institutions are like
00:28:06.960
DEI infested woke nests. Like you look at Harvard today and I think Harvard and antisemitism, like the
00:28:13.100
two go hand in hand, right? Like these are institutions that are now systemically racist and they really
00:28:20.980
only affirm an individual's willingness to compromise their morality to play within the social hierarchy.
00:28:30.140
And I think for people of our generation, you know, there's going to be some people going through them
00:28:35.400
today. There's going to be some leeway given, but I think in five to 10 years, they're just not going to
00:28:40.780
have the pull that they've had historically. Well, that and personal networks, this is already falling
00:28:47.080
apart. As you found out when you went back to Dallas and you saw all the kids that came from
00:28:51.820
wealthy, well-connected families that used to, from their culture, expect that because they had
00:28:58.160
this wealthy family in Dallas that was well-connected and did all this business stuff, that their job
00:29:04.080
would be set. It would be spoken for. They would be fine. And then they were finding that, you know,
00:29:10.060
now these companies- It doesn't work. Yeah, it doesn't work. Yeah, no, no, no. Intergenerational
00:29:13.920
nepotism is not as effective as it was historically for people who wonder why it's so ineffective now.
00:29:20.400
So a lot of the systems that intergenerational nepotists set up to make this stuff work had the
00:29:26.520
passive assumption that your family was going to control the company, that your family owned or
00:29:33.640
founded or anything like that. The problem is that these companies all became public.
00:29:38.040
Yeah, or took private equity and got a board and- Yeah, yeah. There was like one iteration where
00:29:43.020
like the board would still hire the son, right? Like that was seen as, okay, whatever. Like that
00:29:47.280
happened to my dad, right? Like he was hired to run one of the companies, his grandfather founded
00:29:51.500
by the board, but it was actually because the board was trying to do a coup on his grandfather,
00:29:57.120
right? Like even then you had the issue of... but they wanted somebody who the
00:30:02.860
grandfather approved of, right? And the grandfather thought he was too young and easily pushed
00:30:06.960
around. So, like, well, anyway, you even saw that then,
00:30:11.820
but now, with a board, somebody going in and being like, promote my son? They'd be like,
00:30:17.660
look, we might be able to give him some middling-level position with a grandiose title,
00:30:22.880
but that is at most, and his salary is coming out of yours.
00:30:26.540
Yeah. But that, so that was the pre-AI crumbling.
00:30:34.200
Yeah. So now, you know, it used to be that like, maybe a family member could expedite the extent to
00:30:40.460
which your job application was considered at a company or act as a referral and a reference. But
00:30:47.160
now people just aren't hiring in that way at all. And really the one way that you're going to get
00:30:53.700
hired is if someone's like, Hey, I want the best like designer of this type, you know, or I want,
00:31:00.000
you know, the, like, Oh, that guy that, you know, whose podcast I follow, you know, his takes are
00:31:05.000
really good. And that's who they're going to hire. And I want to elevate this: in the past,
00:31:14.280
if you were trying to maximize your hireability in the world, right, by like elite organizations or
00:31:19.220
anything like that, what you wanted was, when somebody Googled, like, brain-computer, because
00:31:25.540
that was my specialty before all this, brain-computer interface expert. These are devices
00:31:31.400
that directly control electronics with your thoughts. That's what I worked on before all of
00:31:35.400
this. That was my specialty in neuroscience, not like mystically. It was like implants, like
00:31:40.300
Neuralink is a good example. Anyway. So, so I would want somebody to Google that and my name to be the
00:31:46.380
first name that appears. Now, what I want is somebody to go to an AI and say, who's the world
00:31:52.680
expert in brain-computer interfaces, and my name is the first that comes up. The way that I achieve each
00:31:58.800
of those two goals is dramatically different. Good point. Yeah. Yeah. If you're writing prolifically
00:32:04.620
about it, if you're the one who's engaging in the online discourse about brain computer interface,
00:32:09.940
Grok, Perplexity, Claude, ChatGPT, they're going to point to you.
00:32:13.900
And then there's a secondary part of how power networks form. This is something Brian Chau
00:32:19.120
was talking to us a lot about personally. And I think he's absolutely right about this
00:32:21.960
is that the new country club is the Signal group.
00:32:25.800
Oh, that's great. That's a great way of putting it.
00:32:27.660
No, I mean, country clubs, who are you going to be hobnobbing with? A bunch of old people
00:32:32.080
who have no power anymore. Like all of the old communities that were like the secret societies,
00:32:38.000
you know, whether, you know, your Skull and Bones, you can see our episode on that one,
00:32:41.080
totally woke. Like it's pointless. It's not useful for hiring anymore. Bohemian Grove,
00:32:45.240
the average age is like 86 now. It's completely useless from a networking perspective.
00:32:50.600
Well, and even your dad told me when he was visiting that he waited for 22 years to get in.
00:32:59.800
Because keep in mind, with an organization like the Bohemian Grove, as you get older,
00:33:06.120
so if, in one generation, it's a group of 50-year-olds, right? Like, okay,
00:33:12.860
the organization, the average age is 50, right? Well, that means the average person they're
00:33:16.560
trying to induct is going to be one of their friends, who's going to be 50, but it takes
00:33:20.140
that person 20 years to get in. Now the average age of the organization isn't 50 anymore.
00:33:24.240
It's 70. And the organization has been around for hundreds of years. So now the average age of the organization
00:33:30.100
is basically the maximum human lifespan, like two years before you die. And that is just not like
00:33:37.040
waiting for, it's waiting to become Pope at this point.
00:33:39.980
Waiting to become Pope. No one there is, is going to have like an active ability to sort of affect
00:33:46.480
society at this point. And I think that this is what we're seeing across the board within these
00:33:51.940
different organizations. Now, Simone, you ran, you were managing director of Dialogue, which is a
00:33:55.820
secret society founded by Peter Thiel and Auren Hoffman. And then for Schmidt Futures, we put
00:34:01.060
together a, another secret society, which I don't think is still operational. And then we've worked
00:34:05.660
for like Future Forum and some other secret societies in terms of putting them together. So we have
00:34:09.520
insights into a lot of the operations. We also know how, at least for a while, like Manifest worked,
00:34:14.560
for example, and like how speakers were selected for that. I think that what all of those societies
00:34:19.840
had in common, which I think is more representative of what we should expect in a techno-
00:34:24.100
feudal society, in other words, a post-AI society, is that these were, in the end, societies that were
00:34:33.460
handpicked as like, these are the people that so-and-so likes, and they may pretend to have objective
00:34:41.600
metrics for, Hey, this person qualifies and this person doesn't. But in the end, in all of those, I could
00:34:48.500
point out glaring exceptions on both the rejection and acceptance sides.
00:34:54.760
Oh, I completely agree. You have to be... So it's: does this person like your
00:35:00.000
content? Does this person like you? Do they like your online presence? I mean, like Hereticon, which is the
00:35:04.440
other Peter Thiel one. Yeah. Yeah. Hereticon is like, hey, do these people think you're cool? And, or
00:35:10.180
like, do their friends recommend them to them? If so, you're in, if not, you're not. And again,
00:35:17.120
like, how are you going to get there without any sort of reputation?
00:35:20.560
The point is that if you're talking about a secret society, like say Hereticon, which
00:35:23.180
is probably the most exclusive, like real, like actionable secret society in the world
00:35:28.060
right now. And I say this as somebody who's supposed to be running the Illuminati. If you
00:35:32.200
go to the book, Bloodlines of the Illuminati, which is the main book of the Illuminati that
00:35:35.880
came out in the 1980s, it says my dad is one of the people who runs it. And that I am his
00:35:41.980
oldest male son. So I should be in it, right? Like I am not. Maybe he does, Malcolm. The
00:35:46.580
man loves his privacy. Although I have had powerful people come to me and say, they want
00:35:49.940
to restart a real Illuminati. Can you, can you do this? And I've like drafted like business
00:35:54.340
plans on how to do that. But the, the point I'm making here is that if you look at something
00:36:01.940
like Hereticon, nepotism, like these old nepotistic networks mean like almost nothing unless the
00:36:10.040
parents incorporated you into the content they were producing, which is why.
00:36:13.760
But then on your own merit, you have to be cool. If you're just like the lame kid, then
00:36:19.060
no one's going to want to invite you. Even if you are so-and-so's son or daughter or friend.
00:36:31.100
Somebody's grandson, like Milton Friedman's grandson.
00:36:35.580
No, he's interesting in his own right. That doesn't even work.
00:36:38.200
Okay. Okay. Yeah. No, there, what are some, honestly, I can't really think of any, some
00:36:46.560
have, some societies have made exceptions for certain members because they feel that those
00:36:53.080
members are, on their own, a major draw. So one bit of calculus that does take place
00:36:59.000
with these societies, and I'm sure will take place in techno-fiefdoms as well, is: I really need
00:37:05.120
person X to be a part of my fiefdom. I also know that person X may exit in a huff if I
00:37:12.160
don't cave to their insistence that person Y is accepted. And so there are instances in
00:37:19.760
these societies of completely lame, like women, boring. No, not always women, not always women,
00:37:28.620
just boring people being accepted at least on a temporary basis.
00:37:32.440
Actually, this is a really good point. Sorry, I was just thinking about this. It's that men in today's
00:37:37.240
world are often more likely to put pressure on a society to accept someone they want to F than one
00:37:42.040
of their kids. Fair. Yeah. Which, no. And I'd even say it's like more culturally
00:37:47.600
normative. Like, it's seen as... so the old nepotistic networks are at work. But the reason I
00:37:52.380
was saying Signal groups matter is like, if I go to like the Dallas Country Club, because when I grew
00:37:56.800
up, that was like a really prestigious thing. You'd pay tons of money to be a member of it.
00:37:59.860
You and I went there together for, to meet someone for lunch at one point, right?
00:38:02.400
Yeah, we met my grandmother for lunch there. It's really big. It's all dated. Like nobody
00:38:06.780
important's going to the Dallas Country Club anymore. You know, everyone there is sort
00:38:12.980
of irrelevant culturally speaking, because that sort of wealth accumulation is just not important
00:38:18.880
within the existing paradigm. This is for people who want to ape or, or pretend like they are
00:38:25.280
successful rather than actually having any influence. And that wasn't the case within previous
00:38:30.480
generations, because here's the point. If I'm one of the top, like 0.5% of the population of Dallas,
00:38:35.560
I don't care about networking with the other 0.5% of the population of Dallas. I know this as
00:38:39.980
somebody who was in Dallas as a 0.5%-er. I care about networking with the 0.5% in Manhattan,
00:38:45.780
in London, in LA, in New York, and the general internet, right? You know, and if I'm in the top
00:38:51.360
0.5% of Manhattan, I need to also know the top influencers in, you know, SF and, and across the
00:39:00.120
internet if I want to matter, which is why the Signal group has taken over. And I don't
00:39:05.480
know why it's always Signal. I actually prefer WhatsApp, but Signal is where all the fancy
00:39:09.320
groups are. But that's why the Signal group has taken over from the old country club or region-
00:39:17.360
locked thing. We actually have some recording, oh, let's see if we can find it, where we recorded
00:39:22.240
exploring the old Core Club in New York that had been abandoned. Oh yeah, I can share that with you.
00:40:09.400
They had a salon. Look at this. This is where they wash their hair.
00:40:13.240
I think these haircuts cost like a hundred bucks, two hundred bucks. What do you think?
00:40:19.400
Like I would say one eighty, but then with tip he'd probably be paying at least two thirty.
00:40:23.400
Well this sort of, you know, this isn't Manhattan anymore.
00:40:34.760
This, this concept of these social clubs where like,
00:40:38.680
lux people get to hang out. I wonder how long it's gonna last.
00:40:46.920
I think that these people did one of these deserted buildings.
00:40:50.920
Well that's the thing, is people don't believe us when you say like,
00:40:53.480
most office buildings in New York are abandoned.
00:41:00.120
Oh my god, a full-out salon. A full-out hair salon.
00:41:08.600
It's such a shame thinking about these gorgeous spaces.
00:41:11.480
And this, you know, with demographic collapse, I think the crazy thing is,
00:41:14.920
like the, the world will be full of these spaces.
00:41:17.560
And you think like, oh great, this will turn into an apartment.
00:41:25.320
Yeah, it's got a vacant sign and the sign's even falling apart.
00:41:29.160
Look at that, look at that, it says vacant right there above the Friar's Club.
00:41:32.200
It's a, that's a historic place where they used to do these New York,
00:41:34.440
all the famous comedians of the world would go and you'd have some like secret society thing.
00:41:49.080
All, all the elite don't want to be here anymore.
00:41:52.760
What's the point when you can go to the distributed elite things like Dialogue or something?
00:41:59.480
I feel like maybe the future hangouts of the elites will just be at each other's sprawling compounds and mansions.
00:42:05.320
Well, that's what we do whenever we're hanging out with our elite friends.
00:42:10.040
So places like Friar's Club, Core Club, they're too, too much to maintain.
00:42:19.160
By the way, the funny thing about the elite that the non-elite don't know
00:42:22.760
is they all have these giant mansions, but publicly everybody knows where the mansions are
00:42:27.160
and so they're too afraid to go to their mansions.
00:42:29.160
So if you're actually their friend, when they invite you over,
00:42:33.080
you're often going to, like, an unassuming house in the middle of the city.
00:42:43.720
Like, I'm sure that some staff are like, oh yeah, take this, take that.
00:42:47.080
But like, what are you going to do with this stuff?
00:42:57.720
Yeah, this is, this must have been the women's locker room.
00:43:05.800
Oh my gosh, this, yeah, this must be the women's locker room.
00:43:13.080
We got the steam room, mini fridge, sauna, and whoa, whoa.
00:43:47.880
Do you think in the future, people will just camp out in these places?
00:43:59.160
I'm like, God, you can't even stay in the buildings anymore because they're unstable.
00:44:09.240
I don't remember where we go back, but they weren't like, I don't know what they cost today,
00:44:21.560
I wonder if some of these books were from members.
00:44:38.520
This is a group that my mom was a member of, so I've had some familiarity with it.
00:44:42.600
But they were apparently like a big muckety-muck thing for a while.
00:44:47.960
This is also interesting about board positions: my dad saw them as quite like a prestigious thing.
00:44:53.560
And you remember when I was younger, I wanted them.
00:44:55.640
And then recently, I sort of came to the realization that, like, board positions don't...
00:45:00.520
Well, what they became... I think there was this misunderstanding among people, which was that
00:45:08.360
I could have a career of being on boards, that board positions pay.
00:45:16.360
In fact, the positions that most people get on when attempting to become career board position
00:45:21.880
holders require you to donate a significant amount of money.
00:45:26.440
Most boards are basically, oh, these people are on because they donated a lot of money
00:45:33.320
They want to feel like they have influence and they don't.
00:45:35.640
And so they're just donors who get board positions to feel fancy.
00:45:41.160
And then they kind of expect, well, okay, I'll just donate a ton to be on all these companies' boards.
00:45:45.560
But then other companies will approach me to become a board member because I'm such a prolific
00:45:50.280
board member and therefore I must be very good.
00:45:53.000
And I think that that's the leap that doesn't work and that leads people to end up spending
00:45:58.920
a lot of money on nonprofits that don't necessarily do anything helpful aside from spend a lot of
00:46:04.840
their money just finding new people to get into the scam.
00:46:07.800
So yeah, it's kind of a lame way to spend your time.
00:46:14.520
This has been a really interesting topic for me.
00:46:17.080
If you want to have a livelihood in a post-AI age, you need to build an online footprint that is
00:46:30.600
niche, that is distinct, where you're digging into unique talents of yours and where you are
00:46:36.600
at least on the pathway to providing a unique product or service that either your local community
00:46:43.720
will want, or that, like, the 1% of people and their techno-fiefdoms will want to buy from you.
00:46:52.600
Like really weird, cool art or custom made cabinetry.
00:46:56.360
So you want to make yourself an AI's go-to source for some subject.
00:47:04.040
Like one of my friends from childhood, both of her parents were artists and they were
00:47:09.240
artists by trade, they made their money making art, but how did they do it?
00:47:13.960
It was because one of them, for example, did masonry for all the giant mansions in California.
00:47:19.880
Like they wanted custom stonework and he did it.
00:47:22.680
And I think that his work, as foreign as it seemed to me when I was a kid, is way more representative
00:47:30.360
of what we can expect to see people making as their livelihoods going forward.
00:47:35.400
Well, it's the locksmith who just came over to our house today.
00:47:37.960
Like that kind of work is also something that I would expect.
00:47:42.440
What are your thoughts if somebody is like, well, if the AIs have access to enough of my work...
00:47:48.440
And my answer would be, if you're doing that kind of work, you're in trouble.
00:47:53.960
Like, yeah, no, and that's, that's the problem.
00:47:55.560
When I, when I say stonemasonry, I'm talking about something that
00:47:58.360
it would be difficult even for a robot to physically do, because it is...
00:48:16.520
These are things, you know, you could have a robot or some kind of automated system,
00:48:23.240
But in the end, it's a lot easier for a wealthy person.
00:48:26.920
Okay, well, here's, here's me playing devil's advocate.
00:48:30.360
In a few years, AI is going to be able to create episodes of Based Camp with you and me
00:48:34.600
talking, and people won't be able to tell it's not us.
00:48:39.880
Based Camp isn't our bet on our career and our specialty.
00:48:43.800
Our career and our specialty, I think is going to focus more on governance, community building,
00:48:50.760
kid education, and especially cultivating a network.
00:48:53.800
I'm going to be glad when I can go to an AI and say, make an episode of Based Camp for me.
00:49:00.680
Even if, honestly, even if these never got published, I would still.
00:49:06.360
I mean, that's why we do it, so AI has good training data on us.
00:49:12.680
This is our time that we enjoy because we like.
00:49:18.280
Every time you're just like excited about AI or excited about the direction the future is going.
00:49:23.000
I'm so excited to have married somebody like you, who's not a stick in the mud.
00:49:28.120
And if you want to check out some of our existing AI projects right now,
00:49:31.640
whistling.ai, with a Z, is an AI system right now.
00:49:37.880
We're eventually going to build like standalone devices for it.
00:49:40.440
That allows your kids to talk to their stuffed animals.
00:49:42.280
So you can program it to believe it's any stuffed animal, any form factor you want.
00:49:46.200
And then have it bring the conversation back to any educational topic you want.
00:49:50.280
Using your kid's name and understanding their interests.
00:49:53.080
So if you say that your kid really likes Minecraft or Legos or Bluey or, you know,
00:50:05.800
We got Parasia or the Collins Institute, whatever you want to call it.
00:50:13.240
You know, we've got about like 20 or 30 active users every day now, which is really great.
00:50:19.640
That's one of the things we're working on right now.
00:50:29.400
And then we've also got our fab.ai, which is our final product, which is the video game
00:50:36.760
And this one, if you're interested in sort of exploring immersive worlds that are built
00:50:43.880
We occasionally release demos at various stages of development.
00:50:47.240
We did that like a week ago and then we took it back offline because the development team
00:50:53.240
I don't want it reflecting us right now and then put it back out.
00:51:00.520
And I am just really excited for all of these projects that we're involved in, in managing,
00:51:06.680
because AI has sort of opened the field to change everything.
00:51:10.040
Like when I look at the whistling system and I give it to our kids, I'm like, why has nobody
00:51:14.600
Like, why is everybody else's like AI talkie thing so dumb?
00:51:22.600
Why doesn't it let me make it whatever I want it to be?
00:51:26.920
Yeah, apparently the Talktua app that was released by Haley, whatever her name is,
00:51:33.240
the Hawk Tuah woman, had a soccer coach on it for the longest time.
00:51:51.720
Look a little dark, but I know you like a little dark.
00:51:57.880
That's how, that's how everybody knows I'm mysterious.
00:52:02.600
The bugs are still, the invisible bugs are still in there, right?