The Changing Politics of the Tech Elite: With Mike Solana of Pirate Wires
Episode Stats
Length
1 hour and 7 minutes
Words per Minute
205
Summary
In this episode, we sit down with Mike Solana, founder and editor-in-chief of Pirate Wires and founder of Hereticon, to talk about what's going on in Silicon Valley and the political realignment we're seeing in tech.
Transcript
00:00:00.160
Hello, everyone. We're excited to have with us today, Mike Solana, the founder and editing
00:00:10.800
Ready? One, two, three. Hello, everyone. We're so excited to be here today. Oh my gosh. Okay. Yeah.
00:00:19.940
Okay. Anyway, for people who don't know who Mike Solana is, he doesn't just have a podcast
00:00:28.560
Yes. You also sort of put together Hereticon, right?
00:00:32.280
It was my idea. Yeah. Founded it, created it. It's done out of Founders Fund, but yeah, it's
00:00:38.120
Yeah. So you've been a central figure in the coalition or sort of the consolidation of this
00:00:46.800
sort of new right or tech right political movement that right now is sort of blowing through the
00:00:53.540
country within the White House and a lot of what we're seeing.
00:00:56.500
And I wanted to talk with you as somebody who is totally integrated in like what's going on,
00:01:03.620
sort of the venture capital, Silicon Valley, tech worker scene, the vibe shift that you have seen
00:01:09.780
post-election cycle there, what's changing about how people are relating to things,
00:01:15.380
as well as the role that you played in this consolidation to write some history here as
00:01:23.360
the country changes, and also to discuss the political realignment we're seeing in the United
00:01:28.580
Well, I think, first of all, in my own personal life, I kind of, I'm like very cagey about labels.
00:01:37.620
I have tried to just be honest about what I'm seeing. And so people tend to put me in a box based
00:01:43.500
on that. Maybe I belong in the box. I don't know, but I can talk about what I've seen. And in terms of
00:01:49.420
what's different right now, I think the best thing to contrast is not what's happening today versus
00:01:54.160
what was happening like four years ago with Biden. We have this great
00:01:57.880
example of Trump's presidency and his inauguration. And you can just compare the first one to the
00:02:04.080
second one. He's had two first terms in a sense, really. Like he hasn't had a first and a second
00:02:09.440
term. These are two totally separate. I said not too long ago, it sort of feels like he played the
00:02:16.420
video game and lost. And he just started the exact same game over from the very beginning.
00:02:24.460
Now knowing where all of the bosses are, right. It's not like he's gotten to the second chapter.
00:02:29.440
He's just still in that first chapter, but he's doing it all over again. So we've never seen that.
00:02:33.320
None of us have seen that in any of our lives. So it's like a very kind of new thing. And within tech,
00:02:37.420
you can compare that first one to the second one. And it's obviously night and day. I mean,
00:02:41.280
the first one was there was one person in tech who was open about his support of Donald Trump.
00:02:46.620
It was Peter Thiel, and he was completely alienated, run out of town for it. And he has since been sort
00:02:51.780
of forgotten by us, to a large extent, because there were much louder people who came to Trump's
00:02:57.380
defense and support this second time around who've, I think occupied a lot of the discourse
00:03:03.300
surrounding that in tech. And I don't want to like, obviously I feel some kind of way about that
00:03:09.760
having been on the front lines of it, obviously not like Peter, but I mean, I work for Peter.
00:03:13.900
I've known Peter forever. So I have feelings about the way he was treated, but the difference is just
00:03:19.500
obviously today, Trump supporters have status in Silicon Valley. And in fact, being a right-wing
00:03:27.840
person almost grants a certain amount of status, I would say at the higher levels. So I think it's
00:03:34.520
really unclear what's happening among the rank and file. We haven't seen another round of fundraising
00:03:39.900
data. The last time that we looked, tech was still overwhelmingly voting for and funding Democrats,
00:03:45.840
venture capital was still overwhelmingly funding Democrats, but all the people openly talking about
00:03:50.520
it are, it's just like overwhelmingly Trumpian. And then everybody else is, I think, either
00:03:56.680
conflicted because the Biden term was so disastrous or quiet. And I suspect it was conflicted.
00:04:04.520
Rather than quiet. I think people actually just didn't know how they felt.
00:04:08.020
So I want to pull apart the two things you said here to focus on each individually,
00:04:11.740
because I think they're really, really interesting. The first thing that you noted,
00:04:14.880
I think is so true. It's like that game, the movie with, I want to say, the Scientologist guy,
00:04:20.800
where every time he dies, he plays the same day.
00:04:26.080
Edge of Tomorrow. Yes. Trump's Edge of Tomorrow-ing it right now. But the thing that's weird about this,
00:04:31.140
and the part of this I want to focus on is, why is the left doing everything exactly the same?
00:04:38.460
Like, why are they being so predictable? Why is it the exact same playthrough?
00:04:44.580
I think that they have no idea what they are right now. They've lost so many things. It's not
00:04:53.340
just an election, right? They've lost the culture. They've lost the youth. They have lost their sense of
00:04:59.140
political identity, because Trump is not a regular Republican. Trump, they tried really,
00:05:03.300
really, really hard to make the kind of like, oh, he's a rich guy who wants to just help rich people
00:05:07.000
things stick. But even if that is secretly true, his policies are populist policies.
00:05:14.100
They are economically populist policies. There's a reason that Tucker Carlson was aligned with him
00:05:18.540
and is talking about things like banning self-driving cars to protect the jobs of drivers.
00:05:23.440
That's like a Democrat idea. And so, if you've taken away the economic
00:05:28.980
populism, or at least provided a competitive economic populist platform, what are you left
00:05:36.340
with to differentiate yourself? And what they were left with in the last election, and even now today,
00:05:40.760
is like, maybe we should trans the children. Maybe that's okay, right? And that's, I don't think
00:05:45.580
they even believe that. They're just, that's something they were just forced to say by the party elites
00:05:50.200
to sort of be in the party, but that's all they have now. It's like those really deranged
00:05:54.820
far left social issues, because also the right over the last 20 years has moderated a lot on social
00:06:00.580
stuff. I think there's a lot of transformation happening on the right on the social stuff and
00:06:06.680
conflict on the right now, but that's, that's like coming. That's not currently where we are.
00:06:11.540
Like the terrain right now is this: the dominant figure in right-wing politics, Republican
00:06:18.780
politics (though I don't think it's really that), is Donald Trump. And he is an economic populist who
00:06:23.900
does not give a shit about gay people marrying. He just does not care. Also, the thing
00:06:28.180
that's really hard to make stick is the abortion stuff, because, yeah, he has been like, oh yeah,
00:06:33.520
like I'm against Roe v. Wade, but no one believes that that man hasn't paid for an abortion. Like he's
00:06:39.140
not a Christian right kind of guy. And he also has strongly said, you know, I'll knock out
00:06:45.940
any kind of federal ban or whatnot. He's, like, states' rights only. It's a bad case, but
00:06:50.840
he just isn't that kind of, he's not that kind of right-wing socially, extremely far right-wing
00:06:57.760
kind of guy. It's just more complicated. So they don't know what to be. And I think there's
00:07:00.540
I've noticed two big differences in this particular playthrough. One sort of highlights the point that
00:07:06.220
you're making here. And Trump has even said this verbally. He's like, I'm pushing this issue
00:07:10.640
because it's a 90-10 issue. This was specifically when he was talking about trans
00:07:14.860
people in sports. And I think the administration right now is looking to only
00:07:20.240
battle 90-10 issues at the beginning because Democrats culturally have a compulsion to double
00:07:27.220
down on whatever he is opposing. So, well, because it's also always worked. They've always been able to
00:07:33.360
just scream until they got what they wanted. And the country has changed a lot
00:07:40.080
because of it. And the other overwhelming, sort of, let's say 80-20 issue at least that Trump
00:07:46.100
first won on, and I think won on again, is immigration. And they don't know what the fuck to
00:07:51.860
do about that because they presided over an open border for four years and have people in their party
00:07:55.980
who have now normalized that to the point that they can't go backwards. That's the thing about the
00:07:59.980
left man is like, they can never go backwards. They can never pivot. They can never change their
00:08:04.760
mind. Trump doesn't care about changing his mind. And he's also not a Republican. He doesn't care
00:08:08.740
about Republican baggage. He's just this guy who's doing what he thinks is the right thing to do. And
00:08:13.800
so it comes off like common sense rather than ideological. Yes. Well, so the second thing that he's
00:08:18.280
doing differently this time, and I want to hear your thoughts on this, because this is like a huge
00:08:21.200
difference. It's like when you're choosing to run for president and you're like picking a character,
00:08:25.160
like your character is your VP in many ways, in terms of like how you're colored.
00:08:29.200
And the VP he picked sort of demonstrated a completely different alignment. I mean,
00:08:35.220
Mike Pence versus J.D. Vance, it was really like the GOP Inc. versus the new right or the tech right.
00:08:43.480
Well, I wouldn't even say it's the tech right. I think, I mean, the tech right thing is something that's put
00:08:47.840
on. That's like a thing Steve Bannon likes to push, I think, because there's a tension between Steve Bannon and
00:08:53.860
the tech right with J.D. Vance. I think it's complicated because he's come out, like, in favor of Lina
00:08:58.040
Khan and people like this who want to regulate the hell out of the industry. But I agree, you're right
00:09:02.960
that the difference between VP picks is really important. And the first one was
00:09:07.660
Trump, like, I want approval from the elite. And I don't know why he wanted that, but you could tell
00:09:12.920
that he really did. And so he picked an establishment guy and he wanted establishment
00:09:17.000
approval in the press. This new term is like J.D. Vance is an anti-establishment guy and he blew up the
00:09:22.700
press. He blew up the press room. He's now got like Mike Cernovich in there asking questions
00:09:26.560
alongside the New York Times. It's like, Trump knows that he will never have establishment
00:09:31.520
approval. And so that actually was a huge mistake on the side of the establishment. They should have
00:09:36.320
brought him in instead of trying to ice him out because now he doesn't have anything to gain from
00:09:41.320
working with them. He has a lot to gain from destroying them. And that's what he set out to do.
00:09:46.440
He's completely surrounded now. And it's this weird phenomenon with people who hated him in the
00:09:50.700
first election cycle. I mean, J.D. Vance, Elon, RFK, you know, it seems like his entire administration
00:09:56.840
is just former adamant opponents. I don't know that J.D. Vance was ever an adamant opponent. I think it's
00:10:04.780
like he has some comments that he's made and whatever, but he wasn't like, I thought he led
00:10:08.520
like the never Trump movement practically. Yeah. Cause he wrote Hillbilly Elegy and then he did
00:10:14.880
this like apology tour for everyone who voted for Trump. Just for clarification here. So I don't
00:10:20.520
look uninformed. Here are some quotes from J.D. Vance about Trump in his early days. He called
00:10:26.040
Trump America's Hitler. He called him an idiot. He called him reprehensible. He called him cultural
00:10:31.340
heroin. He called him unfit for office. Okay. So I don't know anything about that. I know that he,
00:10:36.560
the Hillbilly Elegy thing. I know that he's defended Trump supporters and things like this. I don't know
00:10:40.260
how committed he was to like the sort of never Trumper cause in the beginning. It's definitely true
00:10:45.040
that the others were against it. But even Elon, I feel it's like, you have to think back to that
00:10:50.580
period of time. And I give people all sorts of grace because at that period of time, we were living
00:10:56.340
in, like, a one-party state, which became very apparent the moment that Trump entered office
00:11:00.560
and the whole entire deep state apparatus rose up to prevent him from doing anything and then tried
00:11:07.020
to put him in jail for the following four years. And I think that was really the radicalizing moment for
00:11:10.560
most reasonable people. It was like, you maybe didn't support him in 2016. You maybe didn't vote
00:11:14.480
for him in 2020, but what happened after 2020 was really scary. That was like the coordinated
00:11:19.740
tech thing to de-platform him. Then the government went after him, started putting his allies in
00:11:23.840
prison, and tried to put him in prison. I think that he would certainly be in prison at this
00:11:28.220
point, or on the path to it, if he hadn't won. And that's really scary. And so it's like everything
00:11:32.040
the left is saying about him. I've not seen that from him, but I have seen it from them. And the sort of
00:11:38.740
jig is up in that respect. Like you can't really hide the fact that you tried to put the
00:11:43.160
front-runner presidential candidate in jail, like in the middle of an election. So are you going to
00:11:47.920
hypothesize where the left goes in response to this? Well, I think they have two choices. I think
00:11:53.300
the first choice, no one wants to hear this, but I think that Gavin Newsom starting his new podcast is
00:11:59.820
really interesting. He is a total sociopathic political creature. He's like a classically presenting
00:12:05.120
sociopathic politician in the late 20th century model. Yeah. Yeah. He's like a Bill Clinton kind
00:12:12.160
of person. That's exactly what came to my head: Bill Clinton or Hillary Clinton. Well, he feels
00:12:15.900
more like a mayor out of Gotham City. I'm just a poor schmoe got lucky. I wish I could hand out world
00:12:24.360
peace and unconditional love. Intimidate me. Bully me if it makes you feel big. I mean, it's not like you
00:12:33.740
can just kill me. Actually, it's a lot like that. Like to me, I grew up in San Francisco,
00:12:41.980
like right next to it. I grew up sort of around, like, just steeped in his lore. It
00:12:47.600
just feels comic book-ish. Yeah. I think though that he's really smart and really underestimated.
00:12:53.760
And the fact that he's now speaking to, like, Steve Bannon on the second episode of his
00:12:57.740
podcast is really fascinating to me because what I saw while I was watching the clips of that was
00:13:02.560
he had clearly never, ever in his life had strong pushback on any of his ideas, even on the issue of
00:13:10.080
was the election stolen? Regardless of what you think about
00:13:14.640
that question, that's not a question that Newsom has ever had to answer
00:13:20.520
before. He's never had to beat back the argument that it was stolen. And so to have him there
00:13:25.180
laughing it off and pivoting in a smart way, he's also learning. The fact that he's even talking to
00:13:29.700
Steve Bannon means that he's learning from the culture. His son is obviously red-pilled and
00:13:34.140
he's like, holy shit, why is the youth Republican all of a sudden? And so he's trying to talk to these
00:13:38.500
people. He's going to learn or whatever. He'll still be a sociopath. He'll probably be a centrist and
00:13:43.160
he'll have some better signaling. Or what I think is much scarier and maybe more likely is the Hasan
00:13:51.380
Pikers of the world. The pro-Luigi Mangione left, the populist, radical, yay-murder
00:13:58.780
left, the pro-Hamas left. And I think the Hamas thing is less important, but the reaction to
00:14:04.060
Luigi Mangione has really frightened me because it's so earnest and so deep. And it also crosses into the
00:14:11.240
world of Trump supporters. Anytime it's a left-wing policy and you start seeing Trump supporters talk
00:14:15.480
about it like, oh, that's a real policy, that's like a Bernie Sanders-Trump overlap kind of thing.
00:14:20.720
And I think that rather than put up that little, I forget his name. He's the ex-Parkland kid who
00:14:25.900
became a leftist activist and had the pillow company briefly. And now he's like a DNC chairperson.
00:14:33.560
So rather than that little dweeb, if they put Hasan Piker up there, who has a lot of energy,
00:14:38.440
is good looking, is very, I wouldn't say he's masculine, but he peacocks masculinity.
00:14:43.400
And he has a lot of young male supporters and he's a total monster, like an actual earnest,
00:14:48.860
like the kind of guy who would have put all of us in prison, had all of us killed in a communist
00:14:53.080
revolution, like they might win. So those are the two paths. I think the energy is actually on
00:14:58.900
some form of the populist left side and we'll see what happens.
00:15:03.220
That's really interesting. Okay. So I'm just sort of thinking through this in my head. If they go
00:15:07.220
the Hasan Piker path, I actually think that would lose them support from the mainstream
00:15:13.900
institutions in the same way, like when Trump ran the first time and the Republicans had to sort of
00:15:18.840
like re-coordinate because, you know, he's supported, like, the kill... He said,
00:15:26.120
Sure. But what has the left not normalized? Give me like a crazy left-wing policy
00:15:31.000
that they have not normalized. I mean, open borders now is a normal policy. The transing of
00:15:36.640
youth. Women, let's say biological women, competing against biological men in youth sports. Like
00:15:41.720
these are all things that 10 years ago, we would have been like, Oh, that's crazy. Free speech is no
00:15:45.880
longer. It's like, we must do government censorship, right? Like that's a mainstream Democrat position.
00:15:51.720
I do not trust that there's anything that they won't actually normalize. And so I think
00:15:58.220
you might be right. That is really scary. A world where that becomes normal.
00:16:02.040
I feel like your average American, though, if you get out of the coastal cities especially, is not cool
00:16:08.660
with that stuff. And I think that whether or not we see a radical left versus a reasonable and
00:16:14.540
corrected, like market corrected left depends on trust in institutions. I think that the reason why
00:16:19.560
Luigi Mangione is seen as a saint is because there is this fundamental sentiment that there is no law.
00:16:28.080
There is no order. There is no justice. If you are not wealthy, if someone steals from you, if someone
00:16:33.460
attacks you, if crimes are committed against you, well, sorry, nothing's going to be done. And I think if
00:16:40.380
the Trump administration manages to restore some sense of faith in institutions, like, Oh, now we
00:16:47.380
actually prosecute people for breaking the law, that would be kind of huge. So I'm really watching
00:16:56.420
closely to see what happens, the fact that he's doing the things that he said. On, like,
00:17:01.980
immigration, for example, he really didn't in the first term. And in the second term, it seems like
00:17:08.400
it's all he cares about. Well, he's doing a lot, actually, but this is something that
00:17:13.120
he promised that he seems to be taking seriously, even deporting Mahmoud. What's his face? The Columbia
00:17:19.000
guy. Yeah. Yeah. Who married, and has done nothing but organize anti-USA protests. He's been here
00:17:26.740
for four years. He's only participated in anti-American protesting. It's like he's being
00:17:33.220
deported. And that's a difficult move for Trump for a variety of reasons. Cause I think there's
00:17:37.500
actually, once someone gets a green card, due process enters. It should be harder to
00:17:42.540
kick them out rather than if they were just here on a visa. It seems like he's just doing it. You know,
00:17:46.720
the optics there are not great for him and he doesn't give a shit. He's like, this is the kind
00:17:51.340
of stuff that I talked about. If you hate the country, you're going back to where you came from.
00:17:55.500
If you don't belong here, you're going back to where you came from. And these are things that,
00:17:59.440
you know, people have a harder time talking about online, including friends of mine who are, like,
00:18:05.640
the more centrist people, more thoughtful libertarian-type people. But when you just talk to anybody who's
00:18:10.700
regular, they're like, Oh yeah, why would we want that person here? Obviously back to Syria or wherever the
00:18:15.920
fuck you came from. It sounds like he likes it there better. Why would we want him here? Why do we
00:18:19.960
want him? Yeah. That seems like active enemy propaganda, like, in our school system.
00:18:25.540
Yeah. I think actually just maybe today you tweeted something like, your minimum requirement
00:18:29.880
for citizenship should be that you want to be an American. You like it. There was something along
00:18:34.900
those lines. You should love us and want to be us. Like you should want to become what you see here.
00:18:40.520
It should not be like, Oh, that would be a great country. If only I could change it and make it like
00:18:44.820
a little more Islamic. No, we're not doing that. That's what we're doing here. Yeah.
00:18:50.140
Go back to where you came from. So would you like to see Sharia law in Canada replace Canadian law?
00:18:55.120
At some point it will, you know, because we are, we are, we have families. We are making babies.
00:18:59.180
You're not, your population is going down the slump, right? One day we can have a Muslim majority
00:19:04.700
nation here in Canada, right in your face. All of these people who point out that you have these big
00:19:09.740
protests going on in California, where people are protesting being sent back to Mexico with Mexican flags.
00:19:18.440
I mean, there's a broader thing here too with like, I feel that way about Israel and Palestine too. It's
00:19:23.140
like my problem with it is I don't want to see either of your flags in my streets. Like I, it's an
00:19:29.040
American flag or no flag is kind of how I feel about it. I just don't care about either one of these
00:19:34.660
flags. You're perfectly nice people, I'm sure. Definitely what happened on October 7th was disgusting.
00:19:39.460
Like, definitely, it happens a lot. Terrorism is bad. All true. I don't want your, like,
00:19:46.420
multi-thousand-year-old blood feud being litigated in the streets of New York City. It's like, I just don't want to
00:19:52.040
see that. And this is the problem with immigration. And this is, I think the reaction of the average
00:19:56.800
American who is looking at this, like, why the fuck are we even talking about this? This has nothing
00:20:01.580
to do with us. I don't want to have to think about this. I don't care. And Trump is just,
00:20:06.160
he has like an instinct for that. And he just talks to the people who no one else talks to,
00:20:12.200
which is most people, by the way. Yes. Well, I mean, it's not just him
00:20:17.240
mentioning 90, 10 issues. I'm hearing it more and more, even on issues when he doesn't plan to
00:20:22.500
support them. Like with daylight savings, when he was asked, will you finally get rid of daylight
00:20:27.780
savings? He said, well, this is more like a 50-50 issue. I'm not going to touch it. No matter what I
00:20:32.900
do, someone's going to get mad. It's so refreshing to hear a president look at what the
00:20:40.040
majority of reasonable people want and try to get that because it seems like we haven't done that
00:20:44.640
for a really long time. It's not ideological at all. That is the thing that people get so wrong
00:20:49.560
about him is he's actually, I think the most pragmatic president we've had. You see this on
00:20:55.580
issues of things like trade, and also, I mean, any policy position he has,
00:21:00.560
it's never about, what should the world look like? It's always about, like, well,
00:21:05.520
what is the landscape and how do I get the best deal possible based on what people all seem to want?
00:21:11.900
He's all about making deals with people. He loves striking deals between people who don't want the
00:21:16.800
same things. He loves brokering those kinds of deals. And that's just unlike, we've never seen
00:21:21.500
anything like that for better or worse. It's just a new thing. And I find the honesty of that
00:21:28.660
refreshing. So I want to transition from this into the second part of the very first thing you said,
00:21:34.120
which I thought was just really interesting is the conversations that are happening on the ground.
00:21:38.520
The nature of them is really changing. And you mentioned it a little bit here in terms of like
00:21:42.140
what people are saying. I know from my experience, I got this email chain from my Stanford MBA
00:21:47.940
class, and all of them were, like, panicking, panicking, panicking. I was, like, the one
00:21:52.380
person who just went on it and started MAGAing. And I probably really burned my
00:21:58.140
chance of ever getting a job through that network just by being like, Hey, if you ever want to talk
00:22:01.800
to somebody with the opposite perspective. Newsflash, narrator's voice:
00:22:07.420
they did not. But at the same time, you know, we have people on our show all the time
00:22:14.680
who have nothing to do with politics. Like I'll invite somebody on. Cause he runs like a statistics
00:22:19.680
channel on fertility rates in like Eastern Europe or something or a religious thinker. And like
00:22:25.840
And before the recording turns on, it's always like, oh my God, I'm so glad, like, I wish
00:22:30.740
I could move to America or I'm so jealous for what you guys are getting to go through right now.
00:22:34.040
Or, like, is it filtering down from the top within Silicon Valley? Like, in
00:22:43.680
the intellectual class of Silicon Valley, cause we're really connected to them. I'd call it like the
00:22:47.060
EA-osphere, like the former effective altruist community. Like, the top intellectuals who we have
00:22:52.180
connections with are moving more centrist on this sort of stuff when
00:22:57.300
previously they would have been hyper reactionary against it. But I feel like the rank and file
00:23:01.100
still think they're supposed to hate Trump. Like that's sort of what I'm seeing.
00:23:04.940
That's the culture of, there's this thing that is, like, the aesthetic of thoughtfulness.
00:23:13.120
You have your centrist people, your YIMBY-type people, any kind of wonkish
00:23:20.660
policy person, your former EA type people, your rationalists, they care more about projecting
00:23:27.040
a sense of their own personal thoughtfulness than they do about securing high level goals.
00:23:33.540
(I truly believe this) for the country, like, let's say, positive growth, borders, law and order,
00:23:39.640
things like that. They don't care about that as much as how they come off to their friends
00:23:43.480
and things like that, which is a weird thing to say, cause they're rationalists and they're not
00:23:46.400
supposed to care about that. But that is truly my read of most of them, even the ones that I like
00:23:49.860
and am friendly with or whatever. I think that's what's happening. And there's nothing about Trump,
00:23:56.180
the aesthetics of Trump, there's nothing reasonable about them. He comes off way crazier than he
00:24:01.160
actually is. So the way that he just like, even on tariffs or something where it's an issue that you
00:24:05.500
could actually get behind, the idea of reciprocity in trade, he just every day is announcing something
00:24:11.460
new. And so if your aesthetic is thoughtfulness, well, that's not thoughtful.
00:24:15.640
He didn't think it through. He's changing his mind. He changed his mind five times. What does
00:24:18.640
he really believe? And I look at that and I have to do the math. And I've now known him, not
00:24:23.620
personally, but I've watched him for, what, 10 years almost. And it's like, oh, he's creating
00:24:30.960
leverage out of nothing in advance of some kind of trade negotiation or deal negotiation that I
00:24:35.500
don't even know anything about right now. The other day I saw him throw Google under the bus for
00:24:40.080
something. And I just thought to myself, like, man, I wonder what he wants from them. Like,
00:24:43.160
maybe that's what's happening, right? Like, like, I don't know what's going on over there,
00:24:46.880
but probably something is going on there. And so that's how I approach Trump is just the principle
00:24:51.800
of charity. I just assume that he's not deciding, oh, I would really like to tank the economy
00:24:58.240
today by doing something crazy. I would like to just tank the stock market or whatever.
00:25:02.220
My assumption is there's a plan and I just don't know what it is and we can judge him for it in,
00:25:06.980
you know, six months or whatever. And it's like, we can change whoever's in charge.
00:25:11.020
Anyway, I was talking to a New York Times reporter today and she was like, well, what do you think
00:25:15.260
of, like, what he's done to the economy? I'm like, he didn't plan to tank the economy. And
00:25:19.820
even if he did, he meant for it to be short term. Like his goal is to make the American worker feel
00:25:25.280
more secure, and also just to shore up manufacturing security. And that is a goal.
00:25:32.720
It's like, we just fucking forgot that that's an important goal. COVID happened and proved that
00:25:37.740
that's an important goal. It proved that it's not just a thing that you should care about.
00:25:41.900
If you are this plebe who is saying, oh, I wish that I could afford a home, and
00:25:48.280
a rich person is like, you idiot. That'll never happen again. That's not what we do here anymore.
00:25:52.640
We don't give you like great middle-class jobs. It's no longer just an issue for those people.
00:25:57.840
It's an issue for all of us. When you have a country like China controlling so much of the
00:26:01.020
manufacturing, and then they're also manufacturing viruses that they're then releasing. And then
00:26:05.260
they're hoarding things like PPE. And that is something that was not nearly as bad as it could
00:26:10.060
have been. But I think that the takeaway from that has to be, oh, wow, that could have been easily
00:26:15.360
so much worse. It could have been a little more lethal and that would have been way more devastating.
00:26:20.480
And we were way not prepared for it. And so I think about it in those terms. Like,
00:26:26.540
how could 2020 not have changed your mind completely about things like domestic manufacturing? And Trump
00:26:33.200
really cares about it. And we'll see what happens. Interesting. So there was something you said there
00:26:39.280
that aligns with one of our recent theories that I thought you might find interesting to pull on. So
00:26:43.580
we were looking at why American conservatism, like Americana conservatism,
00:26:47.640
is the only group, really, other than Jews, that stays above repopulation rate fertility-wise when they
00:26:53.920
get wealthy. And what we pointed to was the truck nut fertility thesis, which is to say within Americana
00:26:59.660
culture, there is this idea that if some sort of culturally dominant force or respectable force
00:27:06.540
wants to force something on you, your reaction to that should be reflexively reactionary. Like,
00:27:13.060
put truck nuts on something because it's not respectable. Yeah. Put the little naked girl on
00:27:18.080
the thing, the Hooters chicks, you know, like be offensive in your existence. And this is what, for Trump,
00:27:27.460
authenticated him in a large part of America's mind rather than undermining his credibility.
00:27:34.640
And there is, with these individuals who live their lives just to signal, I'm a good person,
00:27:41.020
an incapability of recognizing this. I think that's true. I think that the way he talks,
00:27:48.140
even just the cadence and the strangeness of his vernacular is all signaling to
00:27:54.100
a segment of the population that is considered not elite by the elites. And when people like my parents
00:28:04.020
heard it, all they really registered was like, oh, this guy hates all the same people who I hate.
00:28:10.400
They don't care that he was rich. They don't care about his dumb real estate deals in New York
00:28:16.200
City, which by the way, like, does anybody think that there's no corruption in real estate in New
00:28:19.740
York City? Like the only way to do real estate in New York City, like they don't give a shit.
00:28:24.360
They're just like, he's going to throw a grenade at the machine, which I can't stand. And if he doesn't
00:28:30.100
do that, we're going to have a problem. And then he didn't do it enough. And there was a problem
00:28:33.160
in the election, is my read of the 2020 election. It was like, he didn't do enough of the shit that
00:28:38.000
he said he was going to do. But I think, yeah, the way that he speaks, the offensiveness, everyone thinks
00:28:43.020
that's fun. The supporters think that's funny. And they register it as, oh, he has just as
00:28:49.860
little respect for this system of morality, the elitist morality,
00:28:57.400
as I do. You know, these are people who grew up before white privilege was a phrase.
00:29:03.700
In the early nineties, you still had politically correct language. And there was
00:29:07.140
this idea where the average, like, working middle-class white person is like, what do
00:29:14.480
I have that black people don't have? Like, no one has ever given me anything.
00:29:20.160
Why do I have to have this like reverence for this idea that the black person is persecuted
00:29:26.060
or something in 1992? They don't believe, they just never believed that because their lives
00:29:30.600
weren't that good. And they were really hard and the government wasn't giving them anything.
00:29:34.480
And so to show disrespect for that system that never, in their minds, gave them anything
00:29:40.320
is absolutely a part of his credibility. They're like, oh, he gets it. That's a guy who doesn't,
00:29:47.100
I think what's really fascinating is that it, I think that that also gave him credibility with
00:29:52.140
like the tech bros, I guess I'd call them. Like the Silicon Valley VC crowd has always had an
00:29:56.560
intentionally contrarian streak to it, but it's almost like, like for me, for example, I wasn't
00:30:02.240
pro-Trump his first election cycle. And I see it now as like maybe internal cowardice or
00:30:07.280
something like that, but it took me a while to recognize the contrarianism in what he was doing.
00:30:13.200
And that it aligned with the value system that I, you know, purported to have.
00:30:19.240
I don't agree. I think that it was more a matter of for, for the people who are maybe more famously
00:30:25.600
pro-Trump, like David Sacks and Marc Andreessen, that's probably true. But for the rank and file,
00:30:30.740
like all of the, like the Mark Zuckerberg and the Google people calling up Trump and everybody was
00:30:36.340
donating to his inauguration parties and stuff, that was much more about the fact that tech had tried for years to
00:30:45.400
be a part of the elite, and succeeded to a certain extent. When Trump was censored while he was in
00:30:51.040
office, they succeeded. And remember, it wasn't just the tech platforms
00:30:56.520
or the speech platforms. It was like every tech company cut Trump off and was in lockstep with the
00:31:02.800
Democrat elites who were in power. It was a very scary moment in American history, I would say.
00:31:07.340
Oh yeah. That was like straight up. Like we were teetering on the brink of real authoritarianism at
00:31:11.160
that point. And, and I would say that most tech people were aligned. And then what happened was
00:31:16.680
four years in which it became absolutely certain that the Democrats were going to do everything in
00:31:20.620
their power to dismantle that power. They were never going to be aligned with
00:31:24.120
tech power. It didn't matter how much the tech elites peacocked the same values and pretended they
00:31:29.820
cared about the same things. Like, it was just never going to work because the Democrats do not
00:31:34.780
want competition in power and tech was becoming too powerful. It was powerful enough, for example,
00:31:39.380
to silence a president. The Democrats saw that and they were just as nervous as the Republicans.
00:31:42.760
They were like, Oh my God, if you can silence the president, who is actually the powerful person
00:31:48.360
here? It's not even the Democratic Party all of a sudden. And so suddenly, they were out of
00:31:52.920
alignment, and the backlash against the Democrats was, I think, totally expected just in terms of like
00:32:04.100
a read of the power structure. You think the Democrats started,
00:32:04.100
cause I, I personally didn't see any of the Democrats really targeting tech institutions
00:32:08.100
leading up to the election. I remember them being really happy when they were banning stuff,
00:32:12.220
other than with Elon. Which election are we talking about? This last one. Like,
00:32:16.240
Zuckerberg didn't really, you know, they were fine with him. You have antitrust legislation
00:32:22.440
targeting like every major tech company in the Valley. You have a global trade war targeting tech out of
00:32:28.640
Europe that our administration not only did nothing about, but abetted by sharing information
00:32:33.140
with the Europeans. They were constantly dragging tech people before Congress
00:32:38.240
to yell at them and talk about whatever the issue was. They were talking about new taxation stuff.
00:32:42.940
They were talking about, you had people like Elizabeth Warren, who talked about, I don't want
00:32:47.120
to get that wrong. So I don't want to say who it was, but there was a conversation about going
00:32:51.540
after unrealized gains that is very popular on the left wing. That would kill the entire
00:32:57.960
concept of startups as we know them, which is built on granting equity to people
00:33:01.920
who don't have money to pay the taxes on something like that, because the gains have not
00:33:06.340
been realized, which is like a basic economic concept. But the Democrats don't
00:33:10.640
care, because there is a huge part of that party... Many centrist Democrats would,
00:33:15.720
of course, care. And in fact, there were many Democrats in Silicon Valley who absolutely
00:33:19.420
cared and talked about it, but the Democrats have in their party, a group of people who do not
00:33:24.200
believe in like industry as a concept. This is like, they're very socialist and it's not a small
00:33:29.360
number of people. And so I think at that point when they were in power and all of this happened,
00:33:34.360
it was like, wow, if we want to keep on existing, we cannot work with these people here. Maybe
00:33:40.100
that party will crash and burn. And the new version of the Democratic Party, we will be
00:33:44.900
able to work with. And that's, I think what someone like Gavin Newsom is trying to demonstrate
00:33:48.800
even in his rhetoric, he's trying to demonstrate that. But with what we just saw, the Biden thing,
00:33:53.780
whatever was in charge while Biden was technically the president, that thing,
00:33:59.840
the tech industry just, it was straight up pragmatic. It was like, we will die if this is in
00:34:04.500
charge. So no, I think you might be right about it. It's very different from my
00:34:11.500
intuition of what was leading the tech community, which was, if I look at like the words of the tech
00:34:16.060
people like Mark Zuckerberg, it was the government forcing him to censor stuff in weird ways. So I
00:34:22.420
think the censorship, my read was censorship, handling of COVID and the trans stuff is actually
00:34:28.600
what turned the tech intellectuals away. But you see it as more just like pragmatic economics?
00:34:34.140
Well, it depends on who you're talking about. If you're talking about, like, again, the Marc
00:34:39.400
Andreessens and the David Sackses of the world, I think those are always thoughtful people who kind
00:34:42.480
of disagreed with that stuff. And their opinions are not that different now than they were,
00:34:46.840
I think, a while ago. If you're talking about corporate leadership, you know, the C-suites of all
00:34:51.760
these companies, I think it was just straight up economics. And for Mark specifically, I think
00:34:56.100
there was probably, there is something earnest to the evolution there. For someone like Jack Dorsey,
00:35:00.020
who I've covered really closely, I absolutely believe there was an earnest intellectual,
00:35:04.640
like philosophical development there. I think that he saw what happened during the Hunter stuff,
00:35:09.520
not even from the administration, but from his own team, from himself, from what he had built
00:35:13.920
as really antithetical to all of his like crypto libertarian values, which he has talked about
00:35:19.720
forever. I think he was horrified by what he had become. And I think he worked to put
00:35:24.620
Elon in power to end that whole entire machine. I love Jack. I defend him all the time. People always
00:35:29.780
get mad at me. But I think that his is the most
00:35:34.320
earnest evolution on the issue of the safety stuff. He would maybe even argue he never
00:35:38.780
evolved. I don't know, but he certainly, his company certainly had become something
00:35:43.460
really terrifying on the censorship stuff. He ran Blue Sky for a bit too, right? Like he was on their
00:35:50.040
board. Yeah, that came from Twitter. He was on the board. Blue Sky was a protocol developed by
00:35:56.760
Twitter. That's why they don't like own it now. It's not a part of the company. But then he was
00:36:01.600
on the board and then he left because he was like, well, this just became the exact same thing that
00:36:05.040
Twitter was. What are your thoughts on, because we're talking about like how it almost became
00:36:10.980
fascist in the US with the alignment of censorship and government. And yet I look at what's happening
00:36:16.280
in Europe and the shutting down of election cycles, the extreme censorship. Do you just think
00:36:20.720
Europe is cooked or like? Yeah, I mean, this is a good example of, we were saying, oh, well,
00:36:27.600
not everyone in the country, like most people in the country aren't going to stand for these crazy
00:36:30.660
ideas. That can be true, as it is true in Europe, that most people in Europe believe that you should
00:36:36.840
want to become European. You should want to be, if you're German, you're coming to Germany, you should
00:36:40.180
want to become German. You should want to learn German. You should want to integrate with the German
00:36:45.780
population and ultimately be a part of the German nation. You can all believe that and still have
00:36:51.700
people in charge of your country who are doing things in the opposite direction because they've
00:36:56.520
just, they've really seized power, and democracy doesn't matter at that point. So the only thing I
00:37:01.440
think that is going to stop what's happening in Europe is some kind of, I mean, it would have to be
00:37:09.180
a major change in the political structure even. Like an end to the EU? I don't think that's enough.
00:37:15.260
I think it's got to be on an individual country level. Like, you could have political change
00:37:18.720
in a country like France such that they would just leave
00:37:22.380
the EU. I think the EU doesn't matter as much as France leaving the EU matters. But I think,
00:37:28.660
right. I feel like it's over for Europe. I don't think that they have, I don't think they're going
00:37:34.460
to do it. There's no mass deportation coming out of Europe. And just on the demographic,
00:37:38.640
the demographic question alone, Europe 20 years from now is going to be such a different
00:37:44.620
world. You're absolutely right. I mean, one thing I point out is that
00:37:49.240
24% of Germany's population either immigrated or is descended from immigrants who arrived after the 1950s.
00:37:55.960
Well, close to a third of Canada's population is from a different country,
00:37:59.880
not even descendants. Like, like. Immediately? Wow.
00:38:03.220
Immediately. It's 20-something percent. I'd have to Google it really quick.
00:38:06.680
But it is an extremely high number. That's like a lot of people in your country
00:38:11.120
who are not from your country, and who may or may not want to be a part of your country.
00:38:15.220
One day we can have a Muslim majority nation here in Canada, right in your face.
00:38:19.700
Of Canada. There's been one, like, gambit that I don't understand why Trump hasn't pulled. And
00:38:24.160
it's like really surprising me actually. Why not just like go to Alberta, go to, there's one other
00:38:29.660
province that would probably flip. And provinces can secede from Canada just by a popular vote.
00:38:35.340
Why not just say, hey, you want to join the U.S.? I don't think they have those numbers yet. I
00:38:43.240
think that's like the most likely province, but they're not quite there. And I go back and forth
00:38:46.740
on whether or not Trump even actually wants Canada to become a state. I think he'd love it. It'd be
00:38:52.520
fine. He'd be open to it. But I think the reason he's talking about Canadian statehood is just to
00:38:57.320
demoralize Justin Trudeau. I think what he really wanted was to get rid of Justin Trudeau. And that has,
00:39:01.660
he succeeded. Justin's now out and he just resigned today. And I don't know. I don't know
00:39:09.400
that he thought much more about it other than that, like, this really works for
00:39:12.300
me rhetorically and it really works against him. It makes him look
00:39:16.420
like a total loser. And he's just going to keep hammering it because it made Justin look
00:39:21.300
impotent. That makes sense. I would like to see a push on that. That'd be really cool.
00:39:30.180
Simone, you've been quiet here. A push on Canadian statehood?
00:39:33.380
Well, no, a push to take the most economically productive regions of Canada, just take their oil regions,
00:39:38.360
because they already don't want to be part of Canada. Canada established, when the whole Quebec
00:39:41.940
thing was going on, that you can just secede. And Canada has been using these regions' resources to
00:39:47.560
fund the rest of their stupidity. I kind of think Canada is on a long-term path to American
00:39:52.680
integration. The way I see it, it's just a slow cultural, economic, you know,
00:40:03.260
integration until there's just, we forget why we're not even the same country. And then it just
00:40:06.880
kind of happens. I don't think it will happen like this. But then also I would say like, if the
00:40:11.220
demographics totally change and Canada becomes a very different place, I don't know what that looks like.
00:40:15.060
And that could happen because this country is a country that seems to hate itself. It's a country
00:40:19.120
that seems to not want to be Canada anymore. And that is what we're seeing in Europe too. And that
00:40:23.020
is maybe the fundamental thing of our era that I don't understand that the weirdness of our era is
00:40:28.080
like what seems to me to be a pervasive self-hatred. In America, we now have room to not be that.
00:40:34.900
We used to have to be that, culturally. All we have now that everyone else does not have
00:40:39.040
is permission to love ourselves. And they don't have that in Europe. And in fact,
00:40:45.060
when I was abroad just about a month ago, a few weeks ago for a conference, I was in London.
00:40:49.300
And that is the thing that everybody kept saying was like, man, I wish that we had that over here.
00:40:53.980
I wish that we had people who loved it that much over here. It seems so fun and exciting to be
00:40:58.560
over there right now. They didn't even care about the policy. They only cared about just like the
00:41:03.500
permission to be excited about being alive and being your nationality. And yeah,
00:41:10.020
they truly just do not have that there. And we do. And that's precious right now.
00:41:15.260
Yeah. It's funny that you mentioned that. Cause I was talking with some reporters about the
00:41:18.820
pronatalist conference, and I was like, at the last one, really the most shocking thing about it is
00:41:23.100
that everyone... it was the first time in a long time I've been with a group of people where everyone
00:41:27.140
was happy to be alive and excited about the future, even though they think it's bleak. And I think that
00:41:31.920
the reason for this is just cultural evolution. The dominant culture in the world right now
00:41:37.960
is the urban monoculture, as we call it, which is a very low-fertility cultural group. And to convert somebody out of their birth culture,
00:41:41.900
you need to disillusion
00:41:46.360
them with their starting cultural tradition. If I want to convert somebody out of Americana culture,
00:41:52.560
if I want to convert somebody out of German culture or British culture into the urban monoculture,
00:41:57.380
I need to cut them off from their family, you know, convince them their parents are horrible
00:42:00.920
and convince them that the culture itself is horrible. That's how I deconvert them. And as such,
00:42:05.840
I need to convince them that they are horrible. And a lot of cults do this, you know? And so I think
00:42:10.480
it just sort of spiraled out of control. And then everyone was like, why do we all hate
00:42:14.980
it? Like, why am I supposed to hate my ancestors and our tradition and our civilization?
00:42:21.840
Yeah. And then it turns out when you just refuse to do so and a bunch of other people say,
00:42:27.180
yeah, I don't, I'm not doing that either. And you talk about it out loud. The hysterical screaming
00:42:34.340
eventually dies down because it no longer works. It's like the behavior only persists for as long
00:42:40.440
as they get a positive reaction from it, and eventually it just dies down. I kind of think
00:42:45.880
of it like this: I thought 2017 was the peak. And then 2020 happened. And I thought,
00:42:51.680
holy shit, worse. Okay, it keeps getting worse. But there's this idea in behaviorism
00:42:57.240
called behavior extinction. And the idea is that, you know, people will act out to get a reaction.
00:43:03.560
And then when you stop reacting, they don't stop acting out. They actually
00:43:08.480
act out much more at first. It's like they have to do much more to get the same reaction. And then
00:43:14.500
that works. And so they have to do much more. And the behavior extinction happens when they act out
00:43:19.860
as much as they ever acted out. And then they don't get the reaction. And then it dies.
00:43:25.080
And that behavior, the sort of woke behavior, is broken. And part of that sort
00:43:32.140
of constellation of bad ideas was this expectation that you hate your culture. In America, at least
00:43:39.240
that no longer exists. That behavior has gone extinct. Maybe there
00:43:44.980
are people out there that have those ideas, but I'm allowed to not, and nobody can stop me and nobody
00:43:49.780
even really cares. Which is why I find so tedious the people who are still doing anti-woke
00:43:53.860
content in 2025 as like their whole beat, which is like every day it's woke whack-a-mole. It's like,
00:43:59.560
look at this blue-haired idiot telling me that white people are bad. It's like, no one cares
00:44:03.320
anymore. I don't care. It doesn't matter what that idiot says. They don't have power over me
00:44:07.000
That's actually one of the things I wanted to ask you about, which is a lot of energy and time went
00:44:12.180
to that sort of resistance. And now you have a whole bunch of people who really shouldn't
00:44:17.940
be thinking about that, aren't talking about that anymore. I'm really curious to see, because you
00:44:22.100
have such a finger on the pulse of the cutting edge, both societally and in the tech world, in some spheres,
00:44:28.240
what you're seeing is like new dialogues and new obsessions and new themes that are emerging
00:44:33.300
that people are talking about and obsessing over and thinking about how to solve now that they're
00:44:37.600
not trying to fight sort of progressive overreach. I think that whenever Trump is in office,
00:44:43.900
he casts this, like, crazy smokescreen. I'm trying to find the right
00:44:52.100
metaphor for this, but he just makes it hard to think about anything else. And so I think a lot of
00:44:56.980
people are distracted by him and I'm trying my best to sort of be like engaging with the culture,
00:45:01.340
but to not be distracted by him. Like, I'm sort of really refusing to be upset ever by anything
00:45:09.160
that's going on. And whenever he says some crazy thing about tariffs or whatever, I'm just going
00:45:13.220
to wait a few days and see what we learn. And people get mad at me for that, but that's just
00:45:17.520
how I'm going to move on. But I think a lot of people are distracted by him. I think on the far
00:45:21.500
right of politics, on the far left of politics, there's total collapse and confusion. Not even the
00:45:26.020
far left, let's say the center left, that's where there's collapse, confusion, disillusionment,
00:45:29.460
sadness. They don't know what to be. And they're just not being productive at all. On the far left,
00:45:33.640
there's excitement because the center left is dead. They have no competition and they're gaining
00:45:37.940
followers. Then on the far, not even the far right, the whole spectrum of the right, there's a huge
00:45:41.900
war for what it means to be right wing because Trump is bizarre and he's not the future of the
00:45:46.220
party because no one is like him. No one else is as pragmatic as him. Even the Trumpians,
00:45:50.560
like they're Trumpian. He's not Trumpian. He's just Trump. It's a very different thing.
00:45:54.200
There's no philosophy there for him at all. So whoever comes next, that's the war. And you see
00:45:59.380
it, I think, on both the economic side where there's a conflict between the Bannon types and
00:46:04.740
the tech right or the Elon Musk types. And then you see it on the social side where there's also
00:46:09.940
a conflict between Elon Musk, who's like Genghis Khan, it's like how many kids is he going to have
00:46:15.320
with random people, and the Christian right, the people who pretend they're the Christian right,
00:46:19.940
the trad right, which started as a meme and feels more real to me by the day. I think there's a lot
00:46:25.100
of interesting, just no one knows what it's going to become. And there's like conflict there. And
00:46:29.280
without a common enemy on the woke left, it's becoming much more vicious. So that is going to
00:46:36.100
be distracting right wing people. And I think what's happening is it's
00:46:44.620
just going to be culturally quite chaotic, because this is also happening at the same time that media
00:46:44.620
fragmentation has happened. So all of these things would have happened naturally even with
00:46:49.400
standard media that was closer to what we saw like eight years ago, with a few big giant tech
00:46:55.000
companies and then the mainstream media, but it's way more fragmented now. So no matter what you believe,
00:46:59.520
you can hop into a place that confirms all of your biases, shares news with you that is true,
00:47:04.080
but super radicalizing about what you believe. And so everyone is like very different and becoming
00:47:09.680
more different. And yes, I think chaos for a while, unfortunately. Okay. Here's a question that
00:47:15.100
leads from that, that you might have an answer to, because I asked Simone about this. So Hasan Piker
00:47:18.600
is the most popular Twitch streamer. If you look at the most popular long form podcasts,
00:47:24.160
eight out of 10 of them are right wing. Why do you think the podcast scene has gone right, or right
00:47:30.640
wing people succeed in the podcast scene and on YouTube as well? And the left is becoming focused
00:47:37.240
on things like Twitch and TikTok. I have no idea. I don't know. I've even, I had this idea. Yeah,
00:47:45.460
no, I don't know. I, I actually have no idea. I just know that it's true. And I don't know that,
00:47:53.020
I don't know Twitch. I'm not familiar enough with Twitch to know who the other popular people are.
00:47:57.400
I know that a lot of what they do is, I mean, they love to fight with each other and go on
00:48:04.820
each other's shows and attack each other. That used to be popular on YouTube and it's not really
00:48:08.220
as much anymore. And that was at a time when YouTube was more left wing. I don't know if
00:48:12.720
there's something related to that, that it's all more, he's just like a good WWE kind of star.
00:48:18.500
I don't, I don't know. That brings me to another point where you're like, you know,
00:48:22.860
people used to fight when you had the lefties in control. Something that you mentioned as part
00:48:27.140
of the narrative, which is personally not something I'm seeing. And I've sort of taken
00:48:30.660
it to be like a left wing gaslighting because, you know, they do a lot of this with media is that
00:48:34.980
the right is now fighting each other. I just personally haven't seen that much of the right
00:48:39.860
fighting each other. I mean, you see it even with just like the Babylon Bee talking about
00:48:44.360
anti-Semitism and getting attacked for it by all sorts of Israel-skeptical people.
00:48:49.660
Where you really see it breaking down is on the question of Israel and Palestine,
00:48:52.900
which is like a shadow war, a proxy war, for all sorts of ideas about nationalism
00:48:59.820
and the influence of Jewish people and things like that. And I try my best to just stay out of that
00:49:06.260
entire thing. Because I think, one, it's not what I focus on; I focus on America only. And then two,
00:49:12.000
it just seems like there's no way to enter that world and not become like a way scarier person.
00:49:18.940
I think that you just become the things that you fight, and everybody there that I
00:49:22.420
see fighting, I don't want to be like. So I just try my best to stay out of it. I worry that that
00:49:26.440
fight that's butting up on the right is going to kind of overtake us all and we'll be dragged into it
00:49:31.120
and have to pick sides or whatever. But for now, I don't think we have to, but I do see a lot of
00:49:35.200
fighting on the right. I think it's on the question of Israel, Israel's influence, the influence you see
00:49:39.460
Bannon really trying to gin up fights between the, what he calls the MAGA right and what he calls the
00:49:44.860
tech right. I don't believe in that distinction. Maybe it will become a more serious
00:49:49.320
distinction in the future, but he for sure feels a conflict there. I just saw him on the Newsom
00:49:55.620
podcast talking about the tech oligarchs and he hates the influence of tech people on Donald Trump.
00:50:02.680
I think this Elon thing really, really, really bothers him. And he's going to stoke those flames and
00:50:07.000
stoke those flames until he gets back into a position at the right hand, I think. But yeah,
00:50:11.460
those fights, I see a lot. I think the left is correct about those. Though maybe... I
00:50:22.000
actually haven't seen the left talk about this as much. Yeah, it's interesting. My read of Steve
00:50:22.000
Bannon, I could be super wrong about this, is he's just like deep state slime and like everybody
00:50:25.660
recognizes that now. And he's mad because Elon is showing that he was the Elon of the first
00:50:31.300
administration and he did effing nothing. Right. Exactly. He failed. He ended up in jail.
00:50:35.580
And that, by the way, is what happens. If you fail at this game, you go to
00:50:39.880
prison, like you lose your life. That is what they proved. And that's why the stakes are so high,
00:50:44.720
right? Like we, everyone knows. Elon's going to prison if he doesn't nail it. If this does not
00:50:48.780
succeed, we all are going to jail. Like that is what is going to happen. They say it, they actually
00:50:54.880
just fucking say it. It's like, they lost the game fair and square. And their response to that
00:51:00.080
is, we should do communism. And that's the scary thing that's hanging over
00:51:05.280
us. And that's why I worry about people like Hasan, because I think that he's the only
00:51:08.380
one who's being really honest about his intentions. Yeah. Well, and people find that
00:51:14.020
honesty very attractive, very appealing. I mean, for the same reason, they love Trump.
00:51:18.800
Which is a very difficult question you'll have to answer eventually, if that conversation picks up:
00:51:22.600
like, do you have to act like them or something? Because again, it's the
00:51:29.240
only way to survive a fight like this. I think as you become not even, I don't know if you can even
00:51:32.520
resist it. If you're in a fight with someone, I think you just start to become them. That is what
00:51:36.740
happened with everybody who fought the woke people is like, you just fought fire with fire. You
00:51:42.680
got obsessed with the culture wars; you became them. And I say this even a little bit, like, just self
00:51:47.180
reflectively. Like I became too much like them over the last few years. I don't regret it because
00:51:54.080
I don't know that I could have been effective in that environment if I didn't, but it is a sad thing.
00:51:58.980
I look back at my work from six, seven years ago when I was first like, why can't people just let me
00:52:04.360
write about this fucking Mars thing? Like I'm doing a podcast on Mars, like leave me alone. It's not
00:52:08.760
white supremacy. Shut up. Like I look back at that guy and I'm like, man, I have, we like, I personally
00:52:15.980
lost a lot by fighting this thing. And I want to like, get back to it, but it's hard. Like you
00:52:21.000
just, you change in this kind of, in this kind of idea environment. Yeah, I can see that. One thing
00:52:28.420
that I find really interesting to expand it a bit from what's going on in the tech world.
00:52:32.740
And I don't know how much insight you have into like a nerd culture stuff, which is another area
00:52:38.400
that we do a lot of stuff on, but the... How nerdy are we talking? Well, I was going to say...
00:52:44.240
Warhammer lore nerdy. Yeah. I played Magic the Gathering and I'm familiar with
00:52:48.380
Dungeons & Dragons. The breakdown of the video game, like the woke video game
00:52:54.500
industry in the United States has been pretty catastrophic to the extent where you'll have
00:52:58.060
like $400 million. And this has happened multiple times. It's about to happen again
00:53:01.760
with Assassin's Creed Shadows. $400 million go into a project and like 500 people buy it
00:53:06.100
or like a thousand people buy it. And it's destroying a giant industry that was like the major media
00:53:12.540
industry. And at the same time as I'm seeing this industry burning, I then see the news media,
00:53:20.380
like the traditional, it almost feels like a light switch flipped and everyone now is like,
00:53:25.400
oh yeah, they're not like meaningful sources of news anymore. Even the people who work at these
00:53:29.840
companies. Well, even the left, they're like, oh, they're too right wing or whatever. Like
00:53:33.580
the trust in media has totally collapsed. But I was going to say, as you were talking
00:53:37.040
about, I was going to connect it with media. And I just see that as opportunity, especially as
00:53:40.880
our AI tools advance and we are more able to create these things more easily
00:53:45.440
ourselves, you're just going to see new gaming companies and new people, or gaming people,
00:53:50.180
single individual creators of incredibly popular, beautiful games. You're already seeing that on
00:53:54.480
the media side. And that's maybe the future of everything is people doing more with less
00:53:57.540
and creating new institutions, new founts of status and power and wealth and things like this.
00:54:04.060
It's funny that you mentioned that we're actually doing that. We're building a video game company
00:54:06.860
right now with AI, which I'm really excited for because the big like institutional players are so
00:54:12.000
bad at using AI effectively. Yeah. I mean, well, it's so new. And no one is ever super
00:54:17.420
incentivized as a giant to embrace the new thing. Like, what is it, the famous innovator's
00:54:21.980
dilemma? You don't want to be creating the thing that's going to put your, your bread and butter out
00:54:26.120
of business. I guess they should be able to use new tools, but if they create tools that give
00:54:32.000
way more autonomy to the user, you just get to a point where you start to wonder what the point of
00:54:37.600
this thing is, which is like, you know, that's what kind of is happening with the sub stack ecosystem
00:54:43.500
and social media and stuff like this. It just got to a point where I remember the whole blue check
00:54:47.540
conversation was so crazy. And when Twitter was still Twitter and the first thing that Elon did
00:54:52.040
was he was like, I'm getting rid of this. No, you no longer get blue checks just because
00:54:55.360
you're like this anointed, you know, priest, high priest of the establishment. You're going to have
00:55:00.140
to pay for a blue check and anyone can have a blue check if you're a person and you give me money
00:55:03.560
and they were furious, but the shift had already happened. Like, no one took that blue check seriously.
00:55:11.940
It was just like, you don't deserve this because you have, you have 400 followers and you're crazy
00:55:18.280
and you happen to work for like Vox. That's, I am more influential than you. And that's crazy. Like
00:55:23.560
you don't, you're not more special. You don't deserve a new, a special suite of tools and access to
00:55:29.040
the administration or the art, what I consider my administration, which is whoever runs Twitter.
00:55:33.660
You're not, you're not better than me. And Elon just forced his company to
00:55:39.420
conform to the reality that already existed. And that's always a really interesting place to be
00:55:44.040
when, when there is something that everybody already knows to be true, but you can't say it,
00:55:48.060
or everybody's already doing something, but you can't, you can't talk about it or discuss it or plan
00:55:53.100
for it. And then someone just says, this is the real thing. This is what we're doing. This is
00:55:58.300
reality. And people love that. It's like, thank you for saying the truth. And also now
00:56:01.820
the world is different overnight. We're going to see a lot more of that in the coming years.
00:56:06.920
What do you think the things are maybe that, what is the one, what maybe one thing that you think
00:56:11.780
most people kind of think, but you can't say. About social media or the news? I think people
00:56:17.300
realize that we live in a post-job economy already and that also money doesn't matter anymore.
00:56:24.900
It's in the process of not mattering. And Elon has said this as well, like.
00:56:30.220
Because of AI. And also because of sort of debt cycles and inflation. And we're,
00:56:35.600
we're headed towards a jubilee that's not going to play out like a jubilee. Like social security is
00:56:39.680
going to fall apart, but then I think the government's just going to mint money to sort of cover it.
00:56:43.640
I don't think social security is going to be privatized. And then, I mean, even if that doesn't
00:56:47.440
happen. So even if our currency isn't massively devalued that way, people are already behaving
00:56:51.760
in a way, especially younger generations and especially people who aren't wealthy. And there,
00:56:56.020
there's a lot of them like, well, money just doesn't matter anymore. I'm going to be in debt
00:56:59.440
forever. I'm going to put it on a credit card because I'm never going to pay it off. It doesn't
00:57:02.320
matter what the interest is because it's never going to be paid off.
00:57:04.460
The way of relating to money has inverted. Historically, the core store of value was anything
00:57:09.600
that was fungible and had a set value like land or Bitcoin or house, you know, whatever,
00:57:14.260
or gold. Now, those things were great for storing value because the number of people who
00:57:20.120
wanted them was growing exponentially. But when the population begins to stabilize and then begins to
00:57:25.320
collapse, the core thing that we thought of as a store of value collapses. And then you could say,
00:57:30.840
well, you could put it all into the economy, like the S&P or something like that, but all of the
00:57:34.840
large companies are going to be the companies that are most at risk from the AI transition. So all of the
00:57:40.700
places where you could store value are very unsafe. Yeah. And I think a lot of people feel
00:57:46.940
like they've been scammed so many times that, yeah. I mean, I can't buy a house. I can't, you know,
00:57:51.300
I can't afford this. I'm in debt. Money doesn't matter anymore. Everything's going to be on credit.
00:57:55.000
And eventually we're going to say, I mean, this happened in Japan a couple of times where they're
00:57:59.780
just kind of like, huh, okay. It is always a part of the beginning of demographic collapse.
00:58:04.380
Yeah. What debt? It's gone. So yeah, money's going to go crazy. And I think a lot of
00:58:08.740
people are already- Well, even the idea of our national debt, like, I don't know how you pay
00:58:12.940
that back. So do you just have to say, I'm not paying that back? I think that that's actually
00:58:16.840
how it ends. Yeah. Well, I mean, our, and our currency can be massively inflated until it doesn't
00:58:21.880
really matter anymore. And, you know, the portion of our budget that it becomes is going to be
00:58:25.420
so silly because we get to just inflate it to high heaven. So yeah, that's, it's something that we
00:58:30.420
think about a lot because, you know, our kids keep becoming obsessed with money and
00:58:34.160
asking us, you know, how, how will I, how will I buy things? And where do I get my job? And we're
00:58:39.680
like, no, you won't get a job. That's never going to happen. And we're now trying to
00:58:44.740
figure out what to tell them. Help me understand. What do you think is going to happen rather than
00:58:47.640
having a job? You're going to have to create a niche personality that is capable of selling
00:58:57.580
gatherings, events, access, or artisan goods of some sort that people want, that a niche of people
00:59:04.440
wants. That's a fraction of the population. So what happens to everyone else? I don't know what that world
00:59:08.820
looks like. There have been worlds in the past. And I think that people just fundamentally like in our
00:59:15.580
generation aren't capable of accepting this. But if you go back to like the 1910s or 1900s, you know,
00:59:21.840
you didn't have a welfare, you didn't have social security, you didn't have Medicaid. When people
00:59:27.540
were poor, they just died. And we're going to go back. A lot of people are just going to die.
00:59:35.220
And the rest of us are going to be scraping by. And there's going to be a few people with an
00:59:39.280
astronomical level. There's going to be extended families sort of, I mean, we're going to, I think
00:59:44.740
we're going to see a bit of a return to feudalism where you're going to see sort of these walled gardens
00:59:48.340
where the top zero zero one percent is going to be. And then these ecosystems around it sort of in a
00:59:54.560
feudal format. I think that there could be a world in which there's basically UBI, but then you're
00:59:58.440
going to see systems kind of like in, you can see in prisons where your food is covered, your housing
01:00:03.560
is covered, but there are all these artificial economies where people are making food for each
01:00:07.480
other. They're trading services. They're cutting each other's hair. They're threading. They're
01:00:10.300
doing all this. So there's going to be a lot of like human to human service exchanges. And that's,
01:00:14.280
that's for the, everyone else who basically is just getting by sort of living in these localized.
01:00:19.020
And I'm not meaning even geographically, but often sometimes we call them like techno feudal
01:00:23.720
where like, you're sort of living based on your cultural subset, like, you know, the furries over
01:00:29.320
here and like the whatever, and they're all sort of exchanging services. And you're probably a member
01:00:33.240
of a bunch, like you're probably a member of your geographical one. And then maybe one or two
01:00:37.060
social-set ones, like the FLDS in this geographic area, or like EDM enthusiasts of this weird
01:00:44.400
subset in this local geographic area. And you're going to be exchanging goods and services based
01:00:48.940
on that. And the ones who really thrive and manage to gain wealth, who are not part of the 0.001%
01:00:53.560
who just maintain all the wealth in the future, you have to have some kind of celebrity status
01:00:58.220
where that 0.01% comes to you for their artisan vegan leather bike shoes, because you are the one,
01:01:06.040
the master, the one whose content they've watched, who, you know, they get obsessed with your artistry
01:01:10.440
or the fact that you can weave this rare silk fabric using the method used in the 1500s in some
01:01:16.320
obscure region in China, that kind of thing. You got to, you got to sell to the autists and you got
01:01:24.100
things will change. That's such a radically different world that
01:01:30.920
it's more like it's been for thousands of years. It's back. It's back to what things were.
01:01:35.840
We're in the aberration. We're in the sci-fi world.
01:01:38.680
What was that, like, I guess, the era of industrialization?
01:01:44.860
Yeah. The era of industrialization and the era of basically with the
01:01:48.840
It began with the British Imperial Empire and it is ending with AGI.
01:01:55.520
What I'm hearing is colonialism was not that bad.
01:01:58.400
It was a good run. It was interesting. It produced a lot of cool stuff. It's just also,
01:02:05.600
it's like a wave crashing on itself. It will produce something new.
01:02:09.140
For us, this is existential because, you know, we want to have a lot of kids. We want to create
01:02:12.360
a culture. We're building a religion. And so we've got to think about, you know, next hundred,
01:02:16.560
next 200 years. And I just think that things are going to be astronomically harder for the next
01:02:21.200
generation. And they're going to have to, it like you would need to in, in, in these earlier eras,
01:02:27.600
figure out how to support yourself, but maybe without a community.
01:02:31.480
Yeah. I mean, the future is here, just not evenly distributed. You can probably get by telling a
01:02:36.340
kid, you know, teaching them to get a job and do that and like do the traditional breadwinning stuff,
01:02:40.700
but it's going to be shakier. It's going to, it's kind of like a game of musical chairs.
01:02:45.340
There are not going to be that many chairs left. I think a lot of the people getting laid off from
01:02:48.880
corporate tech jobs and from the government are just, they're just never going to get jobs again.
01:02:52.540
Yeah. Well, the good thing is we'll be able to see that happening soon and maybe we'll have more
01:02:56.880
information. The numbers are so off. Like the reporting's really off. So I feel like we're
01:03:04.100
In, in, in employment. And, and do people have jobs anymore? I think that we are, we've gone off the rails.
01:03:10.600
Here's, I think one good indicator that they do have jobs is the protest sizes have completely
01:03:16.680
plummeted since the BLM era. Everyone was out of work. That's really the real story of
01:03:23.600
the BLM protests and why they were so big is because no one had anything to do. And right now
01:03:27.920
you're looking at protests at like Trump tower or whatnot. And even like they're, they're cleaning
01:03:32.520
up BLM Plaza and no one's even there protesting. There are.
01:03:37.880
I know. I think it's the hikikomori-ization that happened during COVID. I think COVID taught
01:03:42.800
people, you can just stay in your house and bed rot. And a lot of people never came out again.
01:03:47.080
Yeah. Why protest? I mean, they know it doesn't do anything. Like it just, there's, it's not as
01:03:52.800
fun. We see in our fan base who reached out to us sometimes and stuff like this, just never
01:03:58.320
interacted with another person really. Yeah. I don't know why it's super not rational, but I just
01:04:05.680
feel like this is not, it's not that bad. I think even AI. You're so wholesome. You're so like,
01:04:13.420
you're so, your interpretations though being heterodox and though being cutting edge, I don't
01:04:19.060
know how you managed to do it. They come across as so kind, so charitable, so optimistic. And I love
01:04:25.320
it. I love, I love your vision. I just think it's like, like even with AI, it's, did you guys see
01:04:32.460
this Sam Altman thing? He posted the AI telling a fiction story. No, it just wasn't. He's like, this is
01:04:39.260
great. And, yeah, people were like, it's there. It can do fiction. And I just felt
01:04:45.640
like it could do a lot, but it couldn't do what he said it could do. Oh, just you wait. But we're
01:04:50.600
talking, I don't think so actually, because what it's really good at doing is predicting the words
01:04:55.960
that are going to come next based on words that have already happened. And guess what? So are you,
01:04:59.840
so are you. We've done it. You believe, you believe that I am an LLM. This is, but we,
01:05:05.340
yeah, no, no, no. We've always done this. We think that there's abundant proof.
01:05:09.620
But this is what people have always done with technology. If you look back 150 years ago or
01:05:13.300
whatever it was, people are like, what is the future going to be? And it's like a crank machine
01:05:18.460
that gives you knowledge. It's like, how does the brain work? The brain, the way we described it,
01:05:23.680
we describe it in terms of the technology that we see. And so I don't think it's like this LLM is what
01:05:31.060
we are. I think that that is just how we're understanding ourselves because it's the most advanced
01:05:34.660
thing that we have. And we recognize that the brain is actually still the most advanced
01:05:40.900
piece of technology that exists in the world by far. It's very, very strange. And I don't
01:05:45.180
think it utilizes training data to come up with, I mean, like as we want our kids to come online,
01:05:50.820
Does that make it like a car? Like it's not the same. I think it's not the same. We can do things
01:05:56.740
Have you been in a car with a 16 year old who's just learning how to drive?
01:06:00.460
Uh, I've been the 16 year old who was just learning how to drive.
01:06:04.200
I note here that this is a bit different from times in the past where we have said,
01:06:10.060
Hey, humans work kind of like a machine. It would be more akin to if we said, Hey,
01:06:15.460
humans work kind of like a machine. And then we got the first fMRI images of a person's head
01:06:20.360
and it was filled with gears. Uh, that is basically what's happening as fMRI studies
01:06:26.820
on how humans process language get better with LLM related stuff. We have an episode on that right
01:06:34.700
here. If you're interested in this topic, uh, but I didn't want to derail this particular
01:06:42.420
The smartest humans can do things that AI can't do. I don't know if like, there's a few things
01:06:48.580
that AI is bad at because they appear to be using different systems in our brains,
01:06:52.460
like counting or something like that. But I don't know if like, when I look at the,
01:06:56.380
the language processing of AI, that seems better than 80% of humans, 70%.
01:07:02.960
It's not about, sorry, I'm getting a phone call right now.
01:07:09.920
Yeah, we'll let you go. We kept you for way longer than we said.
01:07:12.580
This is another fascinating conversation. I wish we could have done more about it,
01:07:17.840
Thank you for coming. Is there going to be another Hereticon?
01:07:20.200
I think there will be. Yeah. I think it'll, I don't know if it's going to be the last one
01:07:23.180
or not. Maybe three will be good. I feel like a trilogy is important.
01:07:30.000
Very similar to this show, but much more mild. And if you hate our AI takes, which like
01:07:34.780
a big part of our fan base does, cause we're very pro AI, like this guy is like us,
01:07:40.180
but with saner AI takes more optimism. This is good. If you like us, but you don't like
01:07:45.800
our pro Luigi Mangione stuff, he's like us, but not pro Luigi. Thanks guys. Thank you so much.