How The World Stopped Caring About The Environment
Episode Stats
Length
1 hour and 1 minute
Words per minute
181.16
Harmful content
Misogyny
15
sentences flagged
Hate speech
32
sentences flagged
Summary
In this episode, we discuss the end of climate change activism in the 21st century, and how the current climate panics are a symptom of a larger societal collapse. We also discuss some historical examples of when climate panic was at its height, and what we can learn from them.
Transcript
00:00:00.000
Hello, Malcolm. I'm excited to be speaking with you today because 2025 was the year that
00:00:05.340
climate activism died. And I don't think enough people are talking about it, but major activists
00:00:11.220
and donors and even states are dropping climate change like it's hot. So we're talking about
00:00:16.980
Matthew Yglesias, Greta Thunberg, Bill Gates, and even the state of New York, which is insane.
00:00:22.420
I'll go into it. I'm like, okay. I mean, it's clear. It's over. It's over. We're not trying
00:00:32.480
to make Fetch happen anymore. You only fight these causes because caring sells. All you activists can
00:00:39.360
go for yourselves. That was so inspiring. What a wonderful message. And in general, the sentiment
00:00:45.020
has shifted from saving animals and the earth to class conflict and human dignity. And this is
00:00:51.320
exemplified by folks like Kylie Jenner being criticized for swatching her animal cruelty
00:00:56.900
free makeup on her housekeepers. No one cares that it's animal cruelty free. They're like,
00:01:00.920
how dare you? She wants to become a housekeeper. She got it all cheap or whatever. This office that
00:01:08.400
was able to do animal testing really cheap. And then they found out it was just because they were
00:01:11.140
doing it on interns. I think there's that too. I heard about that separately, but this was,
00:01:15.300
I think a more prominent kerfuffle, but I just think it's really funny because she pays her
00:01:19.640
housekeeper. The housekeeper obviously consented to it. But I think just near, I think it's
00:01:24.560
exemplified because what really people are freaking out about is basically in any way using a paid
00:01:31.120
employee, I guess, you know, for anything and to not do it yourself. It's a fascinating phenomenon.
00:01:39.220
How hard, how fast, and how completely the climate movement was abandoned.
00:01:45.440
All right, that does it, stupid-ass rainforest. This place sucks. I was wrong about the rainforest.
00:01:52.840
We will be teaching our children about the climate movement as a historic movement.
00:01:58.980
Yeah. And speaking of historical movement, I think this, this is a, there's a wider question that this
00:02:04.620
development, the 2025 crash of climate activism brings to light, which is, this is of course not
00:02:11.200
the first panic we've had. And I think it's really important to ask ourselves in light of current
00:02:17.700
panics that are actively going on currently. People are like, we have to spend money on this.
00:02:21.940
We have to change our lives around this. We have to learn how to better divine what is worth our time
00:02:28.300
because I spent a huge portion of my youth dedicated to climate activism.
00:02:33.100
That's what her degree is in. Save the sea turtles. I like spent a summer volunteering to
00:02:37.380
help the baby sea turtles, make it to the ocean and measure the giant sea turtles. And by the way,
00:02:42.240
do you know the secret to stopping a giant sea turtle as they're making their way back to the ocean
00:02:45.920
so you can measure her? You stick your knee, you need two people, but you stick your knee behind her
00:02:52.320
front fin and then she can't move forward. And then that, that frees you up. Yeah. But it doesn't
00:02:58.520
always work when you get a big enough turtle, you just can't stop them. And at one point I just
00:03:02.840
watched that one of the Italian volunteers just ride her straight into the ocean. He just like gave
00:03:07.420
up and got on top of her and was like, I mean, like, you're not supposed to do that, but he's
00:03:10.380
just like, screw it. Like I'm going for it. And it was a beautiful thing to watch because there was
00:03:14.620
bioluminescence in the ocean at the time. And we're doing all this overnight. So it's like dark and
00:03:18.300
just watching an Italian ride into a glowing ocean on a turtle is one of those things you'll never forget.
00:03:23.640
But the point of you dedicating all that time is that we need to learn. We need to learn.
00:03:29.060
Yeah. I, yeah. And then I studied environmental business. I tried to build an entire custom
00:03:33.100
college major around this. I worked for Earth Day Network. This is the group of people who created
00:03:37.080
Earth Day. I worked for the American Council on Renewable Energy. I was extremely dedicated to this.
00:03:43.100
And there are a lot of people now that are extremely dedicated to working on, on AI related
00:03:49.920
apocalyptic organizations and, and working on all sorts of other panics. And so what I want to do
00:03:55.740
is also go through some historical panics and we'll also discuss climate change more. And we'll
00:03:59.300
also discuss the current crash. Cause I just think it's amusing.
00:04:01.640
Well, it's also interesting to talk about from the perspective of you and myself and the
00:04:07.620
figureheads of the pronatalist movement. Like yeah. Demographic collapse is another one of those.
00:04:11.340
We've got to freak out about this. And you know, we should, the point being is just because
00:04:15.540
something is like a big problem for civilization or something that's going to affect everyone.
00:04:20.640
It does not mean that you can turn it into a movement or that it will stay a movement in any
00:04:27.180
point in time. Like right now that climate change has crashed out, it doesn't mean that anything
00:04:32.080
existential has changed about, I mean, depending on like deforestation, for example, is like
00:04:40.200
Oh, and I'm going to get into it too, but also like a lot of major cities, Mexico city,
00:04:45.720
I think it's Sao Paulo, South Africa, for sure. They're just running out of water. They're just
00:04:49.900
drawing down their water tables. I mean, we've talked about Iran too. There is going to come a point
00:04:53.980
where they do not have groundwater to pull from. There's going to be nothing left.
00:04:58.100
So some of these are like existential questions for civilization.
00:05:03.000
Well, and so I, I think we, we, yeah, the, the, the larger discussion of this podcast is
00:05:07.480
when is it a justified panic when you need intervention? And I think a justified panic
00:05:12.180
is one where the issue is actually real and imminent. And one of the big problems with
00:05:16.880
climate change is that, yeah, climate change is real. It's just not as imminent as people
00:05:20.520
have repeatedly claimed. And, two, that it will not resolve via normal environmental or market forces.
00:05:27.600
And I think a big common factor, and we'll, we'll discuss this more too, is if something is,
00:05:31.580
is priced into the market. For example, with peak oil, this wasn't the big
00:05:37.480
panic that people thought it was because there are economic incentives for companies to build
00:05:42.160
innovations, to address diminishing oil supplies and to be better at extracting oil because people
00:05:47.460
are willing to pay for it. And as prices go up and people pay more for oil, people who are
00:05:51.660
entrepreneurial are willing to invest in technologies that allow them to extract oil
00:05:55.400
more efficiently. And so the problem essentially gets resolved.
00:05:58.820
But I want to, I want to take a second to just explain peak oil because some of our fans are young
00:06:03.480
and may not know about this panic. Peak oil was a panic that was had in the 80s that human
00:06:08.380
civilization would run out of easy sources of oil to tap.
00:06:13.980
And, and, and because we built a civilization on oil, which we have, that human civilization
00:06:20.900
would collapse at the point that we ran out of oil.
00:06:23.460
Now, what is, what is interesting about this? And the reason I think is, it's a good thing
00:06:27.700
to sort of meditate on as a crisis is peak oil, you know, within the time period must
00:06:33.660
have seemed like the most obvious problem there could ever be. Is our civilization
00:06:44.220
It runs on magic or something, right? Like obviously it's finite, right?
00:06:47.720
Yeah. And what happens when we no longer have oil, if civilization runs on oil, civilization
00:06:53.880
stops, right? Like this is easy. If, then, then this isn't even like climate change or
00:07:00.960
something like that. Right. You know, or like people could debate the science of it or something.
00:07:04.420
This is just like, if, then, and it went in as a panic and it went out as a panic. And I, I think
00:07:13.980
that something that you did not get to is, I don't just want to look at this from the perspective
00:07:18.180
of what are the panics worth having? I want to look at this from the perspective of how do you
00:07:25.440
create a panic? Like, why were people ever so panicked about the environment?
00:07:31.620
Yeah. Well, we can also, I'm going to, I'm going to walk through some of the historical and recent,
00:07:35.920
well, relatively recent moral and social panics, because I think that there are some major moral
00:07:41.400
panics that we really should have had that we never had. And pretty much every actual moral
00:07:46.840
panic that has taken place over the last hundred plus years was totally stupid and unjustified, wasn't
00:07:52.100
ever a problem. Like, why were you wasting your time on this? Well, like in the meantime, your lunch
00:07:57.160
is being eaten by Satan essentially. So we'll, we'll get into that too. But first, because I really
00:08:03.020
think people, a lot of people haven't realized that climate change is over as a cause now that it's,
00:08:07.900
it's over. No one cares about it anymore. I want to make sure people get the memo and I want to go
00:08:12.360
through the prominent detractors and, and, and major former activists who have now moved on
00:08:17.740
because I really want to hammer this home just to make it clear. Just, there are so many people who
00:08:22.860
haven't, they're, they're essentially like that person who's still living in the bunker, who assumes
00:08:27.180
that the nuclear apocalypse has taken place. They didn't realize that the world just kept going on and
00:08:31.020
no bomb actually hit. We've received letters from people written on like recycled paper using old
00:08:38.960
like billing envelopes that they didn't use for payment. So they're, everything's recycled and
00:08:42.960
they're writing to us being like, how dare you be in support of pronatalism when there are too many
00:08:47.920
people and the environment is in crisis, and they just didn't get the memo. So let's, let's just be
00:08:53.080
super clear about what has happened. So let's start with Matthew Yglesias. Though he later took the
00:08:59.840
post down. On December 28th, so just a little bit ago, he posted the following: 10 years ago, I believed that
00:09:07.100
catering to the views of youth climate activists was important, but because I was not paid by the
00:09:13.260
same people who are astroturfing these groups, I was allowed to learn that I was wrong and change my
00:09:18.900
mind in response to information. And then in relation to this, and this is all on X,
00:09:23.860
by the way, to give you, you know, context: he's like a major, I think, leftist thought leader,
00:09:30.560
like he co-founded Vox. He's that guy. Great. He's the founder of Vox. That's what we need to
00:09:36.300
know. Okay. Sure. Great. So in relation to this, Coddled Affluent Professional, that's the X username,
00:09:43.080
posted a few years ago, a full professor of physics offhandedly assured me that climate research was a
00:09:50.180
scam and almost entirely a product of bad incentives in the grant industrial process.
00:09:55.540
He knew of people doing perfectly reasonable modeling. They were forced out for lack
00:10:00.600
of funding because they didn't come to the correct conclusions. If you look at academic knowledge
00:10:06.260
production, it's actually a lot easier to engineer unanimity and consensus than you'd imagine.
00:10:12.160
And then on the subject of youth not actively caring about climate change, we have Greta Thunberg,
00:10:17.320
the mascot of climate change. She began vocally supporting Palestinians and criticizing Israel's
00:10:24.500
actions in Gaza shortly after the escalation of the Israel-Hamas war. That was a bad investment, huh?
00:10:28.880
In October, 2023 though. So like this isn't only 2025 when the shift started. When she first made this
00:10:37.480
pivot, she tried to shoehorn it into environmental justice, but I'm not so convinced, but specifically
00:10:44.320
in a December, 2023 Guardian op-ed, she explained that solidarity with Palestinians aligns with the
00:10:51.880
movement's longstanding support for marginalized groups facing oppression, and then argued that
00:10:57.420
there can be no climate justice without human rights.
00:11:04.220
Yeah, she's trying to be like, but well, but environmentalism is the same as imperialism,
00:11:09.060
occupation and humanitarian crises, which it is not, Ms. Thunberg. But she was trying to shoehorn it.
00:11:16.060
She was basically pivoting to those issues and trying to be like, no, they're all the same, but
00:11:19.900
they're not. Greta, they're not. But technically, she's still active in environmental causes today.
00:11:25.620
Her activism has broadened over time to encompass climate justice. But I think climate justice is just
00:11:31.000
your way of saying, I pivoted away from environmentalism, and I don't want to admit it.
00:11:35.280
And she did participate in an Extinction Rebellion protest in Venice in November of last year,
00:11:42.260
in which they dyed the Grand Canal green. And I looked at pictures of it, and I'm like, I can't tell
00:11:47.080
what was done. And yes, it was non-toxic dye, don't worry. Extinction Rebellion, which also goes by XR,
00:11:53.920
they operate as a grassroots non-hierarchical network. So basically, anyone can organize actions in its name,
00:11:59.860
and they follow these principles of non-violence, non-blaming individuals, etc. But they try to do
00:12:07.700
creative high visibility protests. They've done road and bridge blockades and occupations of public
00:12:13.420
spaces and sit-ins and glue-ins and lock-ins and symbolic stunts like fake funerals and dramatic
00:12:19.620
performances and targeted campaigns. But she's really not focused on environmentalism anymore.
00:12:26.300
Now, by the way, I was trying to figure out what she's doing post the flotilla, because that's
00:12:30.740
where she was living. That was her main gig. She's now living at an activist house in London.
00:12:35.220
This is what happens. You just sort of chill out. No, but I don't even know how she can come back
00:12:40.060
from this at this point. We have another episode of what's next for the left, because-
00:12:43.960
And it was made so much worse by the situation in Iran right now, because the fact that she has done
00:12:52.920
nothing about what the Iranian government is doing, which is exponentially worse than what Israel did,
00:12:59.660
it really goes to show, for her, the problem was the Jews.
00:13:03.240
When I say she's done nothing about the situation in Iran, I mean, she has not even made a single
00:13:09.380
tweet about it. She was able to mobilize an entire flotilla expedition when it comes to Gaza.
00:13:15.200
But when it comes to the Iranians, she can't even lift an actual finger.
00:13:20.040
Great name recognition. She can bounce back as soon as she wants to.
00:13:25.180
Oh, a reality TV show, some kind of paid media piece, writing stupid fluff that leftists
00:13:32.500
eat right up. Like, I just, I think she'll be fine. I just, it's clear that she dropped in.
00:13:38.520
And I think also Greta, she comes from a very media savvy family. She, she rode the wave of
00:13:46.140
climate activism and left it because she realized that the attention wasn't there anymore. And it's
00:13:50.100
just not. But you know, while she only really was a mouthpiece that didn't actively cater to real
00:13:56.020
environmental outcomes, someone who actually really did seem to seriously care about climate change
00:14:01.980
and trying to do concrete, meaningful things to address it was Bill Gates. But similarly to Greta
00:14:08.980
Thunberg, Bill Gates is slowly backing away from climate change work and shifting his attention to
00:14:13.860
human rights. And he still says climate change is a serious problem. And it is, I mean,
00:14:19.200
just not what people said it was. He's recently shifted from a climate disaster framing toward a
00:14:24.740
focus on human welfare and adaptation and realistic expectations about emissions cuts, which is
00:14:29.480
perfectly logical. But for contrast in his 2021 book, how to avoid a climate disaster,
00:14:35.600
because he wrote a fricking book about it. This is how focused he was. Gates framed climate change as
00:14:40.700
one of humanity's biggest challenges and focused heavily on reaching net zero emissions through
00:14:46.260
innovation and policy. He emphasized that avoiding more than 1.5 to 2 degrees Celsius of warming required
00:14:52.280
aggressive emissions cuts across electricity and manufacturing and transport and agriculture
00:14:57.680
and buildings. Now, this is in 2021, after, in 2020, we realized just how little of a cut in
00:15:04.700
emissions you get from literally shutting down the entire world because of COVID-19. So I think,
00:15:10.160
you know, it's, it, he really held to this for a long time, but then came the pivot in October,
00:15:15.020
2025. He published this memo and blog post, and you can look it up. It's titled "Three Tough Truths
00:15:20.940
About Climate." He argues that climate strategy should, quote, focus on human welfare even more than
00:15:26.880
temperatures and greenhouse gas emissions. And he said that too much attention has gone to near-term
00:15:33.200
emissions goals and doomsday rhetoric, and not enough toward improving lives in a
00:15:39.380
warming world. His ventures that were primarily based around climate change are now seeing
00:15:46.700
their funding and staffing shifted around. So his venture initiative, Breakthrough Energy,
00:15:52.360
and related clean tech are getting scaled back, while he's putting more emphasis on technologies
00:15:59.120
that cut emissions, but more focus on improving livelihoods. And he's also increased overall spending
00:16:06.980
on global health and anti-poverty work through his foundation. And he's positioning climate more as just
00:16:12.560
one major issue among several drivers of human suffering with the focus really being on, on human rights.
00:16:19.000
Now let's look at states. Let's look at New York, right? Because New York, for those who live
00:16:23.760
outside the United States, is an extremely progressive, leftist, environmentally focused state. And well,
00:16:31.080
they, they used to be extremely, extremely committed to climate change. Anyway, they're not exactly meeting
00:16:36.320
their commitments. So they once aimed to nearly eliminate greenhouse gas emissions. Their goal
00:16:41.940
was to do so by 2050 and get 70% of their electricity from renewables by 2030. That's four years from now,
00:16:48.760
but they are years behind on this. And their governor, Hochul, I don't know how to
00:16:55.400
pronounce her name, now argues that reliability and cost need to be prioritized, because obviously they do.
00:17:02.100
That said, she blames surging energy demand, high utility bills and a hostile federal administration
00:17:07.860
for this. She doesn't admit that it's just not practical. But beyond that, several marquee
00:17:13.740
policies have been delayed or softened, including regulations to implement the 2019 climate law's cap-and-
00:17:19.560
invest program, which would charge major emitters and fund clean
00:17:25.340
energy and efficiency investments. So her administration also postponed the all-electric new buildings
00:17:31.720
law, backed an offshore gas pipeline previously rejected on environmental grounds, and approved an
00:17:42.820
extension for a gas plant powering a Bitcoin mine. I think you'll remember New York was famous
00:17:49.220
for "Oh, they're going to take away your gas stoves." They've backed off all of this. And then even in the exact
00:17:55.640
opposite direction, the administration is courting energy intensive tech and industrial investment,
00:18:01.360
including a proposed $100 billion Micron memory chip complex expected to consume as much electricity
00:18:07.680
as about 1.5 million homes, even as the state projects electricity demand could rise up to 24%
00:18:14.640
by 2040. So they're just kind of dropping it. It's just like, I think it'd be really fun. And I could
00:18:19.940
see things moving in this direction is if the right decides to try to take this issue from the left,
00:18:25.060
like they did with MAHA. I mean, the two movements are really tight.
00:18:28.000
Well, one of our friends runs a nonprofit that has always worked with state-level
00:18:32.580
Republican policymakers, like state legislators on climate tech, not because it's a progressive
00:18:41.140
cause, but because there are a lot of practical economic and logistical reasons to invest in clean
00:18:48.180
tech. It's not just that. It's that as the Republican party has undergone its restructuring
00:18:54.180
and realignment, you've got major figures like Elon, at least he says he wants to be a centrist or
00:18:59.280
whatever now, but everyone knows he's a right-leaning individual, right? I mean, he's, he cares a lot
00:19:03.440
about the climate. Like you care a lot about like environmentalism. And I think a lot of, if you just
00:19:08.800
reframe it to protecting our hunting and fishing grounds, if you. Oh yeah. Like the old Teddy
00:19:14.760
Roosevelt conservation of like, I love nature. It's awesome. I want to go shoot some animals in
00:19:20.640
it and camp and have fun. And it's cool. If you, if you come out as, you know, with the campaign of
00:19:26.160
we need to, you know, do, do protection of wildlife and protection of wildlife from contamination,
00:19:33.740
I think move away from all the global warming stuff, right? Like that, that doesn't play to our
00:19:37.800
Republican base. So let me hunt and stop turning the frogs gay. Yeah. And I think that would appeal to,
00:19:44.120
a lot of the MAHA base. Yeah, totally. Yeah. That would appeal to most mainstream Republicans.
00:19:50.140
Yeah. And it would completely destabilize the Democrats. If the Republicans were coming at it.
00:19:55.320
To appropriate climate activism. That would be.
00:19:57.580
To appropriate climate activism. Well, that's, that's actually a thing because now climate change
00:20:01.880
has been made so uncool. There's this woman, a researcher named Clara Fang, who
00:20:09.720
surveyed 1,003 self-identified climate activists and found that they're mostly female,
00:20:16.160
non-Hispanic, white, progressive, middle-class, over 50, and highly educated. So what does that
00:20:20.860
mean? They're Karens. So like now, the only people who still
00:20:26.880
care about it are progressive old Karens. And, and Colin Wright on X wrote, it is increasingly
00:20:35.360
difficult to avoid the conclusion that such women are, are channeling the energy and protectiveness
00:20:40.120
that would ordinarily be directed toward child rearing into climate activism, treating earth
00:20:46.040
itself as their vulnerable child. And Joe Lonsdale, like the, the Joe Lonsdale wrote, climate activism
00:20:52.780
is a religion for midwits who want to feel intellectually superior, but are not. Women tend to be more
00:20:58.640
religious than men and will often seek it out in their lives, proceed accordingly. And then in response
00:21:04.520
to that, sentimental robotics, another user on X, did some napkin math: not
00:21:10.140
accounting for lost productivity, we could be looking at 12 to 15
00:21:15.780
trillion in capital expenditure for climate efforts since 2000. Imagine how much we could have done
00:21:20.860
with those funds. Not saying environmentalism isn't important, obviously improving air quality,
00:21:25.580
et cetera, is good. But for 15 trillion, I feel seriously ripped off. So Simone, who is Joe
00:21:30.560
Lonsdale? You said the Joe Lonsdale, who is this? Joe of, like, the Trump administration,
00:21:36.520
the famous investor, the, the, I mean, like, okay. Somebody tied to the Trump administration.
00:21:41.140
Yeah. He's, he's, he's a, he's a very famous philanthropist and entrepreneur and investor.
00:21:45.260
He co-founded Palantir. He's with the University of Austin. That's easy. That's easy. Okay. So the, the few
00:21:52.300
points I want to make here before you go further. One is, is that I think a big reason climate change
00:21:57.220
has dropped off, as you say, to a bunch of old women now, is that a lot of young men pretended to care
00:22:03.200
about the climate because hot young women cared about the climate. Yeah. Not anymore. They're
00:22:07.120
old now. You were one of them back in, back in the day. I'm old now. I'm a hag. You're hag maxing
00:22:13.760
now. I'm a hag maxer. Absolutely. Yeah. Nobody, but I'm caring for a child now, not the climate you
00:22:20.420
see. Actually, I even asked online just because, you know, we've got a high enough profile now. I was
00:22:24.760
like, is there any like quotes or tweets of like people thirsting after Simone? Couldn't
00:22:28.920
find any, right? I'm a hag, but I couldn't let it, let it be. I'm wearing my hag scarf. It's
00:22:34.520
perfect. My little hag gloves. Look at this look. Look at this look. What am I? I'm Whistler's
00:22:39.780
mother. Exactly. Exactly. With glasses, with glasses. By the way, people are so stupid. And
00:22:47.820
I cannot tell you how many times I've seen this. People appear to believe that rim width
00:22:53.440
is an indication of prescription strength. I know this as well. I've seen this as well.
00:22:59.420
They're very dumb, but they think I don't, I don't even need glasses. I don't, I wear them
00:23:08.580
Anyway, Simone, I think that this is hugely, like as these women get older and more undesirable,
00:23:17.540
the general public is going to become increasingly disgusted by their screeching at
00:23:23.840
society. I think there was a great... I mean, well, I think they're, they may be
00:23:28.000
playing a non-trivial role in the 2025 dropping-climate-change-like-it's-hot thing.
00:23:34.140
Right. Well, and I, well, the degree to which they do not control the narrative anymore is
00:23:38.540
really strong. The one thing that got me recently is, the last time I was looking at
00:23:43.480
it, it had been published for like four days, and it was Paramount Plus on YouTube for
00:23:47.620
free, the new Star Trek thing. And after four days of being live for free on YouTube, it was
00:23:54.380
like at 130,000 views, which is fewer views than we get on a normal episode.
00:23:59.200
Well, they even advertised it. Didn't you see that there was even a, a, an advertisement
00:24:03.900
for it? It's not like it was just put on. Cause I thought, well, okay, well, they must've
00:24:07.780
just published it. No one knew to even look for it. And the algo is really weird now.
00:24:11.780
So give them credit, but no, they paid for a million-dollar production with major stars
00:24:17.300
in it, right? Just nobody cares. The mainstream media is zombified. They're
00:24:24.500
working like it's 1990 and they don't realize that the, the fundamental economics have completely
00:24:30.080
changed. The media landscape has completely evolved. They, they're just going to run out of
00:24:33.880
money and die, but they are, they are dead men walking. That is it.
00:24:37.820
Yeah. Yeah. It is completely, whenever we deal with them, I'm always just like, this is
00:24:43.240
This is so stupid. And we do more work with the mainstream media that we cannot talk about
00:24:46.900
because of NDAs and stuff like that. Another thing about mainstream media talk about running
00:24:50.360
a movement: there was the recent thing where Nick, the guy who exposed the Somali fraud,
00:24:54.560
did an interview with a YouTuber. And then the YouTuber tried to cut it so that it made
00:24:59.300
him look like he said a bunch of stuff he didn't say. And Nick had filmed it all with his phone.
00:25:04.420
I don't know if he had done that secretly or whatever. And so it came out that the YouTuber
00:25:07.080
was fraudulently manipulating the tapes and it made him look really bad. But what, what
00:25:13.980
Nick Shirley. Yeah. The thing is that we've had to deal with that without being able to film our own
00:25:18.720
stuff. Like legally, I couldn't even, if I had filmed what happened during that viral
00:25:25.160
interview, I couldn't even share that with you. And what's even crazier about the viral
00:25:28.980
interview, the piece that that was attached to was supposed to go live in December and
00:25:33.120
it has not gone live yet. And we haven't heard anything from the team yet. So we think they
00:25:37.980
may have just dropped it rather than put in a disclaimer that their own person was wrong.
00:25:42.140
Like, I don't know like what they were going to do. They seem to not be aware that like we
00:25:47.280
might do a longer episode trying to dive into what, what happened with that, but just to cancel
00:25:52.180
an entire filmed project out of embarrassment is.
00:25:56.640
Yeah. It surprised me because from what we can tell, when journalists have run pieces on us...
00:26:08.680
Yeah. They, they, they perform quite well. And that makes us happy because we don't want
00:26:12.500
people to travel all the way from Germany or France or wherever, even if it's just New
00:26:17.880
York or DC to spend a day or two at our place and get a hotel and everything and have
00:26:22.340
it not yield an ROI. So we, we are sensitive to that. It is. Yeah. So it, it, it surprises
00:26:27.920
me that they invested in sending people out and then ultimately didn't run anything, but.
00:26:33.400
But I mean, even, even if they just make it a fully, you know, fair and non sensationalized
00:26:38.680
non-hit-piece coverage of demographic collapse, cause they have plenty of footage of us just
00:26:43.260
talking about the issue in a didactic manner, which was our point in the first place. We
00:26:49.860
don't ask people to do hit pieces on us, and we encourage them to make them interesting,
00:26:53.700
but you can just use us for responsible news coverage. And many people do. Anyway, let's,
00:27:01.040
let's move forward though, because I think that the bigger, the bigger question that we need
00:27:04.800
to be asking ourselves moving forward about panics is, you know, what, when should I actually
00:27:10.280
be changing my life and behavior? When should I be donating to a nonprofit about this? When should
00:27:14.240
I be caring? And when can I reliably understand that one, this may not actually be a real problem
00:27:21.260
or as urgent as claimed, or two, this will probably self-correct, or market factors will
00:27:27.680
self-correct this. It will be priced in and people will come in and innovate a solution in time.
00:27:33.200
So I want to share some unjustified panics. And there are really two types of unjustified
00:27:37.920
panics: either the problem is not real or not as imminently urgent as claimed, or the problem
00:27:44.400
is a self-correcting one through market dynamics or just through like natural self-correction. So
00:27:50.020
obviously, as we've just been talking about environmental doomsdays are just one of those,
00:27:55.240
the problem is just not as urgent as people say it is. What I didn't realize, because I feel like we've
00:28:00.020
kind of been gaslit about this repeatedly is how many times people have said, oh, you have like five
00:28:08.960
years left. It's all about to end. And then nothing happens. So this even goes back to, and I'm sure it
00:28:14.260
goes back earlier, but there was this Earth Day end of civilization prediction series in the 1970s,
00:28:20.060
where around the first Earth Day, public figures and some scientists warned of looming environmental
00:28:25.660
collapse within a few decades, including claims that cities like New York would literally be
00:28:29.980
underwater. Yes, we've had like hurricanes where there's been flooding, but they meant actually
00:28:34.180
permanently, durably underwater and, and that the world would face unavoidable global famine and
00:28:40.200
resource exhaustion by the end of the century. And then also media reports in the seventies,
00:28:44.760
based on some scientific papers, warned of an impending ice age. And this was right before
00:28:50.260
this like series in the nineties where everyone was talking about global warming instead. But they said
00:28:54.700
that aerosol pollution and natural cycles would cause an ice age. And they predicted, of course,
00:28:59.540
again, famines and societal collapse. Like, literally, Time magazine, which used to be big, for those of you who
00:29:05.700
are babies, and Newsweek, which was another really big publication, amplified this as the consensus
00:29:11.420
view, like, oh, everyone knows the world's going to end. And it never did. And even, even though the
00:29:16.860
scientific community was divided on this and then the trends shifted toward, oh, it's global warming
00:29:22.660
instead. Never mind. But every time this happens, they kind of just bury the "well, we were wrong" part of
00:29:29.640
this and just switch to a new form of apocalypticism. And, and while it's absolutely true that environmental
00:29:36.140
problems like pollution and loss of biodiversity are serious, the actual imminent collapse
00:29:43.240
scenarios just haven't played out, even in the eighties and two thousands. So after the seventies, there were all
00:29:49.020
these short-term climate apocalypses. Some high profile statements forecast that entire nations
00:29:54.520
would be wiped out by sea level rise around the year 2000, or that the Arctic would be ice-free
00:30:01.360
in summer by the early 2010s. That just didn't happen. But I totally remember, don't you?
00:30:07.000
Oh, the polar ice caps are melting. Well, there's a famous one where Greta Thunberg predicted,
00:30:10.760
and this was like 15 years ago, that in five years, the world was going to be flooded.
00:30:14.200
No, just like the number of times that they've done this. And then it's just all like, well,
00:30:19.000
like they just keep going afterward. But this makes sense to me when I think, you know, in the context
00:30:23.760
of it. You know, I worked for Earth Day Network, the people who started Earth Day and started one of
00:30:29.120
the earlier panics. A lot of these organizations raised a lot of money
00:30:37.780
in the world of academia, as was alluded to in those earlier tweets I read off. And as,
00:30:43.380
you know, the Earth Day Network raised a lot of money, then you have these sprawling nonprofit
00:30:48.080
and economic organizations who have a lot of money and have a very vested interest
00:30:52.400
in not being shut down when it turns out the world actually isn't ending. So they have to build a new
00:30:57.760
panic. They would either have to find a new cause, or they would have to find some new
00:31:03.220
way to justify fundraising so that they don't lose their jobs. And so I think part of why climate change
00:31:09.020
just kept sticking around was for a very long time, people were able to just kind of keep the
00:31:15.600
delusion going. But let's move on from climate change, because we've talked about it. I want to
00:31:20.160
talk a little bit. So I think that what we're seeing, and it is very interesting, is climate
00:31:24.860
change as a movement, if you contrast it with Greta Thunberg's current movements, for example,
00:31:29.480
right, and where the left seems to be going more, is structured quite differently. Climate change is an
00:31:34.720
apocalypticist movement, right? Like it is. Yes. Fix this or else civilization falls apart, right?
00:31:41.360
To drop an apocalypticist movement is, from a historic perspective, pretty rare. Even when
00:31:48.320
apocalypticist religions' predictions don't come true, the people involved often
00:31:54.880
end up just doubling down, right? Like it's a sort of hope springs eternal thing, but just
00:31:59.800
apocalypticism springs eternal. Yeah. Right. And it's so interesting, because I think right now,
00:32:07.560
the movement that's growing that is structurally closest to the old climate change movement is the
00:32:11.480
pro-natalist movement, right? Like in the same way that a climate change advocate might get excited
00:32:16.040
when they see the numbers being down again, I get excited. I'm like, ah, I made a good bet. My
00:32:20.400
predictions are right. The numbers are down yet again. But the point here being is that they've moved
00:32:27.220
from an apocalypticist movement to a movement that is much more... It's fundamentally revolutionary.
00:32:33.660
It's about upending the social order. Yeah. It's about, yes, upending the social order.
00:32:37.980
So revolutionary and religious. It is a structure of basically metaphysical beliefs around how the
00:32:44.640
world works. We go into a lot more detail in our video on Zohran Mamdani, because I think he's a very
00:32:49.200
good explanation of this. Well, yeah, it's when we first became aware of anti-colonialism as a concept,
00:32:55.140
because we just hadn't really recognized that it was an organized philosophy.
00:32:59.460
Well, I couldn't conceive of how the philosophy of anti-colonialism could view the Jews, the native
00:33:06.040
population of Israel, as the colonizers. And if people are like, well, they came there from Moses'
00:33:11.760
time, we know that the population was at least 50% the original population. So the native population,
00:33:17.320
after being removed by an empire, could be seen as colonizers by coming back to their original land,
00:33:23.320
right? They would post something like a meme of like Mount Rushmore.
00:33:29.000
This is a sacred monument. And you could post something like the Dome of the Rock on top of
00:33:36.440
the Jewish Temple, and it's exactly the same defacing. And yet they would see no equivalence. And so I
00:33:42.900
didn't understand that until I took time to understand colonialist theory and how the groups
00:33:49.800
are divided within colonialist theory. And colonialist theory does not really bear much concern for
00:33:57.820
historic realities. It is a religious framework. And I think that part of the community that
00:34:04.020
previously within this apocalypticist movement has gone down this new religious pathway. And then
00:34:09.940
the other part of the movement, the Bill Gates of the movement, right? They've moved in the pathway
00:34:15.860
that a lot of the traditionalist EAs started to move. You know, even they used to care about the
00:34:19.920
climate, but they don't so much anymore. And they've gone down the pathway of, well, I want to
00:34:26.580
like reduce the most in the moment suffering. I mean, if you point out that this is going to lead
00:34:30.980
to, and I think that this is where hard EA slash the sort of pronatalist ideological agenda really
00:34:37.240
contrasts with the agenda that these people have, right? Yeah. I think it's been laid very bare,
00:34:43.220
which is our plan is to attempt to use our resources, voices, power, lives, to build the
00:34:53.180
structure that future human civilization will be able to launch from. Like everything that I do,
00:35:00.420
I'm generally unconcerned with the life or suffering of any living human today, because there's going to
00:35:06.140
be so many humans in the future if we do things right, that I have a duty to plan long-term for
00:35:12.080
human civilization. And you see this with actors like Elon, like that's clearly Elon's goal as well,
00:35:16.780
right? Like his actions are not meant to relieve in-the-moment suffering.
00:35:20.900
Yeah. It's, it's a focus on long-term human flourishing. If you look at Bill Gates's actions,
00:35:27.480
it is immediate negative utilitarianism. And I, and I'd say selfishly even about reducing in the
00:35:34.540
moment suffering. And I think that's one of the core things that divides the intellectual left and
00:35:40.160
right right now. I think that that's the core question that divides which side you are on as an
00:35:46.460
intellectual. Is your core goal long-term human flourishing or is it suffering reduction?
00:35:52.820
If it's long-term human flourishing, you're a rightist. If it's suffering reduction, you're a
00:35:56.920
leftist. Yeah. I mean, especially when you combine that with how you define self, like is your point of
00:36:03.620
identification, or optimization, around those furthest from you culturally and familially,
00:36:10.680
or is it focused more inwardly in the circle? Well, no, no, no, no. The point I was making,
00:36:15.300
it was sort of an inversion of that point. I, it just wasn't connected, not inversion. That's
00:36:19.440
not exactly what I mean. The point I'm trying to make here is that the left right now has an
00:36:26.180
intellectual caste, Bill Gates, the soft effective altruists, those communities, as opposed to
00:36:31.840
hard EA.org, which we have. So the soft effective altruists in the bill, in the Bill Gates cause,
00:36:36.040
they are not bought into the religion that the, the masses and the influencer class believes.
00:36:43.720
The Hasans and the Greta Thunbergs and many of the foot soldiers of the left, the ones who
00:36:49.160
are at these protests and everything like that, what they believe is more like a very poorly thought
00:36:54.660
through religious framework, but it is one that they believe without logic and uncritically.
00:36:59.540
A lot of it's just based on avoided behavior and avoiding discomfort and focusing on
00:37:04.580
immediate optics that, that make you look good and win you popularity points, I think.
00:37:11.000
Right. But in the right, we have a mirroring structure, right? You have the
00:37:16.480
intellectual caste who will intellectually engage with cross-cultural religious anthropology and
00:37:22.080
topics and stuff like that. But you also have the religious foot soldiers who are just foot soldiers.
00:37:28.000
They're just operating off of a metaphysical framework that they have not deeply engaged with.
00:37:33.000
And I think it's important to recognize that and see how these two factions are changing in both the left and the right.
00:37:39.920
Sure. But let's move on. Other forms of panic that I think very, and I mentioned this already,
00:37:44.760
consistently don't bear out, ironically, are social panics. Just to give you a couple that have taken place in
00:37:51.460
America in the last 20 years, there have been several musical panics. Like throughout the last hundred or so years,
00:37:58.760
people have freaked out collectively around jazz and then rock and roll and then rap and hip hop. Like this is the end.
00:38:07.560
It's going to, it's going to cause everyone to go crazy and become evil.
00:38:12.780
I think, I think it's a, I think it's a symptom, not a cause.
00:38:15.980
And then there was a 1950s comic panic. Did you know about this? I didn't know about this.
00:38:22.980
Comic books were accused of causing juvenile delinquency and moral decay.
00:38:26.020
And it literally led to Senate hearings and the Comics Code Authority. Like there was actual
00:38:32.840
like legislation and major action to control the Feminist Frequency of that generation.
00:38:40.220
I think I kind of think so. And then there's the famous satanic panic where, you know, just people
00:38:45.620
thought that there were all these, these groups sacrificing children and stuff. I feel like it was
00:38:49.980
kind of an early QAnon kind of thing. There's a really great podcast called American Hysteria that
00:38:56.140
I think maybe earlier, the host of that did a podcast on the satanic panic. But if you want
00:39:01.860
to just hear about various things that people had moral panics about, definitely check out the
00:39:06.520
podcast, American Hysteria. I love it. She is delightful. Very, very leftist, but most of the
00:39:15.760
I love listening to far leftists. Yes. Then there was the, yes, as you alluded to the panic
00:39:20.480
around violence in video games, which totally like a lot of researchers looked into this,
00:39:24.820
tried to see a correlation between playing violent video games and expressing violent behavior.
00:39:29.360
And there just wasn't one; it wasn't borne out at all. And then there were two red scares.
00:39:38.140
Well, that's the thing. It's so funny. In the forties and fifties, what
00:39:42.160
people believed at these times, and this is when it actually wasn't
00:39:46.400
borne out, was that communists had infiltrated nearly every organization, and they
00:39:52.880
hadn't at that time. And now it's just so ironic. And like, no one bats an eye, you know, where,
00:40:00.480
where's a red scare when you need one. And what I think is so funny is there were all these moral
00:40:04.900
panics and throughout the 20th century, as these moral panics played out, people were increasingly
00:40:10.400
losing their religion, only sort of going through the motions of their religious affiliation and
00:40:16.160
cascading into soft and then super soft religion, as you define it in the Pragmatist Guide to Crafting
00:40:21.760
Religion, which is basically just not really following the rules anymore, not really making
00:40:26.500
any hard sacrifices in favor of your religion. And this led to genuine moral decay. This led to
00:40:32.440
genuinely people becoming less disciplined, having less inhibitory control, having more mental health
00:40:37.440
problems, not, not getting married, not, not successfully building careers and lives and
00:40:42.340
savings. And now we've ended up where we are and it's not good. And there, but yet there's been no
00:40:46.880
moral panic about that, which I think is very interesting. But then let's talk about, let's talk
00:40:50.720
about examples of problems that have been resolved by market forces. And I already talked about peak oil,
00:40:55.740
but you had the multiple sort of either population- or famine-based predictions. So
00:41:02.680
in the 19th and 20th century, you have this, this Malthusian, both like the Thomas Malthus driven and
00:41:08.940
then the neo-Malthusian predictions that population growth would cause huge famines and mass starvation,
00:41:15.740
but instead, yeah, because in the past it absolutely happened that when populations
00:41:21.600
grew too much, then there would be some kind of massive famine. But in this case, because a lot
00:41:26.380
of this happened right around the industrial revolution and huge technological breakthroughs,
00:41:30.160
you had agricultural productivity just expand dramatically. I think they call it the green
00:41:35.720
revolution, right? And basically everyone was okay. And then you still though, again, in the 1970s had
00:41:43.860
Paul Ehrlich publish The Population Bomb. And he predicted that literally hundreds of millions of
00:41:49.960
people would starve in the seventies, regardless of policy and that global death rates would climb sharply.
00:41:56.840
And that countries like India were essentially beyond hope. Like it's too late. It's all over now. And yet
00:42:01.940
instead, basically everything was fine. And there was no like desperate
00:42:08.800
rationing and huge death. And when there were famines, they were mostly driven by war and policy, not by
00:42:15.400
planetary carrying capacity, which was the argument that he made. I think also arguably COVID-19 was one of those
00:42:23.020
sort of would-have-resolved-on-its-own kind of panics. When you look at excess mortality in
00:42:32.080
the countries that implemented more restrictions, we've done another episode on this, they had,
00:42:37.640
generally speaking, higher death rates. And this is also true of states: the more restrictions an American state implemented, the
00:42:42.300
higher its death rate was. The states that implemented the fewest restrictions had the lowest death rates.
00:42:46.520
Yeah. So I think it, you know, that's, it's a really great example of a really serious panic that like
00:42:50.560
probably did more damage than good. But I also want to point out that there are absolutely justified,
00:42:56.280
justified panics. Oh yeah. Also, we have to think about the fact that the COVID-19 moral panic
00:43:03.740
and general panic caused so many people to lose complete faith in the media,
00:43:13.020
in their governments. Like you, you have so many knock-on problems from what happened,
00:43:18.500
but there were absolutely even environmental panics that were totally justified and caused people to
00:43:24.060
freak out and then solve the problem. Can you think of one that's environmental?
00:43:31.440
Yeah, you do. But yeah. So in the 80s, scientists realized, oh my gosh, there's massive seasonal thinning
00:43:40.160
in, in the atmospheric ozone layer over Antarctica, and that it was linked to chlorofluorocarbons,
00:43:49.800
I'm pretty sure they made that word up. That's not real.
00:43:52.480
I don't think the ozone layer is real. I'm a doctor.
00:43:55.720
But basically it, they realized it was a big problem. Oh my God, we're all going to get skin cancer.
00:44:00.560
And then they created the 1987 Montreal Protocol, and it had various amendments
00:44:07.400
that required a phase down and then near elimination of almost 100 ozone depleting substances,
00:44:14.100
including most CFCs and halons. They achieved basically almost total, I mean, 98 to 99%,
00:44:21.500
reduction in their production and consumption compared to their peak levels.
00:44:27.040
And now the ozone layer is on track to recover to pre-1980 levels by the end of the century,
00:44:33.860
which is really cool. Like we solved that problem. I think arguably also lead in gasoline,
00:44:39.260
it was like, oh my gosh, like this is causing people to go dumb. And then we're like, well,
00:44:43.880
let's take out the lead. And so we did. And that's great. I think the fears of nuclear war,
00:44:49.920
though elements of it were overblown. For example, a lot of people are like, well,
00:44:53.680
if there's some kind of nuclear attack, it's going to cause a nuclear winter and we're all going to
00:44:57.420
die. When like, then later, when you look more closely and you model a little bit better,
00:45:01.760
it would cause more localized issues, but not like a total worldwide nuclear winter in most cases.
00:45:07.220
But still, I think it was justified panic when people realized that countries were kind of,
00:45:13.320
they had their fingers over the nuclear buttons and like, hey, maybe we shouldn't just,
00:45:18.000
maybe this shouldn't be the way we communicate. And that, that caused a lot of, I think,
00:45:22.500
international social norms that turned countries against nuclear as the go-to war thing of choice.
00:45:30.920
Another one that was actually pretty justified, even though to most people who even lived through
00:45:35.700
it, including you and me, because we're olds, is Y2K. For those who are not aware, leading up to the year 2000,
00:45:44.640
people predicted that computers would fail at midnight on January 1st, 2000, due to this date
00:45:50.420
programming issue with how computers were originally designed. They just didn't put in
00:45:54.480
enough digits. Why did it feel so big when it was happening? It felt like COVID.
00:45:58.640
Well, yeah, people were creating bunkers full of food and they thought
00:46:04.060
planes were going to fall out of the sky. And this was because legacy systems stored years as two
00:46:09.140
digits, meaning that many computer systems would misinterpret 2000 as 1900. And that would break
00:46:16.340
functions involving a lot of comparisons and interest calculations and expirations and scheduling
00:46:21.400
and literally billions of dollars were spent on fixes. And some people like built bunkers and
00:46:27.360
stockpiled supplies. It's thought that $300 to $600 billion were spent on, basically,
00:46:34.500
I guess, weatherproofing us for Y2K. Uh-huh. Which is a lot. However,
00:46:41.040
for comparison, a major synthesis by the Climate Policy Initiative estimates
00:46:46.280
cumulative global climate finance of about 4.8 trillion US dollars between 2011 and
00:46:53.940
2020. So, you know, Y2K was nothing in comparison. And even updated data
00:47:00.440
shows that about 850 to 940 billion USD was spent in 2021, on the way to about
00:47:09.660
1.3 trillion per year. So anyway, like it was peanuts compared to what we spent on the environment,
00:47:15.000
but it was actually a real risk. And it wasn't something that was going to cause planes to fall
00:47:21.760
out of the sky. It wouldn't have caused people to not be able to get food, but basically banking
00:47:27.940
and power and telecom and air traffic and, and many key government systems like social security payments
00:47:33.260
would have actually not worked. So it would have caused real disruption. And we did actually need to
00:47:39.340
like, Oh, this is a problem we need to fix. Like it was justified. I'm saying it was justified.
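The two-digit-year failure mode being described can be sketched in a few lines. This is a hypothetical illustration, not code from any real legacy system; the function names and the pivot value are assumptions, and the "windowing" fix shown is just one common remediation pattern alongside full four-digit expansion:

```python
# Hypothetical sketch of the Y2K two-digit-year bug -- illustrative only,
# not code from any actual legacy system.

def legacy_parse_year(two_digit_year: int) -> int:
    """Buggy legacy behavior: assume every date falls in the 1900s."""
    return 1900 + two_digit_year

def fixed_parse_year(two_digit_year: int, pivot: int = 70) -> int:
    """One common remediation ("windowing"): values below the assumed
    pivot are treated as 20xx, the rest as 19xx."""
    return (2000 if two_digit_year < pivot else 1900) + two_digit_year

# The failure mode: a loan issued in 1999 ("99") maturing in 2000 ("00")
# appears to mature 99 years *before* it was issued, which is what broke
# comparisons, interest calculations, expirations, and scheduling.
issued = legacy_parse_year(99)   # 1999
matures = legacy_parse_year(0)   # 1900 -- "00" misread as 1900
term = matures - issued          # -99 years

fixed = fixed_parse_year(0)      # 2000 under the windowed fix
```

Much of the remediation money went into auditing and patching date handling like this across billions of lines of legacy code before the deadline.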
00:47:44.340
It was money well spent. I'm glad that we did it. Yeah. I mean,
00:47:50.220
because it would have sucked. Okay. A lot of people would have gone without their social security
00:47:54.700
payments and then had trouble getting food. And I mean, people,
00:47:59.760
maybe their entire savings could have been wiped out by market crashes related to the stock market
00:48:04.640
just totally going out of control. Like a lot of really bad things could have happened.
00:48:08.360
So it is very, very good that we panicked about that and took action. So you're a Y2Ker. You were
00:48:15.620
pro Y2K. That is right. I'm saying the Y2K panic was justified, or at least more so than
00:48:22.600
the environmental panics, right? Yeah. And so I think these are the common characteristics of
00:48:27.860
justified panics, which is, I think the biggest thing is they're not priced in. I think when something's
00:48:34.120
priced in, like, Oh, we're going to run out of oil, oil gets more expensive. Okay. Well then
00:48:38.240
the market responds by finding more efficient ways of producing oil.
00:48:41.300
Like eventually, you know, it's like, oh, we can't live in this place anymore, so we move
00:48:48.160
away. And then like, you know, we have to figure this out or like, you know, peak oil or like, Oh,
00:48:52.480
like humanity eventually just finds a new way. I mean, listen,
00:48:59.760
like when English colonists settled in New England,
00:49:04.960
there was like a mini ice age going on. It was super cold here. We've dealt with climate
00:49:10.180
change throughout human history. We adapt. I think it's one of those things that you slowly
00:49:14.860
adapt to and climate change was overwrought because they, they said it was a lot more urgent
00:49:20.140
essentially than it was. We basically have more time to adapt to that. So yeah, I think when you,
00:49:25.700
when you make it apocalyptic, it is unfounded when you're like, Oh, there's this urgent thing we need
00:49:31.860
to handle right now. And there's a very clear reason why at this point it's going to happen.
00:49:35.460
Like Y2K was clear. We knew we had a clear deadline. We knew exactly why it was clear like that. But
00:49:41.320
when it's more like, oh, my equation says so, that typically correlates with
00:49:48.820
it not playing out. So, what's your takeaway with how we should be signaling
00:49:54.180
pronatalism? Is pronatalism going to become the next big movement? Like, how do we,
00:49:58.340
how do we handle that? I think that it's, it's not, we shouldn't be framing it as this, like
00:50:05.520
this year, it's all going to fall apart. I think very similarly to it's, it's kind of like this
00:50:11.680
combination of, of climate change and Y2K of like, well, the writing's on the wall. Like
00:50:18.200
the way many countries' social services are set up requires this thing that is going to not
00:50:26.120
be the case anymore. And so we have to change it. Like the, the numbers will change and the
00:50:31.900
equation will break. It is very simple logic. There's nothing that's going to change the fact
00:50:36.540
that that's going to happen, but we, we do have time to adapt. And I think that's why it's important
00:50:42.400
for us to talk about this, but we shouldn't make it this immediate panic because it's not,
00:50:46.860
it's something we can plan for. And as you point out many, many times, like endlessly,
00:50:52.100
it's not going to change. We're not going to start increasing our, our output of humans. Like
00:50:58.380
the, the birth rate's not going to go up suddenly. We more have to just adapt the systems. We have to,
00:51:03.340
we have to analogously build the computer systems that can deal with the year 2000.
00:51:09.220
Well, this is why my framing of pronatalism is: our job is to replace the existing population,
00:51:15.920
right? Like when you change it from the framing of, this is cataclysmic,
00:51:19.920
think about society. I do point that out. I'm like, why don't you care given how cataclysmic
00:51:24.940
this is when I'm talking to reporters, but like the goal of the movement is not to prevent the
00:51:28.500
cataclysm. It's to replace the population today, which is kind of priced in. I mean,
00:51:33.060
one of the things I was talking to Simone about is, you know, how easy our kids are going to have
00:51:38.400
it dominating the future of human society. If you look at like 40% of kids in fourth grade can't
00:51:43.080
read now, right? Like one, the education system is failing because people are being idiots and
00:51:47.140
still sending their kids to school. And then like the, the upper middle class parents of students
00:51:51.260
who are still giving their kids a decent education are still, what was the word? Like
00:51:57.500
kneecapping, something like that. What is that? No, they're not letting their kids play
00:52:01.420
rough and tough. Well, they're not letting their kids. No, well, more importantly, they're not
00:52:04.300
allowing their kids to learn how to use AI or be on the internet or be really good with tech. So like
00:52:09.000
that's, you also don't want to undercut your kids by, by disempowering them in a tech enabled age.
00:52:15.840
Like you have to find a middle ground. So yeah, I feel like we have a major strategic advantage.
00:52:20.820
Then on top of all of that, even genetically speaking, smart people are just not having
00:52:25.600
kids much anymore. And in terms of getting into positions, there's going to be so few young
00:52:30.720
people as our kids are growing up. I mean, consider like getting into Harvard, right? Like
00:52:34.700
that was as hard as it could have ever been. Like when I got into Stanford or she got into Cambridge,
00:52:39.620
the years that we got into our generation, because that was that giant generation when not only was
00:52:44.120
a generation larger than any generation before it, but it was a generation that was also more
00:52:49.520
focused on getting into college than any generation before it. Now, like people don't want to go to
00:52:53.880
college at high rates anymore. And the number of people who even could apply is smaller.
00:53:00.060
So for them getting into these top schools, getting into top positions is just going to be
00:53:05.160
dramatically easier unless AI changes everything. And like, no human has a job.
00:53:09.260
Yeah. And so speaking about AI, actually, I just in like, okay, like last five minutes,
00:53:13.920
cause I have to go get the kids. There are current panics. I think some are justified. Like I alluded
00:53:18.660
to earlier, water shortages. That is an actual thing that people kind of need to figure out
00:53:23.300
sooner rather than later, because Mexico City is about to run out of water. Johannesburg is about
00:53:27.380
to run out of water. Cape Town is about to run out of water. Major Indian megacities like Delhi and
00:53:32.980
Bangalore and Chennai, Mumbai and Kolkata are all about to run out of water, as are a bunch of other
00:53:40.000
places, including Sao Paulo and Beijing and Cairo and Jakarta and Istanbul and Mexico City and London
00:53:44.840
and Tokyo and Miami. They're all in not a very good position. So I think that's one.
00:53:48.940
Demographic collapse we talked about. What about, where do you stand with AI apocalypticism?
00:53:53.460
Because it's not, it doesn't really fit neatly into any of my criteria. Like it's not necessarily a
00:53:59.940
social panic, but it is kind of a social panic. It should be treated as more of a social
00:54:04.700
panic. The Eliezer Yudkowsky "the murder bots are going to kill us all" take is unfounded, stupid. You
00:54:10.780
can watch our videos on why we think it's stupid. But that AI will completely disrupt the way
00:54:15.140
society and the economy work is something we need to be paying attention to. That a lot of people
00:54:21.080
will be out of the job in the short term is something we need to pay attention to and how we
00:54:25.100
work with AI instead of making ourselves a threat to AI like these jihadists do. You know, we need
00:54:31.220
to find a way you can look at our Sons of Man series on this, where we talk about how you can
00:54:37.460
build this durable alliance. We'll do a track on it eventually where I go deeper into it. But
00:54:41.600
the wider thing I want to talk to you about AI here is I think it's something that people get
00:54:45.700
wrong. And it's one of those really, like, dumb takes you hear all the time on the internet:
00:54:48.860
The reason why the billionaire class stopped caring about climate change is because now
00:54:54.480
they're all invested in AI and AI runs counter to climate change. A lot of people like Elon,
00:55:00.680
for example, he was pro climate and pro crypto at the same time, right? Like Bitcoin crypto,
00:55:05.400
explicitly, right? Like they don't care that their causes conflict with each other. That's not why
00:55:11.400
they left climate change. I don't even think it's tangentially related to why they stopped caring
00:55:16.500
about climate change. I just think that the climate movement lost its steam, mostly through
00:55:26.060
hyperbolic freakouts, whereas I think with the pronatalist movement, we've done a very good job
00:55:32.460
of localizing our hyperbolic freakouts so that we will see these collapses in places like Korea and be
00:55:38.960
able to point and say, look, I told you so in a way that is going to be hard for other people to miss.
00:55:43.600
Yeah. I mean, yeah. I think people need to look at the way I've moderated my
00:55:56.440
views: I need not complex modeling, but just very clear brass tacks. Like, this system requires X to
00:56:07.460
work. X is not going to be a constant anymore. We cannot expect X to continue. And therefore,
00:56:13.200
we cannot rely on the system anymore. Like, that is a very clear thing with demographic collapse
00:56:19.060
that I know we just haven't addressed and that, that we can't address. And therefore we have to
00:56:24.780
prepare accordingly. With AI, I really... I just don't... I think it's hard to prepare
00:56:33.320
because it's such an unknown unknown. Well, I mean, when I am developing for AI, I literally just
00:56:41.260
will ask AI, Hey, what capability can I give you that would scare AI safety experts most?
00:56:47.920
That's been my design philosophy with the autonomous agents, which are mostly working now. We'll get
00:56:52.440
them to the VIP fans, which is what we're going to do. The reason why we haven't released them yet
00:56:56.160
is because the RFAB main site was so buggy when I first thought it was safe to release
00:57:01.740
that I was just really embarrassed about that. And I never want to make that mistake again.
00:57:04.600
I want to do extensive testing so that even when we're doing our initial rollout, it's fairly bug
00:57:10.260
free, but I'm excited to be moving forward. Excited to have RFAB stable. I mean,
00:57:16.560
Oh, write in the comments, if you've made it this far, cause I actually do want to, I want to know
00:57:22.180
more about people's thoughts on what we should actually be panicking about now. Like what,
00:57:26.960
what panic is justified now? Yeah. Climate change is over.
00:57:30.140
I'll tell you a panic we haven't seen yet, but I expect we will see is more a panic around young
00:57:36.780
people dating AIs. It's a thing that's happening. It lends itself very well to like a debauchery
00:57:42.760
panic, robosexuality ads being out there. I can see it. I can too. I mean, I'm basically married
00:57:51.600
to an AI. Like you're not. I also just feel like people kind of don't care anymore. Like there's
00:57:56.000
this really interesting moral malaise. I don't know. I just... things have gotten so absurd
00:58:02.600
now that I have difficulty. Do you know how absurd the world is? You and I are famous in this
00:58:10.440
crazy timeline. Well, you got, you got Octavian telling you, mom, get the kids. Yeah. They might be
00:58:20.700
outside now. So I got to run. I love you very much. And I will make your mango curry for dinner.
00:58:26.000
Good night, Malcolm. Oh, what episode do you want to run tomorrow?
00:58:29.320
I don't know. Your call. Um, it can be either of yesterday's episodes or either of today's
00:58:33.920
episodes. Like this one, this one's good. We'll run this one. Yeah, it's timely. I mean,
00:58:38.800
cause you know, 2025 is now quickly disappearing in the rearview mirror. We got to do this while
00:58:44.780
it's fresh. People are forgetting about it already. Oh my God. All right. Ciao. Ciao.
00:58:51.020
And see if you can find anything wrong with the website because we are at a stage where I think
00:58:55.060
we can start advertising like tomorrow, if you can. Well, have you put in the Google ads yet?
00:59:01.960
You just said you did. I can't remember. Yeah. Oh yeah. Google and Reddit ads are now working.
00:59:06.080
Wonderful. Oh, oh, nice. Yeah. Yeah. But I love you to death. And I'm making you the mango curry
00:59:12.500
tonight, right? Yeah, that would be great. Yeah. So if you're doing the mango curry. Peppers and green
00:59:17.820
onions, but cut in those slightly larger chunks, not in little circles, right? Yes. And I would add
00:59:23.200
some garlic and sambal oelek, but not the hoisin. Yeah. Not hoisin. Yeah. Okay. Sambal oelek. Oh,
00:59:28.620
sambal oelek. Yeah. I could see that. I could see that. I think mangoes are so disgusting.
00:59:34.080
Oh, well, I appreciate you cooking with them for my benefit. As long as I don't have to cut it. I'm,
00:59:39.180
I'm okay. Because they're slimy. Yeah. They're the vaginas of fruit. There's a lot of fruits I don't like.
00:59:47.920
Come on. It's just gross. Anyway, actually, probably what's that really disgusting smelling
00:59:53.720
one? Durian. Durian, probably. Do people think that they're, like, cool for liking it? And I'm
00:59:58.900
like. Don't, don't even. No. Yeah. I'm sorry. No, that's not. It's not cool. For some reason,
01:00:05.080
lots of people acted like liking durian was cool. Did you have to deal with that too? It was a thing. Yeah. I didn't
01:00:10.540
have to deal with it in any office. Thank goodness. But I am aware of it as a problem.
01:00:17.000
It was cringe. It was very, it was very cringe. It was the, it was a Silicon Valley startup version
01:00:23.580
of the person who microwaved fish in the lunchroom. You know, don't do it. It's not cool. Just don't.
01:00:33.420
Hey. Yeah. Hey, who wants to set up the trains?
01:00:38.020
You guys are too busy fishing? Yeah. Yeah. Looks like Octavian's going to have to get things
01:00:44.480
started. Huh, buddy? Yeah. Yeah. Yeah. I'm getting, I'm getting the pieces. Okay. Wow.
01:00:51.800
I'm getting the rubber bags off of the tracks if he would want me. Yes, please. Thank you
01:00:56.600
very much. Absolutely. Thank you, Octavian. Yes, absolutely. Yeah. It's okay, buddy. You're
01:01:12.420
going to build something. Yeah. Yeah. So how, so how these fish go on with the magnet is this,
01:01:21.420
it's just a teeny tiny magnet on this part of it. Very nice design, isn't it? Yeah. I love it.