Based Camp - April 01, 2025


WIRED Provably Lied About Us & Pronatalism's Triumph Over Effective Altruism


Episode Stats

Length

47 minutes

Words per Minute

187.9

Word Count

8,906

Sentence Count

644

Misogynist Sentences

7

Hate Speech Sentences

5


Summary

We're back from NatalCon, and we dig into a new piece from Wired that gets basic facts wrong, starting with what it actually costs to attend the conference. Plus, we compare the growth of the pronatalist movement against effective altruism and AI safety.


Transcript

00:00:00.000 Hello, Simone. I'm excited to be with you here. Coming back from NatalCon. We've had pieces on us in the past few days in the BBC, in the New York Times. There's going to be a CNN one we know. There's going to be an NPR one we know that haven't come out yet. One on an Italian station, one of the two major stations of the country. But Wired did the most unhinged piece.
00:00:22.500 Which is weird. I grew up loving Wired. This is so strange.
00:00:25.380 I thought of them as like a semi-professional. Not semi-professional. I actually thought of them as like a premium.
00:00:30.600 Absolutely. Beautiful print magazine. Loved their pieces. This is weird.
00:00:35.620 Yeah, but their piece was completely both unhinged and non-factual. Like they got almost every fact wrong in ways that even your average Based Camp watcher would know.
00:00:49.340 I don't know. Even someone with an understanding of basic linguistics would know.
00:00:52.900 Yeah. We're going to read a bit of the piece and then go over statistics on what's been happening with the pronatalist movement when contrasted with other movements recently.
00:01:01.380 Because we've now significantly passed in terms of like search traffic, effective altruism, AI safety, other stuff like that, which is really cool.
00:01:08.960 The title of the Wired article is Far Right Influencers Are Hosting a 10K Per Person Matchmaking Weekend to Repopulate the Earth.
00:01:18.200 So they're claiming that it costs $10,000 to attend NatalCon. Is that true?
00:01:25.640 Well, okay. So if you're a watcher of this show, you know, we've been promoting the conference constantly for a long time.
00:01:31.220 We've been promoting it with small discounts, but, you know, the conference cost nowhere near $10,000 to go to.
00:01:38.940 I know it was expensive. I think it was around $1,000.
00:01:41.800 It was $1,000. So then $900 with our 10% discount.
00:01:45.020 Yeah. Which is a lot, I know, but they also did a really good job of making it good and fun.
00:01:50.740 Yeah.
00:01:51.340 From my perspective.
00:01:52.320 Yeah.
00:01:52.560 And I do think that that amount of money is needed to screen out crazy people.
00:01:57.760 Oh, plus also like, have you seen how much any event space charges for food? Like one cookie, $10.
00:02:05.800 And they served breakfast, lunch, and dinner on Saturday, plus snacks, plus open bar at dinner.
00:02:12.200 And then there was dinner and open bar on Friday. I just, you can't use professional venue spaces and basically do anything less than that.
00:02:20.300 I think that what people are thinking when they think of like costs or they're comparing it to something is they're not comparing, they're comparing it to a con, like a furry con or something like that.
00:02:30.460 Yeah. Where there's an artist alley, there's absolutely zero food. And then there's just some speakers and it's at a bigger event space. Yeah.
00:02:36.940 Yeah. They're, they're not thinking like, okay, they've got to pay for the speakers. They've got to pay for the hotels for the speakers. I don't think any of the speakers are paid, but they got to pay for the hotels for the speakers.
00:02:45.600 They, they sometimes need to pay for flights. They need to pay for all of the venue space. They need to pay for security. They need to pay for insurance. And then they need to pay for all the meals and the open bar.
00:02:56.220 They need to pay for security too, because there were protesters out the first night. There were lots of people trying to get in who didn't have registration.
00:03:03.300 Yeah. And it wasn't just like a normal venue. Like it was like a really nice venue. And then they had the second one where they rented out a museum. So like it, like I'd say for what it was, it seemed reasonably priced to me.
00:03:14.020 Just so people know, we donated to this to make it happen because I knew that like last year they were in the red in running it. I wanted to help them not, not risk that again. You know, Kevin Dolan took a personal hit on that. And so, but we, we weren't organizing it. You know, we weren't personally going to suffer if this didn't end up working out.
00:03:31.600 But the $15,000 thing is insane. Sorry. Yeah. The, the $10,000 in the title of the Wired article being that factually inaccurate.
00:03:44.920 Yeah. To literally have a title with that egregious of a factual error is wild. This, I just, I don't know. I, I, I knew journalism was largely bankrupt in terms of quality, but I guess I, having loved Wired so much,
00:04:00.700 I'm so, I'm still so stunned. Here's the thing I know. And it, somebody can be like, well, how can you be sure that nobody made a $10,000 ticket purchase? Like maybe at the last minute or something like that. And the, the answer is because we donated $10,000 to the conference. That's what we donated.
00:04:19.280 Yeah. I think someone else probably in the media knew that. I think we'd mentioned that to someone or it was out somewhere.
00:04:25.040 Yeah. But the point being, that was the second biggest donation to the conference. So I know they didn't get money from anyone else. Here's what I assume probably happened, because Kevin loves effing with journalists. He probably, and I think he even mentioned he did this at one point when he knew a journalist was hostile, gave them a comically high cost figure, to be like, hey, I'm okay
00:04:48.040 with having you come here and write a hit piece if you pay me $10,000.
00:04:52.060 Oh no, that's fair.
00:04:53.160 That's fair.
00:04:54.560 Okay. But, but it would mean that they didn't do any research on like the conference or anything around it. But anyway, go into the piece. Let's, let's read a bit because it gets really unhinged.
00:05:04.160 Should I just go straight to the unhinged part or should I read it?
00:05:07.200 Yeah. Go to the unhinged part.
00:05:09.040 So this is what really gets me.
00:05:13.260 Natal Conference organizer Kevin Dolan, a father of at least six, according to Politico, has previously stated that eugenics, the belief that white people are genetically superior, and the pronatalist movement are very much aligned, which is the most unhinged definition of eugenics I have ever heard.
00:05:32.660 One, in case you are one of these unhinged people: eugenics as defined, look at Wikipedia, and Wikipedia is not a conservative-dominated space. This is well documented too; people have actually looked at the backgrounds of all the, you know, active page editors.
00:05:46.940 So Wikipedia defines it as believing there are good traits and bad traits on a universal level and trying to maximize the good traits and minimize the bad traits on a societal level.
00:05:58.560 We don't agree with that, but that's what the sort of going definition is in society. And we don't support the concept of eugenics by that definition, for sure.
00:06:07.060 Some people, especially on the tech right, who talk about eugenics, and I think this is more where Kevin Dolan is, are just really going to the roots of the word, which is eu, good; genics, genes: good genes, like having an interest in passing on favorable genes and traits,
00:06:26.600 which really is a culture and context-specific thing, meaning that each group has, you know, things that they want to pass on to future generations that they should be proud of.
00:06:35.080 But they, no, they just say that eugenics is, quote, the belief that white people are genetically superior, which is.
00:06:42.900 But I would assume, like, a lot of people accuse us of.
00:06:45.360 A definition of white supremacism, right? But not.
00:06:47.800 Yeah, so I, I, and this is something that, like, presumably went through editors or something like that, right?
00:06:53.980 One would hope.
00:06:54.700 This is a definition of eugenics that I would assume a middle schooler, like your average intelligence middle schooler.
00:07:01.460 Well, even if we ran this through, like, ChatGPT or Claude and asked, hey, can you just quickly edit this for me?
00:07:09.040 That they would probably point out that a word would be.
00:07:12.320 Actually, hold on, let's do a thing.
00:07:13.900 Can you take that paragraph, copy and paste it into Claude and ask what grade level they would assume the writer is?
00:07:21.860 But while she's doing that, this is wild to me because, you know, sometimes we get called eugenicists and I have to be like, actually, we're not technically eugenicists because eugenics means that you want to coerce an entire population to do something using either the government or using laws or using social pressure.
00:07:40.500 And we believe in polygenics, which is everyone being able to make the genetic choices that they want.
00:07:45.020 So, one, we're not even eugenicists by, like, the standard definition, but I can understand how someone can make that mistake.
00:07:50.920 They hear, oh, they do genetic selection.
00:07:53.660 They understand, like, with their vague understanding of eugenics, that it means, you know, trying to make genes better in any potential way.
00:08:03.840 And I can understand that misunderstanding of the definition of eugenics if you haven't recently read the definition just based on, like, a vague understanding.
00:08:12.820 However, this misunderstanding of eugenics is so egregious for what is presumably, like, a science article thing, right, that they could make it, I'm shocked.
00:08:25.520 So, this is what Claude says when I asked it, what is the grade level of the following paragraph and does it contain any errors?
00:08:33.520 Claude says, I can analyze this paragraph for reading level and errors.
00:08:36.620 The paragraph is written at approximately a high school, ninth to tenth grade reading level.
00:08:40.460 It uses some advanced vocabulary, organizer, eugenics, genetically, and pronatalist, which is this high school vocabulary now?
00:08:48.300 Well, because, yikes, also, we need to not send our kids to high school.
00:08:53.160 Anyway, and has complex sentence structure with multiple clauses and em dashes.
00:08:58.360 There is a factual error in the paragraph.
00:09:00.480 Eugenics is incorrectly defined.
00:09:02.200 Eugenics is not specifically the belief that white people are genetically superior.
00:09:06.500 Eugenics more broadly refers to a set of beliefs and practices aimed at improving the genetic quality of the human population,
00:09:12.200 often by encouraging reproduction among people with certain traits and discouraging it amongst others.
00:09:16.440 While eugenics has historically been associated with racist ideologies, including white supremacy,
00:09:22.860 the definition as presented in the paragraph is overly narrow and inaccurate.
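For anyone who wants to run the same kind of check themselves, here is a minimal sketch using Anthropic's Python SDK. The model alias, the placeholder paragraph, and the exact prompt wording are assumptions for illustration, not what was actually used on the show.

```python
# Minimal sketch of the grade-level-and-errors check described above.
# Requires: pip install anthropic, and ANTHROPIC_API_KEY set in the environment.
import anthropic

client = anthropic.Anthropic()  # reads ANTHROPIC_API_KEY from the environment

paragraph = "…paste the paragraph from the Wired article here…"  # placeholder

message = client.messages.create(
    model="claude-3-5-sonnet-latest",  # assumed model alias; substitute whatever is current
    max_tokens=500,
    messages=[{
        "role": "user",
        "content": "What is the grade level of the following paragraph, "
                   "and does it contain any errors?\n\n" + paragraph,
    }],
)

# The first content block of the response holds the model's text answer.
print(message.content[0].text)
```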
00:09:27.900 But it's not just – it's inaccurate in a way where I would – like, this is what gets me.
00:09:32.120 This went through editors.
00:09:34.760 Yeah, well.
00:09:36.240 But what – the piece actually ended up being really good for us because some of my friends,
00:09:40.940 like Stanford GSB type people and other, like, high-profile people in Austin,
00:09:45.420 when they saw this piece about the $10,000 entrance fee, they were like, ooh, this looks – this looks fancy.
00:09:52.240 I want to – I want to know about this.
00:09:53.980 I might want to go to this.
00:09:55.940 And I think –
00:09:56.200 Yeah, it makes it sound great.
00:09:57.540 It's – it's – oh, evil behind closed doors matchmaking conference for the elite that are going to replace the future of humans.
00:10:05.960 Like, I don't know.
00:10:06.500 That sounds – sign me up.
00:10:08.300 That sounds great.
00:10:09.040 Like, it's not exactly true, but I mean.
00:10:13.420 I also wanted to do a roundup on – unless there's any other parts of this Wired article you want to read that you think are fun.
00:10:18.740 What did they say about us in it?
00:10:20.340 Oh, it just called us influencers, which is – whenever I hear influencers, I just think of, like, a makeup influencer or a Mormon influencer.
00:10:27.140 So I'm like, wow, okay.
00:10:28.040 But it is kind of sad to see, like, how many publications have sort of descended into this realm of just, like, tabloidism, that Wired would get factual errors in their title and basic errors of scientific language in their text.
00:10:43.460 Well, it's also just – there are just so many unhinged things.
00:10:47.220 Okay, no, I – okay, yeah, sorry.
00:10:48.840 There are just, like, other little details that show how it's written from a progressive perspective that's so myopically stuck in its own ideological bubble that it only sees things through a political lens.
00:11:06.080 And I mean that because the article also criticized the fact that the AT&T Hotel and Conference Center allowed NatalCon to be there in the first place.
00:11:16.160 They included a quote from a representative of UT Austin who just said, we allow anyone to host here.
00:11:24.560 We support the First Amendment right to free speech.
00:11:26.660 I remember this part.
00:11:27.480 But then they end with – the venue also hosted an Ayn Rand conference in February.
00:11:34.060 It's in, like, as you can see, they're just –
00:11:36.880 As you can see, they're out of control.
00:11:40.500 I mean, Ayn Rand, can you believe it?
00:11:42.680 It's just –
00:11:44.320 But – and they just spent the entire piece calling our movement Nazis, like, actual Nazis and white supremacists.
00:11:50.580 And then they're, like, like the equivalent movements, like Ayn Rand.
00:11:53.620 And it's clear that, to whoever wrote this and edited this, you're just not allowed to have ideas outside of their bubble.
00:12:01.340 And if you do, you are a white nationalist.
00:12:06.860 Like, that's it.
00:12:07.800 Like, you go to a conference and the conference is – you know, there were, like, people of all ethnicities there, people of all sexualities there, people of all belief systems there.
00:12:17.440 There were even – apparently there was one homophobic person, which I feel bad about because they were not mean, but they were disapproving of one of the gay couples that went.
00:12:27.120 And I felt really bad about that because they had come based on, you know, liking us.
00:12:30.500 And I'm like, you know, you're going to get that at something like this.
00:12:34.680 But from what I heard, there was only one openly homophobic individual at the event, which is saying a lot for what is apparently a Nazi white supremacist event.
00:12:44.680 But what it showed me is just, like, you can't say anything outside of their bubble.
00:12:50.940 No kidding.
00:12:51.360 But I want to go to graphs now because I want to focus on the growth of the pronatalist movement, which I have found really telling, like, which has honestly shocked me.
00:13:02.540 So the first graph I'm going to put on the screen here is the search volume on Google Trends of pronatalism versus effective altruism in the United States, where blue is pronatalism.
00:13:15.580 And as you can see, pronatalism in the past year has jumped to being maybe like five or six times, I guess, on average, the search volume of effective altruism or sort of interest online, which is wild to me.
00:13:30.000 Because, you know, growing up in the Silicon Valley area, I thought of effective altruism as sort of everything, like one of the biggest movements in the world, like the big social scene for smart people.
00:13:40.740 And if you look at a map of the United States, a heat map of where people are searching for these, I found this really funny.
00:13:47.480 For those of you listening to this as a podcast, pronatalism is blue, effective altruism is red.
00:13:54.240 Effective altruism is just California, Idaho, and then a few New England states, including, like, in New York and Massachusetts.
00:14:02.200 Yeah, and with the exception of Idaho, it seems to be really concentrated around densely urban areas.
00:14:08.300 When you look at the district area, it is clearly also the densely urban areas in California.
00:14:13.940 The rest doesn't care.
00:14:15.420 And a little bit in the Pacific Northwest and then around Michigan.
00:14:18.380 So basically, I don't know what's going on in Iowa, but...
00:14:22.100 Yeah, what's going on in Iowa?
00:14:23.420 There must be, like, a big EA thing there or something.
00:14:25.960 Yeah, or, like, a wealthy EA donor moved to Iowa and, like, did some YIMBY project there.
00:14:31.940 That's, like, my guess.
00:14:33.040 That seems like...
00:14:33.740 That would be my guess as well.
00:14:34.780 Or maybe they fund something in politics.
00:14:36.720 Well, we can ask.
00:14:37.720 I'll add this in post.
00:14:39.440 She meant Idaho, not Iowa.
00:14:41.400 Either a Grok search or a Google search could determine why it has this blip.
00:14:46.240 It might just be an anomaly.
00:14:48.560 But I found that, too, really interesting.
00:14:50.320 And then if you look at it on the global level, again, you can see that we now have moved from being, like, well below effective altruism in terms of search volume to being maybe four times larger.
00:15:02.140 And pretty reliably so for the past year or so.
00:15:06.020 And I note here that effective altruism has hundreds of millions of dollars going to it.
00:15:12.020 We got one $500,000 grant a few years ago and made all this happen.
00:15:16.960 Yeah.
00:15:17.220 And not that we're doing it on our own.
00:15:19.080 You know, like, Kevin Dolan is doing a great job, but he's not getting outside donations.
00:15:22.420 You know, when we gave him the $10,000 donation, he was like, this is a huge donation.
00:15:25.940 It means so much to me.
00:15:26.580 When I gave it, I was like, I'm sorry we can't do more.
00:15:28.400 We just don't have that much money.
00:15:29.420 And so I know that he's not getting big donations from other people.
00:15:34.200 And if you look at the global map here, this one I absolutely love.
00:15:39.440 Actually, just to make a note, I think that the biggest donation made relative to demographic collapse was the one that the Elon Musk Foundation made to the Dean Spears-led demography research at the University of Texas at Austin.
00:15:56.120 Dean Spears did not attend NatalCon.
00:15:58.240 Dean Spears has said in recent stories, basically like, oh, Elon Musk's donation hasn't affected our work in any way.
00:16:05.100 He's like trying to disassociate from Elon Musk, if anything.
00:16:08.740 So basically, like they are one of the least vocal presences in the movement's public discourse that would be associated with search volume.
00:16:18.600 And I think that really speaks volumes to what's going on here, that this is not a funded public discourse.
00:16:27.280 It's driven by some very strong personalities and some very strong concerns.
00:16:32.880 Yeah. And if you look at the global breakdown of where people are focused on each of these, sorry, actually, we should probably mention what effective altruism is for some of our viewers who may not know.
00:16:43.280 Effective altruism is a fairly large, or I thought large, like this is why I'm benchmarking ourselves off of it, movement around the Silicon Valley community.
00:16:52.940 This is what Sam Bankman-Fried was funding.
00:16:54.880 It first emerged at Oxford University, though.
00:16:57.500 Yeah, from like Sam Altman gives it funding type stuff.
00:17:02.340 Like, the guy who founded Ethereum gives it a lot of funding.
00:17:05.740 Vitalik, yeah.
00:17:06.820 Vitalik, yeah.
00:17:07.740 Pretty much if you're like an alt-productive intellectual and you are progressive coded or don't want to like offend people by being overtly conservative coded, this is where pretty much all of them have been for a long time.
00:17:22.280 And it runs a number of very large organizations.
00:17:25.980 We'll get to some of them in a second.
00:17:27.500 That they do get, I think, $400 million a year, is what I remember from the last time I checked in post.
00:17:32.740 What? That's insane.
00:17:34.900 Yes, I was right.
00:17:35.940 It's around $400 million.
00:17:37.540 In 2019, they got $416 million.
00:17:41.720 And that's the first search result that comes up when I Google how much do they get per year.
00:17:46.120 Yeah.
00:17:46.540 And then redistributes it to various charities that they see as effective.
00:17:50.620 Of course, most of them won't work with us, but we did get a grant from the Survival and Flourishing Fund, which is technically effective altruist,
00:17:56.500 which I really appreciate.
00:17:57.860 That was a game changer for us.
00:17:59.280 Yeah.
00:17:59.800 It was a game changer and it's allowed us to create this movement.
00:18:02.640 And we'll talk a bit about why I think the effective altruist money gets piddled away so much.
00:18:07.320 And they really haven't been able to make things like AI safety, which we're going to compare ourselves to in just a little bit,
00:18:12.740 take off in the way that natalism has.
00:18:15.060 Even though they spent millions of dollars on promoting it every year.
00:18:19.960 So here next, you see the worldwide map, which I found very interesting, of where the search volume is strong.
00:18:29.260 So basically across the whole world, you know, the United States, Central America, South America, natalism is winning over effective altruism.
00:18:36.920 Russia, it's winning, France and Spain and Portugal, it's winning, Eastern Europe, it's winning, India, it's winning, Africa, it's winning, Middle East, it's winning.
00:18:46.760 So where is effective altruism bigger?
00:18:49.340 Canada, one, the UK, Germany, Italy, Norway, Sweden.
00:18:54.440 China, China, where they've just given up, basically.
00:18:56.920 I found, well, you're going to see China as a trend across here, which I find really interesting.
00:19:01.540 And Australia, I found that interesting.
00:19:05.080 It also looks like Indonesia, which is weird.
00:19:07.940 I don't know what's up with that, but I assume that they don't really have a population problem.
00:19:12.160 But let's go to AI safety, right?
00:19:13.920 Like AI safety, because people can say, okay, this stuff is just getting big because the trends are in your favor, right?
00:19:19.500 It's not any effort you or, and I point out, like, we're a team.
00:19:23.900 Like, whether it's us, or Kevin Dolan, or Dan Hess, or Catherine Pakaluk, who wrote Hannah's Children, you know, we all talk, we all work to coordinate.
00:19:34.280 Or even the people at Heritage.
00:19:35.140 Yeah, and Heritage, too.
00:19:36.480 While we may advocate for very different styles of pronatalism or be excited about different policies, we are very much all fighting for the same thing.
00:19:46.720 Yeah, and if they're doing something, they email us and let us know.
00:19:49.900 And if we're doing something, we email them and let them know.
00:19:52.320 We ask their thoughts.
00:19:53.240 Like, there is really, like, an inter-community communication web here, and it is a small team, and most of the team is self-funded.
00:20:01.800 Like, Dan Hess, I don't think he's ever gotten any funding for his pronatalist work.
00:20:06.340 No, in fact, he's donated so much money of, like, his own to fund things.
00:20:12.100 Like, at NatalCon, he distributed an incredibly helpful bound fact book packed with graphs, plus a laminated one-pager just summarizing the basics.
00:20:21.300 Oh, can we create a link to that in the, in the, in the-
00:20:23.760 Oh, we can, yeah, we absolutely should, yeah.
00:20:25.140 So do check out Dan Hess, aka More Births, and his fact sheet and report on pronatalism.
00:20:30.540 It's just like, okay, if you need a primer, go here.
00:20:34.320 Well, it's like literally every statistic on it I've ever seen.
00:20:37.040 Yeah, no, it's amazing.
00:20:38.240 And, and, you know, he did that at his own expense.
00:20:40.840 He, he came to NatalCon with that, gave it away for free to everyone who went to make them more informed.
00:20:47.340 I mean, this, this man is so awesome.
00:20:50.120 And no one is paying him to do this.
00:20:51.740 He has a day job, he works full time and takes care of his family of six and helps his wife.
00:20:57.160 And he also does this somehow and is very successful at it.
00:21:00.980 We don't get any pay from the work that we do.
00:21:03.560 We also have full-time day jobs and other projects.
00:21:07.620 We've spent hundreds of thousands of dollars of our personal money on pronatalist advocacy
00:21:11.820 without ever getting any money for it.
00:21:14.540 So like, this is just like an us thing.
00:21:17.240 Like, I think the pronatalists spend a lot of money and make these types of sacrifices on this
00:21:21.700 because they do genuinely believe this stuff.
00:21:24.600 Yeah.
00:21:24.800 And this is when we're going to go into the AI safety stuff.
00:21:26.840 There's some evidence that the key people in this movement don't genuinely believe it.
00:21:31.060 And that's one of the reasons why it hasn't done as well.
00:21:32.980 But okay, so, and the China thing, I wonder why China is more effective altruist than natalist.
00:21:37.760 You think they'd be like freaked out, right?
00:21:39.880 By the way, for people who are wondering-
00:21:41.140 No, I mean, one, I think this is one of those things that,
00:21:44.000 this is actually the theme of your talk at NatalCon.
00:21:46.580 When you try to impose natalist policies on a population, it backfires, they push back,
00:21:54.080 they want nothing to do with it.
00:21:55.900 And that's what's happening in China now.
00:21:57.860 That's, it's all part of lying flat too.
00:21:59.900 You know, be productive, go have kids.
00:22:01.420 And they're like, you know what?
00:22:02.360 We're the last generation.
00:22:03.500 Thank you very much.
00:22:04.140 Goodbye.
00:22:04.960 And I think that's part of what's going on.
00:22:08.280 I'd also note that we'd mentioned this in other episodes, but you can check this out
00:22:11.520 yourself.
00:22:12.680 The, the Based Camp Discord, for example, is significantly more active.
00:22:17.100 I'd say maybe four times as active as the EA Forum.
00:22:20.280 So this isn't just like us saying like, Oh, you know, whatever our podcast is, if you go
00:22:25.640 to the podcast ranker, it does better than the best effective altruist podcast by a large
00:22:32.300 margin.
00:22:33.000 Huge shout out to everyone from the podcast and from the discord who came out either to
00:22:39.660 NatalCon and/or to our meetup before it on Friday.
00:22:43.040 It was so cool to see you guys, you know, who you are.
00:22:46.780 Like what, what a dream come true.
00:22:49.000 It was to see some of you for the first time, some of you again, it was, I just, I, we only
00:22:53.900 regret that we didn't have more time to hang out and chat, but it was great.
00:22:57.680 And it always destroys us when somebody is disappointed.
00:23:02.280 We had one person who was like, I wanted more ultra rich people here.
00:23:05.440 You know, he's like, I thought the tech elite were going to come, like all the
00:23:08.900 Founders Fund funders.
00:23:10.080 Where's Andreessen Horowitz?
00:23:12.240 Where's Founders Fund?
00:23:12.240 Where is, yeah.
00:23:12.700 And I was like, all these people have like really interesting startups that are doing well.
00:23:16.440 Is that not good enough for you?
00:23:17.740 And he's like, no, no.
00:23:19.160 So that was, that was, you know, he was the only person
00:23:23.740 I think who was disappointed in the event, but it's also because he is one of the world's
00:23:27.660 most elite people.
00:23:29.360 And so I think it's really hard to win, you know. Unless we had, like,
00:23:35.440 the presidents of five major countries present, I don't think we would have really impressed
00:23:39.540 him.
00:23:40.600 What do you think China being more effective altruist than natalist is about?
00:23:43.940 Like, that seems weird.
00:23:45.380 I think that effective altruism is the philosophy of people who are negative utilitarians.
00:23:50.880 And I think that China is a negative utilitarian country.
00:23:54.200 I think they want to end suffering.
00:23:56.280 I think they want to end their suffering.
00:23:57.620 I think they want to end their cycle.
00:23:59.620 Pronatalism is the philosophy of expansionist countries that want to grow and that want to
00:24:04.580 see more and do more.
00:24:07.020 And that's all those other countries I'm looking at.
00:24:10.700 Yeah.
00:24:11.260 Well, by the way, if you want to recreate this, the way I did this is every time I did this,
00:24:15.240 I tried to compare like to like.
00:24:16.880 So here we have natalism as a belief compared to effective altruism as a cause area.
00:24:23.060 And then in these next graphs, we're looking at natalism as a belief compared to AI safety
00:24:27.420 field of study.
00:24:28.580 Yeah.
00:24:28.800 Because you don't want to compare search terms because there could be misspellings or something
00:24:32.400 else weird like that.
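If you want to recreate these comparisons programmatically rather than through the Google Trends website, here is a rough sketch using the unofficial pytrends library. The topic lookup via suggestions() and the 12-month timeframe are assumptions for illustration; this is not the exact query behind the graphs shown in the episode.

```python
# Rough sketch of a like-to-like Google Trends comparison (topics, not raw search
# terms), using the unofficial pytrends library: pip install pytrends
from pytrends.request import TrendReq

pytrends = TrendReq(hl="en-US", tz=360)

def topic_id(term: str) -> str:
    """Return the first Google Trends topic ID suggested for a term, so we compare
    topics (e.g. a belief or field of study) rather than raw keyword strings."""
    suggestions = pytrends.suggestions(term)
    return suggestions[0]["mid"] if suggestions else term  # fall back to the raw term

keywords = [topic_id("natalism"), topic_id("effective altruism")]

# Worldwide relative interest over the past 12 months (values are scaled 0-100).
pytrends.build_payload(kw_list=keywords, timeframe="today 12-m", geo="")
over_time = pytrends.interest_over_time()
by_country = pytrends.interest_by_region(resolution="COUNTRY")

print(over_time.tail())
print(by_country.sort_values(keywords[0], ascending=False).head(10))
```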
00:24:34.740 So the first graph here I'm putting up is in the world.
00:24:38.180 And you can see that in the last year, natalism has soundly beat, but pretty narrowly as
00:24:43.140 well, AI safety as something people are searching for.
00:24:46.500 You can see Eliezer Yudkowsky's world tour as a big spike in AI safety.
00:24:50.200 And you can see our first virality around natalism as a big spike in natalism.
00:24:53.900 Here's a graph of this on the world stage, but it gets more interesting if you look at
00:24:59.000 a map of the countries.
00:25:00.940 That's this next graph of countries here.
00:25:03.480 Natalism is everywhere in the world bigger than AI safetyism, except for England and China,
00:25:12.240 which is really interesting.
00:25:15.540 What if anyone needs to be freaking out about natalism?
00:25:18.380 It's China, but this is where we are.
00:25:20.060 Well, I think that's part of the issue, though, right, is that China has a problem because
00:25:26.980 they don't care.
00:25:28.780 They're not freaking out about it.
00:25:30.340 If someone's freaking out about it, some of them are at least trying.
00:25:34.680 And I think that in China, only the CCP is trying.
00:25:39.760 Yeah.
00:25:40.380 And here we've got Quiverfull versus natalism.
00:25:43.060 I thought this was interesting because I wanted to look at the more traditional Christian thing.
00:25:46.880 Quiverfull's in red, natalism's in blue, and you can see Quiverfull has basically entirely died.
00:25:51.420 And we're going to do a separate episode on how the Quiverfull movement died.
00:25:54.720 I really want to do that.
00:25:55.780 Yes.
00:25:57.340 But yeah, natalism is now maybe a hundred times bigger than Quiverfull is, in terms of
00:26:02.800 a belief system.
00:26:03.780 I remember when that was, you just couldn't not hear about it.
00:26:06.420 It was such a thing.
00:26:07.260 And we look now at Quiverfull versus pronatalist as search terms, because before I was trying
00:26:14.560 to compare like to like, so I was like, okay, let's see if we're looking at them as search
00:26:18.040 terms, and Quiverfull as a belief system, natalism as a belief system.
00:26:21.120 You can see here, natalism has overtaken it pretty soundly, but not as soundly, maybe
00:26:25.960 like 4x bigger these days, and really only started regularly beating it, I'd say, this
00:26:33.040 year, which is pretty, this last year, like 2024.
00:26:37.180 Year of tipping points.
00:26:39.320 And now I want to talk about like why AI safety has done such a bad job at this.
00:26:43.640 So this last graph I had: earlier this morning, you searched and you found Eliezer Yudkowsky being
00:26:48.040 dwarfed by us.
00:26:49.820 I assume it's because you misspelled his name, because in reality, he's still slightly edging
00:26:55.040 us out in terms of public attention.
00:26:58.260 We did that search, by the way, because on our flight back from Austin, both of us watched
00:27:01.500 simultaneously on YouTube, a Strange Aeons video on Harry Potter and the Methods of Rationality,
00:27:08.060 the Harry Potter fan fiction that Eliezer Yudkowsky wrote.
00:27:11.480 It's just so much fun, speaking of effective altruism, speaking of Eliezer Yudkowsky, to see
00:27:17.340 someone from completely outside that community learn about it and freak out.
00:27:24.180 And study it and be like, wait, what's going on here?
00:27:26.180 Early in our relationship, I was listening to the audiobook of the Methods of Rationality.
00:27:31.480 And you got into it a little bit.
00:27:33.400 We both stopped it because it was like really...
00:27:35.440 You made it further than I did.
00:27:36.940 It just got so repetitive.
00:27:38.480 And it didn't deliver on its value proposition.
00:27:40.380 I wanted to see the proposed science or reasoning behind magic, because that was Harry...
00:27:48.700 Well, in this fan universe, Harry Potter is this child of like a brilliant engineer.
00:27:54.100 He's a genius.
00:27:54.960 And he gets into Hogwarts and says, I'm going to find out the science behind this and then
00:28:00.720 use my knowledge to save humanity and expand throughout the universe.
00:28:06.240 And I mean, it sounds great.
00:28:07.740 You know, you hear this and you're like, this is a great premise.
00:28:09.900 Like, oh my gosh.
00:28:10.720 And he, you know, like as soon as he gets his Gringotts account, he's like, I can play
00:28:13.820 arbitrage with this.
00:28:14.700 Like, it starts out so promising.
00:28:17.100 And then it just...
00:28:18.100 Yeah.
00:28:18.180 Anyway, definitely check out the Strange Aeons video because she's also hilarious and wonderful.
00:28:23.220 But I found a number of things really telling.
00:28:26.120 And I think it's likely Eliezer Yudkowsky's influence as sort of the face of AI safety.
00:28:32.000 I mean, he's sort of the mirroring face to us.
00:28:34.380 If we're the face of pronatalism, whether the pronatalists like it or not...
00:28:38.620 It's got to be him.
00:28:39.580 Yeah.
00:28:40.040 He's the face of AI safety.
00:28:42.420 Yeah.
00:28:42.820 Fun thing.
00:28:43.480 So I asked Grok if it could pair any anime character to Malcolm.
00:28:48.760 Like, which character is Malcolm Collins most like?
00:28:53.000 First, it said Leluke from Code Geass, which I absolutely love.
00:28:57.660 And Simone, I told Simone this.
00:28:58.920 And she goes, oh, that's no surprise.
00:29:00.900 You like his character only because he reminds you of yourself.
00:29:04.540 And I'm like, okay, whatever.
00:29:05.940 But then the next one was, it said Light Yagami from Death Note.
00:29:10.400 And I was like, okay, I like that too.
00:29:11.860 And when I asked it, what character would you pair Eliezer Yudkowsky with?
00:29:15.600 It said L.
00:29:16.900 And Simone got annoyed at that because she's like, but he's incompetent.
00:29:20.100 But he's also slovenly and an antagonist, which works for me.
00:29:25.140 And he got into this very differently.
00:29:27.660 One of the things she noted is that while he was writing Harry Potter and the Methods of Rationality,
00:29:32.300 he would regularly tell people, oh, I'll put out more chapters or faster chapters if you give more money to our nonprofit.
00:29:38.480 Well, this is because, I think, he co-founded MIRI, originally the Singularity Institute,
00:29:44.140 which was supposed to do research on AI stuff, which would also explain why he became one of the early AI Doomer people.
00:29:50.520 And I think because people have dunked on MIRI for not being very productive,
00:29:54.720 that among other things, Eliezer was not contributing a whole lot because he was just messing around and writing a fan fiction.
00:30:01.960 And so I think he felt pressured by the team to write a fan, or sorry, to use his fan fiction to shill.
00:30:08.460 I don't think that's it.
00:30:09.860 You don't think so?
00:30:10.400 They weren't like, dude, you're not doing anything.
00:30:11.980 At least make us money.
00:30:12.700 No, I think what you and other people don't understand is a nonprofit, especially if you founded and owned the nonprofit,
00:30:19.100 can sometimes just be a thing that pays you a salary.
00:30:21.860 Well, yeah, and I think he was getting a salary from MIRI, not doing anything,
00:30:24.660 and the other MIRI team members were giving him shade for that.
00:30:26.820 I don't even know if MIRI had a board back then, Simone.
00:30:29.860 My read.
00:30:30.420 Yeah, but he still, he co-founded it, and he still had co-founders, and I think he got shade for it.
00:30:34.580 But my read is what was happening is he was begging for money, and most of that money just paid for his lifestyle.
00:30:43.640 I don't think that much of the money at all was going to AI research.
00:30:49.040 Well, it wasn't, because MIRI is famously unproductive.
00:30:53.620 She mentioned that in the Strange Aeons video, that they're known for not getting a lot of papers out.
00:30:58.620 They're known for not getting a lot of real research out, which to me further indicates what I think,
00:31:03.520 which is that it's literally just his GoFundMe account for his lifestyle through the lens of a nonprofit
00:31:11.020 because technically he's getting the word out about rationality,
00:31:14.100 because technically he's getting the word out about AI safety.
00:31:16.560 I would bet that during that period, if you look at it,
00:31:20.520 I would bet over 50% of the money that they raised was just paid out to him in salary.
00:31:25.280 Oh, my gosh.
00:31:26.420 That's what I think was happening there.
00:31:28.080 I wonder, ooh, hold on.
00:31:28.760 So we, our nonprofit, posts its financials publicly.
00:31:32.800 Oh, go to Grok and see if you can find out how much of it went to his personal salary during that period.
00:31:39.240 And what you'll see if you look at our nonprofit's financials is that we take no money from the nonprofit.
00:31:44.480 And when I say we put tons of money towards our nonprofit, there was a Guardian piece, I want to say, a few years ago.
00:31:50.760 It might have been a Telegraph piece where they actually audited our finances to see how much money we were putting into our nonprofit.
00:31:55.420 This was before we got the big grant.
00:31:57.780 And after that, we stopped doing this.
00:31:59.520 But before that, we were putting 45% of our yearly income into our nonprofit, which is a lot when you have, I think back then it was like three kids where we're just like, no, we absolutely have to make this work.
00:32:11.340 So what you see within the-
00:32:12.580 It doesn't look like they make their financials public, which-
00:32:15.320 That makes sense.
00:32:16.220 You have a complete inverse of what's happening within the AI safety community.
00:32:21.520 Oh, my gosh.
00:32:22.260 Their total revenues fluctuated between $1.5 million and $3 million annually.
00:32:28.220 What?
00:32:29.240 How have we managed to make AIC-
00:32:31.680 With expenses including salaries for its small team of researchers and staff.
00:32:35.560 So in 2019, MIRI reported total expenses of around $2.4 million with roughly $1.5 million allocated to salaries for its 20 to 30 employees.
00:32:46.860 So if I take 1.5 million divided by 30, that's $50,000.
00:32:56.060 And I bet some were making more than others.
00:33:00.100 Yeah.
00:33:00.880 Especially if it was just 20 employees.
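As a quick sanity check on the back-of-the-envelope math above, here are the averages implied at both ends of the quoted headcount range (the $1.5 million salary figure and the 20-to-30 employee count are the numbers read out above, not independently verified):

```python
# Average salary implied by the figures quoted above (a rough check, not audited data).
salary_pool = 1_500_000  # reported salary spend, USD
for headcount in (30, 20):
    print(f"{headcount} employees -> about ${salary_pool / headcount:,.0f} each on average")
# 30 employees -> about $50,000 each on average
# 20 employees -> about $75,000 each on average
```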
00:33:03.220 So, oh, my gosh.
00:33:05.160 That is my bet about what's going on with that.
00:33:08.280 Well, and keep in mind, now I'd point this out.
00:33:09.860 Now contrast this with natalism.
00:33:11.640 I'm trying to think of any natalists who make money off of their work with natalism.
00:33:16.940 Literally just Lyman Stone from what I'm aware of.
00:33:20.860 And maybe-
00:33:20.920 I mean, but also like not that much because Lyman Stone has a day job as a demographer that like literally helps very large corporations determine how many diapers to produce.
00:33:31.860 Like he has to work.
00:33:34.000 And then he also does a lot of work raising money for the Institute for Family Studies, which I'm sure, yes, of course, pays him as well.
00:33:43.420 But he is piecing this together.
00:33:46.020 He has a side gig.
00:33:47.340 Yeah, he has a secondary job, which I think is his primary income.
00:33:50.260 Well, and I think maybe the Institute for Family Studies is his side gig, but he also has to raise money for them.
00:33:55.040 So like it's not – yeah, anyway.
00:33:57.040 Catherine Pakaluk, her day job isn't, like, being an author.
00:33:59.360 Catherine Pakaluk is a professor of economics.
00:34:01.360 So she has a full-time day job.
00:34:02.520 And she worked with a publisher.
00:34:05.320 I mean, I think her book has been very successful, but a successful book these days, especially if you worked with a publisher, is not making you a lot of money.
00:34:12.740 So I don't think she's getting any material well-
00:34:15.040 If you're in the top 1%, and you should see our video on how nobody reads anymore, a book can earn you 20K.
00:34:20.820 Like even if you're in the top 1%.
00:34:22.760 Yeah.
00:34:23.420 Yeah.
00:34:24.040 So like I don't think that that's a significant income source for her.
00:34:27.280 And this matters a lot when you're thinking about this stuff because it – now contrast it with the AI safety people.
00:34:33.920 Every one of the top AI safety people and top effective altruists I know –
00:34:36.960 Making 50 to 100K at least.
00:34:39.500 Making 50 to 100K at least.
00:34:41.600 Their main source of income is donations.
00:34:44.560 And you even saw people like – I think it was like Will MacAskill or some of the other people who were running, like, major effective altruism organizations,
00:34:52.800 write pieces a few years back, especially when Sam Bankman-Fried was big, about why people running these organizations should be able to indulge in luxuries.
00:35:02.960 Remember they did things like bought a castle at one point.
00:35:05.940 They – this is one of the major EA groups.
00:35:07.960 Yeah, they bought a castle.
00:35:09.400 They bought a college at Oxford and they were going to turn it into – or Cambridge.
00:35:13.260 I can't remember which.
00:35:14.300 And they were going to try to turn it into like a new college.
00:35:16.620 That's cool.
00:35:17.360 And they did that for sex parties.
00:35:18.740 What?
00:35:19.160 Oh, God.
00:35:20.780 Well, this is what they said.
00:35:22.020 That came – whatever school this was.
00:35:25.000 I don't remember which one it was.
00:35:25.880 But I remember doing a tour and I was like, whatever happened to that?
00:35:28.040 And somebody was like, oh, yeah.
00:35:28.960 It became known for sex parties.
00:35:30.760 Well, I mean, it was college.
00:35:33.640 Well, and effective altruist.
00:35:36.160 Well, yeah, but I mean like rationalism and effective altruism as we –
00:35:40.220 You should see our episode on the EA to sex worker pipeline if you want to learn more about the lifestyles that are common within this community.
00:35:47.240 But I think through action, what we see between these two communities is one community that really believes what they're saying and working on and another community where it's a performative way to make an income.
00:36:00.400 Yeah.
00:36:00.960 And I think that if you are freaked out about like AI safety or something like that –
00:36:06.060 Well, and also like who's reading these reports that they're making?
00:36:09.100 Who's reading these studies?
00:36:09.800 No, but look at who's putting their money where their mouth is of these two movements.
00:36:15.860 It was – contrast Eliezer Yudkowsky begging, you know, with the update he made to the book, you know, and putting out chapters at a whatever pace, to us, who do daily videos every weekday at 8 a.m. EST on this channel.
00:36:34.160 Contrast that.
00:36:35.400 It's a wild difference.
00:36:37.560 And we don't ever – I don't think we've ever really begged.
00:36:39.900 We've said we would love it if you put us in your will or something, put the nonprofit in your will.
00:36:45.260 And if you do, this is the promise we have, by the way.
00:36:48.180 No, no.
00:36:48.500 That's if you put techno – the Technopuritan Federation.
00:36:52.300 Oh, yeah.
00:36:52.560 This is Technopuritan.
00:36:53.460 So not even this.
00:36:54.020 But if you put Technopuritan in your will and it ends up becoming a big thing and you send us your genes, we will put those genes along with like the amount that you contributed in case it does end up becoming a big thing and ends up creating like artificial realities and ends up needing resources on that.
00:37:09.900 Or it starts cloning people in the future or doing something, so that you might be one of the resurrected people.
00:37:16.180 I feel like that's a fun little thing you can do.
00:37:19.000 But literally no one has told us they're doing that.
00:37:21.520 So like it's not like we're getting anything from that.
00:37:24.060 And in terms of like active donations, the main ones we get are hate donations for when people get pissed off by progressives and they live in a more progressive environment.
00:37:33.860 Two people donate $10 a month minus PayPal fees.
00:37:38.380 Because it's like $9.50.
00:37:41.200 Oh, they do.
00:37:41.960 That's really nice.
00:37:42.740 So we make $20 a month.
00:37:46.220 For daily, like 45-minute long episodes.
00:37:49.720 But no, I mean like to those who don't – that is like really genuinely kind.
00:37:53.540 We appreciate that.
00:37:54.520 No, that is genuinely kind.
00:37:56.260 I really appreciate it.
00:37:57.900 And the money does – like as I've pointed out, you are getting much more out of that than you're getting out of like effective altruism in what they're doing, right?
00:38:05.440 Like, we have so little and yet we've been able to – at the conference, the other thing that really got me at the conference is, I've been to effective altruism conferences.
00:38:13.020 This was one of the most competently organized conferences I've ever been to.
00:38:16.520 Shout out to Luke and Catern.
00:38:18.300 I mean, oh my gosh.
00:38:19.260 The quality of the food, first of all, was like good quality food.
00:38:23.160 There were many corn dogs.
00:38:24.340 It was really like nice.
00:38:27.240 The people I met there were – you know, I had one person who emailed after this and they're like, it exceeded all my expectations.
00:38:33.440 This is somebody who's living in Austin.
00:38:35.080 You know, people were coming in from all around the world.
00:38:37.700 I was so excited to talk to them.
00:38:38.820 Yeah, Hong Kong, Singapore.
00:38:40.100 It was pretty wild.
00:38:41.160 Yeah.
00:38:42.380 Yeah.
00:38:42.820 And we had a lot of like really amazing speakers and we didn't have – last time like people got in a fight, like I didn't get in a fight with any Catholics this time, you know, which was really nice.
00:38:54.980 This is over IVF versus, you know, non-IVF stuff.
00:38:58.560 But yeah, I think that it was really like a victory lap for the pronatalist movement, which was really fun.
00:39:05.000 Yeah.
00:39:05.120 Do you have any final thoughts on this or the Wired piece or the recent press we've been getting?
00:39:12.720 I think that people who want to see this movement as evil are going to continue to do so, but it seems to be only to our benefit.
00:39:21.300 And I am just really happy with how the conference went and how the movement is growing.
00:39:25.360 And it gives me hope for the future because a lot of the people at the conference were really – like there was not a lot of cultural overlap, and it made me fear less for a cultural mass extinction, which is something that we're really afraid of.
00:39:39.140 Like we want a lot of different opinions in the future.
00:39:41.940 And NatalCon felt like anything but a monoculture, which was really cool.
00:39:48.320 Yeah, it was really interesting.
00:39:50.640 These are not small differences.
00:39:52.240 This is us thinking they're murdering babies.
00:39:54.100 They think we're murdering babies and yet we get along.
00:39:57.600 Like it is absolutely wild that we're able to get along, and progressives have, like, the smallest difference and they go knives out at each other.
00:40:07.100 And the conference was really good.
00:40:08.420 One of the important things, like if you go to this conference and you're like, okay, I maybe met some people or whatever, but did the world really get anything out of this conference?
00:40:16.280 And what I would note is I think the biggest impact of the conference isn't like us having people at it or whatever.
00:40:21.480 However, it is because you guys go and because you make this happen, you make this like a really packed event – I mean packed.
00:40:29.740 Like this conference was large, like much larger than I anticipated.
00:40:33.840 And we then are able to have lots of reporters there.
00:40:36.880 And we didn't have all the reporters as we mentioned, like it was a wire, but we did have a lot.
00:40:39.680 We had New York Times, we had CNN, we had NPR, we had Mother Jones.
00:40:42.960 Like, oh no, Mother Jones was kept out, but they did a great piece.
00:40:45.520 I like their piece.
00:40:46.920 But anyway, lots of reporters there.
00:40:49.880 They – the reporters only cover the movement when the conference happens.
00:40:54.100 Like they need a forcing function to cover the movement.
00:40:56.720 And we can go out and they can do the umpteenth profile on us, but, you know, that's going to die down after a while.
00:41:03.680 They need something new to be scared about.
00:41:06.220 And you guys making this conference happen, going there, talking with reporters, you all came together and gave them something to be scared about.
00:41:15.740 And because of that, every one of you who contributed to this, you contributed to this press cycle that's happening right now.
00:41:21.760 And this press cycle that's happening right now is going to give us a second boost, is going to help us move things forward, and is going to help keep this conversation centered, in a way that doesn't happen when people are afraid of controversy or afraid of doing new things. Like the, you know, big effective altruism gala that happens every year, which I'm sure they spend tons of money on, you know, that doesn't end up getting any press.
00:41:48.240 It doesn't end up moving any idea forward.
00:41:50.600 It doesn't end up doing anything meaningful for the world.
00:41:54.460 But just by being at an event like this, you guys did a lot.
00:41:57.840 And I really, really deeply appreciate that.
00:42:01.200 In fact, I'd say, like, if you're like, oh, you know, the money that I paid for this, like $900 or whatever, right?
00:42:08.300 Would you have rather I just give you two $900 or $1,000?
00:42:13.840 Categorically, I would rather it have gone to the conference, given the amount of press that the conference has generated.
00:42:18.940 Like, that's what I would have paid.
00:42:21.340 I'd probably pay, what, $50,000 for a press cycle this big?
00:42:25.280 And I think that that's probably about what went in from our fans.
00:42:28.080 And that's amazing.
00:42:28.960 It's worth even more than that.
00:42:30.000 Like, a PR person.
00:42:31.740 I could pay a PR firm hundreds of thousands of dollars and not get a press cycle this big.
00:42:35.980 Oh, 100%.
00:42:36.800 And that is huge, because that's what we need to keep this centered in the public mind.
00:42:44.380 But it's also why, you know, if you look at our video on hard EA versus soft EA, we recently started hard EA.com, which is meant to sort of bring funding to some of the other underfunded cause areas where people are founding companies in those spaces.
00:42:57.920 Right now, we've drained the amount of money that we had put in for it.
00:43:01.040 So if anybody wants to, like, put a big grant in for that so we can do something big, let us know.
00:43:05.560 But right now, we've invested in, like, 10 companies, like 10 to 40K each, which I'm really excited about because I'm excited about a lot of the technology that they're bringing.
00:43:15.040 But also, it's an investment and not a donation, which means if any of these companies does well, then we'll have even more money to put into this type of thing.
00:43:21.100 Into future companies.
00:43:22.100 Which is something I'm really excited about.
00:43:22.760 The gift that keeps on giving.
00:43:25.380 Yeah.
00:43:26.600 So thank you all for making this happen.
00:43:29.280 You know, thank you to Kevin Dolan for – and Luke.
00:43:33.020 I mean, Luke seems to be the really competent operator that made this happen.
00:43:36.840 But yeah, this was wild.
00:43:39.340 Yeah, you guys are amazing.
00:43:40.480 And also Kevin Dolan's wife and kids who were there.
00:43:45.260 Oh, my gosh.
00:43:45.800 They were so cute.
00:43:46.920 But yeah, anyway, I love you and I love natalism.
00:43:49.220 I love NatalCon and it was really good to see all y'all who showed up.
00:43:52.900 So thanks for coming.
00:43:54.180 Yeah.
00:43:54.900 One funny anecdote I'll give because this is actually something that I was talking to, like, just last week I was talking to one of the, like, leading people of the Effective Altruist Movement.
00:44:03.940 And I was asking them, like, how can we work together?
00:44:07.260 And they just didn't seem to understand. Like, I was like, look, I've got Hard EA here.
00:44:13.580 I am open to working with you guys.
00:44:15.860 Like, we have a bigger movement in terms of, like, members and focus now.
00:44:20.520 You know, we've got people in the White House.
00:44:23.660 You guys have been begging for that for years.
00:44:26.140 You guys have been spending hundreds of millions of dollars.
00:44:28.380 Like, if you guys can throw a little money our way, we're going to be able to throw a little attention your way and help some of your projects or projects that at least we can both agree on come to fruition.
00:44:41.700 And they basically didn't understand why they were like, I don't understand.
00:44:47.040 And, like, how or why we would work together.
00:44:49.900 And it helped me understand why the movement's gotten as bad as it has.
00:44:53.560 Specifically, what they pointed out was that the very, very large EA orgs that get all this money, like OpenPhil, they're getting money, huge amounts of money from lots of donors.
00:45:06.220 And they have to be afraid of doing anything that would anger any of the donors.
00:45:11.000 That makes sense.
00:45:11.540 And so they end up taking the most conservative, boring approach.
00:45:16.960 But a conservative approach like that means they're donating to things that would be raising money anyway.
00:45:21.420 And so the money is just completely pissed away.
00:45:24.260 Which runs against the fundamental principles of effective altruism.
00:45:28.160 Because in addition to looking at the severity of a problem and how tractable it is, how feasible it is to actually address it, you're looking at how much attention and help it already has.
00:45:38.240 And you don't want to focus on things that already have help.
00:45:41.900 You want to focus on the things that are overlooked, which is one of the reasons why Malcolm chose to address demographic collapse.
00:45:47.280 Oh, I'm so sorry.
00:45:49.360 So, come on, man.
00:45:51.380 Love you to death, Simone.
00:45:52.560 Love you, too.
00:45:54.200 Octavian, what do you want to give Mommy?
00:45:56.460 I love this flower right here.
00:46:01.260 Why do you want to give her that flower?
00:46:02.840 Because I love Mommy.
00:46:04.640 I love different flowers.
00:46:06.560 Mommy all the time, because I love her.
00:46:10.300 Okay.
00:46:11.020 Hey, Titan, where are you going?
00:46:14.600 Can I see the video?
00:46:17.680 What did you see today?
00:46:20.400 I've been having fun.
00:46:22.360 I can't go outside.
00:46:24.560 You've been having fun?
00:46:25.800 Yeah.
00:46:26.460 What were you doing?
00:46:27.440 I was playing with rocks.
00:46:29.840 You're playing with rocks?
00:46:31.080 Yeah, rocks.
00:46:32.160 At where?
00:46:32.640 At the creek?
00:46:33.540 Yeah.
00:46:33.860 Hey, who's the rock?
00:46:36.560 And, and, um, did you see a snake?
00:46:40.580 Uh, yeah.
00:46:42.020 How did you?
00:46:42.600 It was very, with Mommy.
00:46:44.140 It was very scary.
00:46:46.240 And then you kept everyone safe?
00:46:48.460 Yeah.
00:46:49.100 Mom, Daddy, look what I found!
00:46:51.920 What is that?
00:46:53.240 I think I just saw it in the ground.
00:46:56.400 It's a big...
00:46:57.600 It's a drop there.
00:46:59.680 Do not tap the window with rocks.
00:47:01.800 No!
00:47:02.380 Oh, that's...
00:47:03.800 No!
00:47:04.000 Oh!
00:47:04.320 That's it.
00:47:04.840 That's it.
00:47:20.980 That's just...
00:47:23.120 That's...