We're back from NatalCon, and we've got a new piece from Wired claiming it costs $10K per person to attend the conference. We break down what it actually cost and why it's not as expensive as you might think. Plus, we talk about the pros and cons of NatalCon.
00:00:00.000Hello, Simone. I'm excited to be with you here, coming back from NatalCon. We've had pieces on us in the past few days in the BBC and in the New York Times. We know there's going to be a CNN one and an NPR one that haven't come out yet, plus one on an Italian station, one of the two major stations of the country. But Wired did the most unhinged piece.
00:00:22.500Which is weird. I grew up loving Wired. This is so strange.
00:00:25.380I thought of them as, like, semi-professional. Not semi-professional; I actually thought of them as, like, premium.
00:00:30.600Absolutely. Beautiful print magazine. Loved their pieces. This is weird.
00:00:35.620Yeah, but their piece was completely both unhinged and non-factual. Like, they got almost every fact wrong, in ways that even your average Based Camp watcher would know.
00:00:49.340I don't know. Even anyone with an understanding of basic linguistics would know.
00:00:52.900Yeah. We're going to read a bit of the piece and then go over statistics on what's been happening with the pronatalist movement when contrasted with other movements recently.
00:01:01.380Because we've now significantly passed in terms of like search traffic, effective altruism, AI safety, other stuff like that, which is really cool.
00:01:08.960The title of the Wired article is Far Right Influencers Are Hosting a 10K Per Person Matchmaking Weekend to Repopulate the Earth.
00:01:18.200So they're claiming that it costs $10,000 to attend NatalCon. Is that true?
00:01:25.640Well, okay. So if you're a watcher of this show, you know, we've been promoting the conference constantly for a long time.
00:01:31.220We've been promoting it with small discounts, but that also means, you know, the conference cost nowhere near $10,000 to go to.
00:01:38.940I know it was expensive. I think it was around $1,000.
00:01:41.800It was $1,000. So then $900 with our 10% discount.
00:01:45.020Yeah. Which is a lot, I know, but they also did a really good job of making it good and fun.
00:01:52.560And I do think that that amount of money is needed to screen out crazy people.
00:01:57.760Oh, plus also like, have you seen how much any event space charges for food? Like one cookie, $10.
00:02:05.800And they served breakfast, lunch, and dinner on Saturday, plus snacks, plus open bar at dinner.
00:02:12.200And then there was dinner and open bar on Friday. I just, you can't use professional venue spaces and basically do anything less than that.
00:02:20.300I think that when people think about the cost, or compare it to something, they're comparing it to a con, like a furry con or something like that.
00:02:30.460Yeah. Where there's an artist alley, there's absolutely zero food. And then there's just some speakers and it's at a bigger event space. Yeah.
00:02:36.940Yeah, they're not thinking, okay, they've got to pay for the speakers, they've got to pay for the hotels for the speakers. I don't think any of the speakers were paid, but they've got to pay for the hotels for the speakers.
00:02:45.600They, they sometimes need to pay for flights. They need to pay for all of the venue space. They need to pay for security. They need to pay for insurance. And then they need to pay for all the meals and the open bar.
00:02:56.220They need to pay for security too, because there were protesters out the first night. There were lots of people trying to get in who didn't have registration.
00:03:03.300Yeah. And it wasn't just like a normal venue. Like it was like a really nice venue. And then they had the second one where they rented out a museum. So like it, like I'd say for what it was, it seemed reasonably priced to me.
00:03:14.020Just so people know, we donated to this to make it happen because I knew that like last year they were in the red in running it. I wanted to help them not, not risk that again. You know, Kevin Dolan took a personal hit on that. And so, but we, we weren't organizing it. You know, we weren't personally going to suffer if this didn't end up working out.
00:03:31.600But the $15,000 thing, sorry, the $10,000 in the title of the Wired article being that factually inaccurate is insane.
00:03:44.920Yeah. To literally have a title with that egregious of a factual error is wild. I just, I don't know. I knew journalism was largely bankrupt in terms of quality, but I guess, having loved Wired so much,
00:04:00.700I'm so, I'm still so stunned. Here's the thing I know. And it, somebody can be like, well, how can you be sure that nobody made a $10,000 ticket purchase? Like maybe at the last minute or something like that. And the, the answer is because we donated $10,000 to the conference. That's what we donated.
00:04:19.280Yeah. I think someone else probably in the media knew that. I think we'd mentioned that to someone or it was out somewhere.
00:04:25.040Yeah. But the point is, that was the second biggest donation to the conference, so I know they didn't get money from anyone else. Here's what I assume probably happened, because Kevin loves effing with journalists. He probably, and I think he even mentioned he did this at one point when he knew a journalist was hostile, gave them a comically high cost figure, to be like, hey, I'm okay
00:04:48.040with you coming here and writing a hit piece if you pay me $10,000.
00:04:54.560Okay. But, but it would mean that they didn't do any research on like the conference or anything around it. But anyway, go into the piece. Let's, let's read a bit because it gets really unhinged.
00:05:04.160Should I just go straight to the unhinged part or should I read it?
00:05:13.260Natal conference organizer, Kevin Dolan, a father of at least six, according to Politico, has previously stated that eugenics, the belief that white people are genetically superior and the pronatalist movement are very much aligned, which is the most unhinged definition of eugenics I have ever heard.
00:05:32.660One, in case you are one of these unhinged people, for eugenics as defined, look at Wikipedia, and Wikipedia is not a conservative-dominated space. This is well documented, too; people have actually looked at the backgrounds of, you know, all the active page editors.
00:05:46.940So Wikipedia defines it as believing there are good traits and bad traits on a universal level and trying to maximize the good traits and minimize the bad traits on a societal level.
00:05:58.560We don't agree with that, but that's the going definition in society. And we don't support the concept of eugenics by that definition, for sure.
00:06:07.060Some people, especially on the tech right, who talk about eugenics, and I think this is more where Kevin Dolan is, are just really going to the roots of the word, which is eu, good, and genics, genes: good genes, like having an interest in passing on favorable genes and traits,
00:06:26.600which really is a culture and context-specific thing, meaning that each group has, you know, things that they want to pass on to future generations that they should be proud of.
00:06:35.080But they, no, they just say that eugenics is, quote, the belief that white people are genetically superior, which is.
00:06:42.900But I would assume, like, a lot of people accuse us of.
00:06:45.360A definition of white supremacism, right? But not.
00:06:47.800Yeah, so I, I, and this is something that, like, presumably went through editors or something like that, right?
00:07:13.900Can you take that paragraph, copy and paste it into Claude and ask what grade level they would assume the writer is?
00:07:21.860But while she's doing that, this is wild to me because, you know, sometimes we get called eugenicists and I have to be like, actually, we're not technically eugenicists because eugenics means that you want to coerce an entire population to do something using either the government or using laws or using social pressure.
00:07:40.500And we believe in polygenics, which is everyone being able to make the genetic choices that they want.
00:07:45.020So, one, we're not even eugenicists by, like, the standard definition, but I can understand how someone can make that mistake.
00:07:50.920They hear, oh, they do genetic selection.
00:07:53.660They understand it through their vague understanding of eugenics, that it means, you know, trying to make genes better in any potential way.
00:08:03.840And I can understand that misunderstanding of the definition of eugenics if you haven't recently read the definition just based on, like, a vague understanding.
00:08:12.820However, this misunderstanding of eugenics is so egregious for what is presumably, like, a science publication that I'm shocked they could make it.
00:08:25.520So, this is what Claude says when I asked it, what is the grade level of the following paragraph and does it contain any errors?
00:08:33.520Claude says, I can analyze this paragraph for reading level and errors.
00:08:36.620The paragraph is written at approximately a high school, ninth to tenth grade reading level.
00:08:40.460It uses some advanced vocabulary, organizer, eugenics, genetically, and pronatalist, which is this high school vocabulary now?
00:08:48.300Well, because, yikes, also, we need to not send our kids to high school.
00:08:53.160Anyway, and has complex sentence structure with multiple clauses and em dashes.
00:08:58.360There is a factual error in the paragraph.
00:10:20.340Oh, it just called us influencers, which is – whenever I hear influencers, I just think of, like, a makeup influencer or a Mormon influencer.
00:10:28.040But it is kind of sad to see, like, how many publications have sort of descended into this realm of just, like, tabloidism, that Wired would get, like, factual errors in both their title and, like, basic language errors, like, of scientific facts in their text.
00:10:43.460Well, it's also just – there are just so many unhinged things.
00:10:48.840There are just, like, other little details that show how it's written from a progressive perspective that's so myopically stuck in its own ideological bubble that it only sees things through a political lens.
00:11:06.080And I mean that because the article also criticized the fact that the AT&T Hotel and Conference Center allowed NatalCon to be there in the first place.
00:11:16.160They included a quote from a representative of UT Austin who just said, we allow anyone to host here.
00:11:24.560We support the First Amendment right to free speech.
00:12:07.800Like, you go to a conference and the conference is – you know, there were, like, people of all ethnicities there, people of all sexualities there, people of all belief systems there.
00:12:17.440There were even – apparently there was one homophobic person, which I feel bad about because they were not mean, but they were unapproving of one of the gay couples that went.
00:12:27.120And I felt really bad about that because they had come based on, you know, liking us.
00:12:30.500And I'm like, you know, you're going to get that at something like this.
00:12:34.680But from what I heard, there was only one openly homophobic individual at the event, which is saying a lot for what is apparently a Nazi white supremacist event.
00:12:44.680But what it showed me is just, like, you can't say anything outside of their bubble.
00:12:51.360But I want to go to graphs now because I want to focus on the growth of the pronatalist movement, which I have found really telling of – like, which I have found has shocked me.
00:13:02.540So the first graph I'm going to put on the screen here is the search volume on Google Trends of pronatalism versus effective altruism in the United States, where blue is pronatalism.
00:13:15.580And as you can see, pronatalism in the past year has jumped to being maybe like five or six times, I guess, on average, the search volume of effective altruism or sort of interest online, which is wild to me.
00:13:30.000Because, you know, growing up in the Silicon Valley area, I thought of effective altruism as sort of everything, like one of the biggest movements in the world, like the big social scene for smart people.
00:13:40.740And if you look at a map of the United States, a heat map of where people are searching for these, I found this really funny.
00:13:47.480What you'll see if you're watching this on a podcast is pronatalism is blue, effective altruism is red.
00:13:54.240Effective altruism is just California, Idaho, and then a few New England states, including, like, in New York and Massachusetts.
00:14:02.200Yeah, and with the exception of Idaho, it seems to be really concentrated around densely urban areas.
00:14:08.300When you look at the district-level breakdown, it is clearly also the densely urban areas in California.
00:14:48.560But I found that, too, really interesting.
00:14:50.320And then if you look at it on the global level, again, you can see that we now have moved from being, like, well below effective altruism in terms of search volume to being maybe four times larger.
00:15:02.140And pretty reliably so for the past year or so.
00:15:06.020And I note here that effective altruism has hundreds of millions of dollars going to it.
00:15:12.020We got one $500,000 grant a few years ago and made all this happen.
00:15:29.420And so I know that he's not getting big donations from other people.
00:15:34.200And if you look at the global map here, this one I absolutely love.
00:15:39.440Actually, just to make a note, I think that the biggest donation made relative to demographic collapse was the one that the Elon Musk Foundation made to the Dean Spears-led demography research at the University of Texas at Austin.
00:15:58.240Dean Spears has said in recent stories, basically like, oh, Elon Musk's donation hasn't affected our work in any way.
00:16:05.100He's like trying to disassociate from Elon Musk, if anything.
00:16:08.740So basically, like they are one of the least vocal presences in the movement's public discourse that would be associated with search volume.
00:16:18.600And I think that really speaks volumes to what's going on here, that this is not a funded public discourse.
00:16:27.280It's driven by some very strong personalities and some very strong concerns.
00:16:32.880Yeah. And if you look at the global breakdown of where people are focused on each of these, sorry, actually, we should probably mention what effective altruism is for some of our viewers who may not know.
00:16:43.280Effective altruism is a fairly large, or I thought large, like this is why I'm benchmarking ourselves off of it, movement around the Silicon Valley community.
00:16:52.940This is what Sam Bankman-Fried was funding.
00:16:54.880It first emerged at Oxford University, though.
00:16:57.500Yeah, from like Sam Altman gives it funding type stuff.
00:17:02.340Like, the guy who founded Ethereum gives it a lot of funding.
00:17:07.740Pretty much if you're like an alt-productive intellectual and you are progressive coded or don't want to like offend people by being overtly conservative coded, this is where pretty much all of them have been for a long time.
00:17:22.280And it runs a number of very large organizations.
00:17:25.980We'll get to some of them in a second.
00:17:27.500That they do get, I think it's $400 million a year, from what I remember the last time I looked,
00:17:46.540And then redistributes it to various charities that they see as effective.
00:17:50.620Of course, most of them won't work with us, but we did get a grant from the Survival and Flourishing Fund, which is technically effective altruist,
00:17:59.800It was a game changer and it's allowed us to create this movement.
00:18:02.640And we'll talk a bit about why I think the effective altruist money gets piddled away so much.
00:18:07.320And they really haven't been able to make things like AI safety, which we're going to compare ourselves to in just a little bit,
00:18:12.740take off in the way that natalism has.
00:18:15.060Even though they spent millions of dollars on promoting it every year.
00:18:19.960So here next you see the worldwide map, which I found very interesting, of where the search volume is strong.
00:18:29.260So basically across the whole world, you know, the United States, Central America, South America, pronatalism is winning over effective altruism.
00:18:36.920Russia, it's winning, France and Spain and Portugal, it's winning, Eastern Europe, it's winning, India, it's winning, Africa, it's winning, Middle East, it's winning.
00:18:46.760So where is effective altruism bigger?
00:18:49.340Canada, one, the UK, Germany, Italy, Norway, Sweden.
00:18:54.440China, China, where they've just given up, basically.
00:18:56.920I found, well, you're going to see China as a trend across here, which I find really interesting.
00:19:01.540And Australia, I found that interesting.
00:19:05.080It also looks like Indonesia, which is weird.
00:19:07.940I don't know what's up with that, but I assume that they don't really have a population problem.
00:19:13.920Like AI safety, because people can say, okay, this stuff is just getting big because the trends are in your favor, right?
00:19:19.500It's not any effort you made. And I'd point out, like, we're a team.
00:19:23.900Like, whether it's us, or Kevin Dolan, or Dan Hess, or Catherine Pakaluk, who wrote Hannah's Children, you know, we all talk, we all work to coordinate.
00:19:36.480While we may advocate for very different styles of pronatalism or be excited about different policies, we are very much all fighting for the same thing.
00:19:46.720Yeah, and if they're doing something, they email us and let us know.
00:19:49.900And if we're doing something, we email them and let them know.
00:19:53.240Like, there is really, like, an inter-community communication web here, and it is a small team, and most of the team is self-funded.
00:20:01.800Like, Dan Hess, I don't think he's ever gotten any funding for his pronatalist work.
00:20:06.340No, in fact, he's donated so much money of, like, his own to fund things.
00:20:12.100Like, at NatalCon, he distributed an incredibly helpful bound fact book packed with graphs, plus a laminated one-pager just summarizing the basics.
00:20:21.300Oh, can we create a link to that in the, in the, in the-
00:20:23.760Oh, we can, yeah, we absolutely should, yeah.
00:20:25.140So do check out Dan Hess, aka More Births, and his fact sheet and report on pronatalism.
00:20:30.540It's just like, okay, if you need a primer, go here.
00:20:34.320Well, it's like literally every statistic on it I've ever seen.
00:31:57.780And after that, we stopped doing this.
00:31:59.520But before that, we were putting 45% of our yearly income into our nonprofit, which is a lot when you have, I think back then it was like three kids where we're just like, no, we absolutely have to make this work.
00:32:31.680With expenses including salaries for its small team of researchers and staff.
00:32:35.560So in 2019, MIRI reported total expenses of around $2.4 million, with roughly $1.5 million allocated to salaries for its 20 to 30 employees.
00:32:46.860So if I take $1.5 million divided by 30, that's $50,000 each.
00:32:56.060And I bet some were making more than others.
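A quick sanity check on that back-of-the-envelope math (the $1.5 million salary figure and the 20-to-30 headcount range are the MIRI numbers cited above; the script just divides across both ends of the range):

```python
# Average-salary estimate from MIRI's reported 2019 figures.
total_salaries = 1_500_000   # reported salary allocation (USD)
for headcount in (20, 30):   # reported employee-count range
    avg = total_salaries / headcount
    print(f"{headcount} employees -> ${avg:,.0f} average")
```

So the average lands somewhere between $50,000 and $75,000, depending on where in the reported range the true headcount fell.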
00:33:20.920I mean, but also like not that much because Lyman Stone has a day job as a demographer that like literally helps very large corporations determine how many diapers to produce.
00:34:05.320I mean, I think her book has been very successful, but a successful book these days, especially if you worked with a publisher, is not making you a lot of money.
00:34:12.740So I don't think she's getting any material wealth from it.
00:34:15.040Even if you're in the top 1% of authors, a book might earn you 20K. You should see our video on how nobody reads anymore.
00:34:41.600Their main source of income is donations.
00:34:44.560And you even saw people, I think it was Will MacAskill or some of the other people who were running major effective altruism organizations,
00:34:52.800write pieces a few years back, especially when Sam Bankman-Fried was big, about why people running these organizations should be able to indulge in luxuries.
00:35:02.960Remember they did things like bought a castle at one point.
00:35:05.940They – this is one of the major EA groups.
00:35:36.160Well, yeah, but I mean like rationalism and effective altruism as we –
00:35:40.220You should see our episode on the EA to sex worker pipeline if you want to learn more about the lifestyles that are common within this community.
00:35:47.240But I think through action, what we see between these two communities is one community that really believes what they're saying and working on and another community where it's a performative way to make an income.
00:36:09.800No, but look at who's putting their money where their mouth is of these two movements.
00:36:15.860Contrast Eliezer Yudkowsky begging, you know, with the update he made to the book, putting out chapters at whatever pace, to us, who do daily videos every weekday at 8 a.m. EST on this channel.
00:36:54.020But if you put Technopuritan in your will and it ends up becoming a big thing and you send us your genes, we will put those genes along with like the amount that you contributed in case it does end up becoming a big thing and ends up creating like artificial realities and ends up needing resources on that.
00:37:09.900Or instead of cloning people in the future or doing something, that you might be one of the resurrected people.
00:37:16.180I feel like that's a fun little thing you can do.
00:37:19.000But literally no one has told us they're doing that.
00:37:21.520So like it's not like we're getting anything from that.
00:37:24.060And in terms of like active donations, the main ones we get are hate donations for when people get pissed off by progressives and they live in a more progressive environment.
00:37:33.860Two people donate $10 a month minus PayPal fees.
00:37:57.900And the money does – like as I've pointed out, you are getting much more out of that than you're getting out of like effective altruism in what they're doing, right?
00:38:05.440Like, we have so little, and yet look at what we've been able to do. The other thing that really got me at the conference is, I've been to effective altruism conferences.
00:38:13.020This was one of the most competently organized conferences I've ever been to.
00:38:42.820And we had a lot of really amazing speakers, and last time people got in a fight; like, I didn't get in a fight with any Catholics this time, you know, which was really nice.
00:38:54.980This is over IVF versus, you know, non-IVF stuff.
00:38:58.560But yeah, I think that it was really like a victory lap for the pronatalist movement, which was really fun.
00:39:05.120Do you have any final thoughts on this or the Wired piece or the recent press we've been getting?
00:39:12.720I think that people who want to see this movement as evil are going to continue to do so, but it seems to be only to our benefit.
00:39:21.300And I am just really happy with how the conference went and how the movement is growing.
00:39:25.360And it gives me hope for the future because a lot of the people at the conference were really – like there was not a lot of cultural overlap and it made me fear less for a cultural mass extinction, which is something that we're really afraid about.
00:39:39.140Like we want a lot of different opinions in the future.
00:39:41.940And NatalCon felt like anything but a monoculture, which was really cool.
00:39:52.240This is us thinking they're murdering babies.
00:39:54.100They think we're murdering babies and yet we get along.
00:39:57.600Like, it is absolutely wild that we're able to get along, and progressives have, like, the smallest difference and they go knives-out at each other.
00:40:08.420One of the important things, like if you go to this conference and you're like, okay, I maybe met some people or whatever, but did the world really get anything out of this conference?
00:40:16.280And what I would note is I think the biggest impact of the conference isn't like us having people at it or whatever.
00:40:21.480However, it is because you guys go and because you make this happen, you make this like a really packed event – I mean packed.
00:40:29.740Like this conference was large, like much larger than I anticipated.
00:40:33.840And we then are able to have lots of reporters there.
00:40:36.880And we didn't have all the reporters there, as we mentioned, like Wired, but we did have a lot.
00:40:39.680We had New York Times, we had CNN, we had NPR, we had Mother Jones.
00:40:42.960Like, oh no, Mother Jones was kept out, but they did a great piece.
00:40:49.880They – the reporters only cover the movement when the conference happens.
00:40:54.100Like they need a forcing function to cover the movement.
00:40:56.720And we can go out and they can do the umpteenth profile on us, but, you know, that's going to die down after a while.
00:41:03.680They need something new to be scared about.
00:41:06.220And you guys making this conference happen, going there, talking with reporters, you all came together and gave them something to be scared about.
00:41:15.740And because of that, every one of you who contributed to this, you contributed to this press cycle that's happening right now.
00:41:21.760And this press cycle that's happening right now is going to give us a second boost, is going to help us move things forward, and is going to help keep this conversation centered. Compare that to when people are afraid of controversy or afraid of doing new things, like the, you know, big effective altruism gala that happens every year, which I'm sure they spend tons of money on; that doesn't end up getting them press.
00:41:48.240It doesn't end up moving any idea forward.
00:41:50.600It doesn't end up doing anything meaningful for the world.
00:41:54.460But just by being at an event like this, you guys did a lot.
00:41:57.840And I really, really deeply appreciate that.
00:42:01.200In fact, I'd say, like, if you're like, oh, you know, the money that I paid for this, like $900 or whatever, right?
00:42:08.300Would you have rather they just give the two of us the $900 or $1,000?
00:42:13.840Categorically, I would rather it have gone to the conference, given the amount of press that the conference has generated.
00:42:36.800And that is huge, because that's what we need to keep this centered in the public mind.
00:42:44.380But it's also why, you know, if you look at our video on hard EA versus soft EA, we recently started hardEA.com, which is meant to bring funding to some of the other underfunded cause areas where people are founding companies in those spaces.
00:42:57.920Right now, we've drained the amount of money that we had put in for it.
00:43:01.040So if anybody wants to, like, put a big grant in for that so we can do something big, let us know.
00:43:05.560But right now, we've invested in, like, 10 companies, like 10 to 40K each, which I'm really excited about because I'm excited about a lot of the technology that they're bringing.
00:43:15.040But also, it's an investment and not a donation, which means if any of these companies does well, then we'll have even more money to put into this type of thing.
00:43:54.900One funny anecdote I'll give because this is actually something that I was talking to, like, just last week I was talking to one of the, like, leading people of the Effective Altruist Movement.
00:44:03.940And I was asking them, like, how can we work together?
00:44:07.260And they just didn't seem to understand. Like, I was like, look, I've got Hard EA here.
00:44:15.860Like, we have a bigger movement in terms of, like, members and focus now.
00:44:20.520You know, we've got people in the White House.
00:44:23.660You guys have been begging for that for years.
00:44:26.140You guys have been spending hundreds of millions of dollars.
00:44:28.380Like, if you guys can throw a little money our way, we're going to be able to throw a little attention your way and help some of your projects or projects that at least we can both agree on come to fruition.
00:44:41.700And they basically didn't understand why they were like, I don't understand.
00:44:47.040And, like, how or why we would work together.
00:44:49.900And it helped me understand why the movement's gotten as bad as it has.
00:44:53.560Specifically, what they pointed out was that the very, very large EA orgs that get all this money, like OpenPhil, they're getting money, huge amounts of money from lots of donors.
00:45:06.220And they have to be afraid of doing anything that would anger any of the donors.
00:45:11.540And so they end up taking the most conservative, boring approach.
00:45:16.960But a conservative approach like that means they're donating to things that would be raising money anyway.
00:45:21.420And so the money is just completely pissed away.
00:45:24.260Which runs against the fundamental principles of effective altruism.
00:45:28.160Because in addition to looking at the severity of a problem and how tractable it is, how feasible it is to actually address it, you're looking at how much attention and help it already has.
00:45:38.240And you don't want to focus on things that already have help.
00:45:41.900You want to focus on the things that are overlooked, which is one of the reasons why Malcolm chose to address demographic collapse.