Evolutionary Psychology & Pronatalism with Dr. Geoffrey Miller
Summary
In this episode, Dr. Geoffrey Miller joins us to talk about the pros and cons of monogamy and polyamory. Dr. Miller and his partner, Diana Fleischman, have long been at the forefront of the pro-polyamory movement, and he joins Simone and Malcolm to discuss the evolution of human sexuality.
Transcript
00:00:00.000
polyamory can be a legitimate way to run your relationships. So the antinatalist version of
00:00:05.900
polyamory would be, you should simply be maximizing your sexual network and your sexual pleasure and
00:00:13.160
your little highly open-minded adventures and, you know, organizing your gangbangs and threesomes
00:00:20.460
and going to Burning Man and having kids as sort of secondary. And then there's a pronatalist
00:00:26.680
version of polyamory that says, hey, why don't you consider maybe a group living situation,
00:00:32.520
which might make it easier to raise kids collectively with your little trusted polycule,
00:00:39.200
right? Would you like to know more? Hello, this is Malcolm and Simone. We are here with a special
00:00:45.260
guest today, Geoffrey Miller. We actually had his partner on in one of the early episodes.
00:00:51.400
Yeah. Name again? Sorry. Diana Fleischman. Diana Fleischman. Yes. And I've respected him since
00:00:59.520
before we wrote our book on sexuality and human evolution, because this is a topic that
00:01:04.800
he is both an expert on and the type of outspoken expert that gets him in trouble with university
00:01:09.820
departments and stuff at times, which is of course the type of expert we like talking to most. I thought
00:01:15.660
a fun topic for this episode, because, you know, Diana has also been, you know, a core leader in the
00:01:20.740
pronatalist movement in terms of pronatalist thought and stuff like that. And so, I don't know
00:01:25.300
if this is an extension of that, but in addition to that, I'd like you to apply your deep understanding of
00:01:32.120
the evolutionary conditions that sort of led to modern human sexuality and how those are interacting
00:01:39.100
with this new environment that we're in, with ways that they can be shorted out and how that might
00:01:45.360
lead to changes in humanity going forwards. So go.
00:01:50.860
Yeah. I work in this field called evolutionary psychology and we try to understand human nature
00:01:55.700
and we do it through mostly applying evolutionary biology theory to human prehistory and trying to
00:02:03.560
analyze the challenges that our ancestors faced in terms of surviving and reproducing and raising kids
00:02:09.160
and living in groups. Right. So that's my kind of framework. A key insight, I think, from evolutionary
00:02:16.500
psychology is we did not really evolve directly to try to maximize like baby count, right? To try to
00:02:25.340
maximize fitness in a direct way. Instead, what we did through thousands of generations of ancestral history
00:02:34.080
was do the things that tended statistically to lead to babies, right? Even though we might not
00:02:43.200
be consciously maximizing number of offspring or building a dynasty or whatever, right? So what tends
00:02:52.340
to lead to babies? Being sexually attracted to good, high quality partners, falling in love with them,
00:02:59.540
developing relationships with them, right? All of that stuff tends to lead to babies.
00:03:04.080
What's another thing that tends to lead to babies ancestrally? Achieving social status,
00:03:09.540
right? And prestige and influence and being valued in your group, right? Now the problem is you can
00:03:16.880
short circuit both of those, right? You can fall in love, have a great sex life, use contraception,
00:03:24.920
no babies, right? Chase social status and prestige and influence and never really cash that out
00:03:34.220
in terms of maybe either having relationships or having babies. So the vulnerability that we face
00:03:42.840
as evolved brains running around the world is that you can short circuit all these ways that tended to lead
00:03:50.200
to having kids in prehistory and that don't necessarily do that anymore. So that's, I think,
00:03:57.020
the central issue that pronatalism needs to address, but that civilization itself needs to address.
00:04:03.000
So I want to pull on one of the things you said here, because one, I think, is intuitive to people:
00:04:07.300
the ways that people can sort of masturbate, literally masturbate the desire to sleep with attractive
00:04:15.000
people. That is one that, you know, obviously there's like a clear pathway there. But one of
00:04:20.720
the things that's changed really recently is the ability to masturbate feelings of community and
00:04:27.760
social hierarchy, especially in a Skinner box-like fashion that is likely to lead to addiction of these
00:04:36.600
dopaminergic pathways. And what we are seeing with this generation that's growing up now is
00:04:43.100
like really quickly dropping rates of sex. And I suspect, from what you're saying here, that it's
00:04:49.980
actually these communities like TikTok and stuff like that that's causing this. Because you can play
00:04:57.460
in these social environments in a way that feels like true social connection, where engaging through Facebook
00:05:04.420
didn't. Can you talk to why things like Facebook did this worse, or were better, or didn't lead to
00:05:12.220
as many people dropping out of in-person social interaction as this new wave of online apps?
00:05:20.500
I mean, social media companies have just gotten better and better and better at
00:05:25.120
hacking the evolved social instincts of people, right? To keep them engaged in this
00:05:33.960
sort of pseudo status, right? So if you're on Twitter or X, right, people pay a lot of attention to a follower
00:05:40.700
count, likes, engagements, impressions, et cetera. If you're on TikTok, which I'm not, I don't know what
00:05:46.120
the metrics are, but it's also highly engaging and addictive. And what tends to happen is you're
00:05:51.740
getting a lot of cues, right? That I'm popular, influential, respected, but it's not actually cashing
00:06:00.700
out into real-life relationships, whether it's friendships or collaborations at work or sexual,
00:06:10.680
intimate, romantic relationships. And of course, it's going to get even worse once you get augmented
00:06:16.320
reality and virtual reality and people will be able to go fully online and have their avatars and
00:06:21.380
interact with other people and have this sort of whole sex life or, or whatever you want to call it
00:06:27.420
that is purely simulated social status and interaction, right? Now, of course, there are real
00:06:35.960
people possibly on the other end of it, or maybe it's just AI bots that you're interacting with,
00:06:43.540
but apparently Gen Z has kind of forgotten how to connect the dots between that kind of online status
00:06:53.120
and any kind of real life relationships and sex and reproduction.
00:06:58.080
So this brings me to something really interesting. If you're looking at the current
00:07:01.340
evolutionary pressures that we're going through, because the old evolutionary instincts aren't
00:07:05.340
getting us through this, it's like, there's going to be two dominant strategies. One strategy is to
00:07:12.380
have new instincts selected for in individuals that somehow get around the way that these masturbatory
00:07:20.760
pathways are being exploited, i.e. they just don't get as much social status from online environments
00:07:24.840
or whatever. The other pathway is the sort of dominance of our psychological mindset,
00:07:32.560
I guess what I'd say is our cognition over our pre-evolved intuition, combined with a culture
00:07:40.040
that values its continued survival. Do you see it differently than this? Like, do you think
00:07:46.900
that both of these will be stable groups that will make it through the end of this? And if so, what do
00:07:51.860
you think these groups will be like, characteristically, to get through this crucible of, like, AI girlfriends?
00:08:01.140
Yeah. So I take those two paths to be kind of like pronatalist Luddite Amish fundamentalist,
00:08:08.420
right? Could be Christian, could be Muslim, whatever. Anybody who's, like, trad-life maxing and
00:08:14.980
baby maxing and dynasty building, based perhaps largely on a religious
00:08:20.980
worldview, right? The second strategy would be kind of what Diana Fleischman, my wife and I are doing,
00:08:26.500
which is like get enough insight into evolutionary psychology and the science and enough sort of
00:08:33.220
metacognitive awareness of what you're doing and what's driving you that you can kind of intelligently
00:08:39.380
rediscover the joys of pronatalism without necessarily having a religious framework for doing
00:08:45.140
that. Now I suspect that that second strategy, you know, some people will do it, but it won't be super
00:08:54.260
popular. I think the first strategy, the kind of trad-life religious pronatalism, is actually likely to be
00:09:02.980
more popular, common, effective, and probably will take over the gene pool in the long run.
00:09:11.540
And I actually wrote a thing about this back in 2007, about how you avoid this trap,
00:09:18.980
right? Of technology that short-circuits reproductive success. And I pointed out there's already
00:09:26.900
selection for religiosity, for conservatism, for pronatalism, and for resistance to the kinds of
00:09:37.540
social media status games that a lot of people get caught up in.
00:09:43.060
Yeah. Well, so, I mean, I agree with everything you're saying, like, obviously we see ourselves as
00:09:47.380
largely in this one group. I'm wondering how you would characterize, because this one group,
00:09:52.580
the heady group, is like, okay, I will cognitively decide to do this stuff, versus the
00:09:59.540
group that is just reacting to the new evolutionary pressures. What specific sociological traits do you
00:10:06.820
think are going to be selected for in this other group that allows them to make it through this
00:10:12.980
crucible? Do you think it's primarily, you mentioned a few, conservatism, which I think we definitely see
00:10:17.540
in the data is being very strongly selected for, and a level of technophobia. What instinct drives
00:10:23.380
the technophobia? Is it fear of change? What's specifically being pulled on and reinforced there,
00:10:28.500
you think? I think it's almost like a kind of deep instinctive moral disgust at any kind of fitness
00:10:38.740
fakery, right? What you could call faking fitness cues, where as long as something isn't very directly
00:10:47.540
grounded or, dare we call it, based in the real physical world and real sexual relationships
00:10:55.220
and real kids, right? I think there's a kind of technophobic conservatism that says, I just don't
00:11:03.620
trust the smartphones. I just don't trust the AR. It doesn't seem real to me. It's disgusting.
00:11:08.900
I love this because I think you picked up on something I hadn't noticed before.
00:11:12.820
These communities heavily overlap with communities that denigrate cosmetic augmentation, like
00:11:21.940
cosmetic surgery, and even things like makeup are seen as lower status in these communities,
00:11:28.820
which would align with the point that that's the instinct that's being selected for.
00:11:34.020
Yeah, I think that's right. There's an overlap with moral disgust towards porn,
00:11:41.860
moral disgust towards even jobs that don't really involve physical effort and sweat,
00:11:48.020
like keyboard oriented jobs, right? There's an overlap with disgust towards anything that seems
00:11:55.140
behaviorally addictive, which could include porn, gambling, substance use, even TV movies, et cetera.
00:12:04.660
There's a widespread suspicion of just, man, anything that's a distraction from actually having your
00:12:13.300
relationship and your family and your kids and your real work that probably involves some Ford pickup
00:12:19.860
truck and going to Home Depot or whatever, right? I think that kind of package is,
00:12:27.460
let's call it technoskeptical rather than technophobic, right? Yeah. Because these folks are like
00:12:34.980
using power tools, right? They're not just hacking away at things with flint hand axes, but they're
00:12:41.780
broadly suspicious of technology and media. And partly that's because media is a conduit for
00:12:51.460
leftist propaganda and antinatalist propaganda, right? So partly it's not just the technology
00:12:56.660
itself. It's sort of the shadowy forces behind the technology who are trying to convince you to do
00:13:02.100
stuff that's contrary to your actual fitness interests. So I wanted to jump in and ask, I mean,
00:13:07.300
you have two small daughters at home and yet you're not going to sort of raise them in this technophobic,
00:13:12.420
I imagine, maybe I'm wrong, kind of conservative culture. You're going to raise them to be, I
00:13:17.140
imagine, like pretty technophilic, probably more on the progressive end, you know, like more,
00:13:22.740
you know, pluralistic. So, one: is it important to you to pass on a pronatalist culture to
00:13:30.660
your daughters, who are now very, very little? So I guess you've got time to plan. Two: is it important
00:13:37.380
that you pass on a culture, period, that represents your values? Like how much do you
00:13:42.980
care about that? And then three, what are you going to do to try to ensure that that culture
00:13:49.300
is passed on despite not being technophobic and being like, no, we're going to shelter you. We're
00:13:55.380
going to, you know, remove you from this part of society, et cetera.
00:14:02.020
That's an interesting and tricky question. So on the one hand, I'm a big believer in behavior
00:14:07.140
genetics and that the traits I have and that my wife Diana has will tend to get passed on. And if
00:14:14.740
we value kids and family, probably our kids will as well, regardless of what we teach them explicitly,
00:14:21.540
regardless of what the family culture says. On the other hand, I do recognize that when your
00:14:28.020
surrounding culture is antinatalist and is trying to get you to get caught up in credentialism and
00:14:36.820
careerism and consumerism and faking your virtual status, you do need a kind of countervailing force
00:14:43.940
within a family, right? That tries to instill in your kids a kind of skepticism about that surrounding
00:14:50.900
culture, right? Just so that your genes for pronatalism don't get swamped by a surrounding
00:14:58.500
culture that's antinatalist. I do have an older daughter in her twenties who is very much kind of
00:15:05.300
trying to figure all this stuff out and looking for a long-term mate and trying to like balance her
00:15:11.460
career as a professional artist versus her dating life and trying to figure out, you know,
00:15:16.980
is it viable to have kids and to have a career, blah, blah, blah. So that's very much a live
00:15:21.940
issue to me. And I take a keen interest in how millennials and Gen Z are trying
00:15:27.380
to navigate those issues. I was wondering how you think about all this because you've advocated in
00:15:32.340
the past for polyamory. How do you think about all of this in the context of polyamory as a dating
00:15:36.900
strategy that is newly elevated and may have existed in a historic context, but didn't for a long
00:15:42.980
time. Yeah. So it's not necessarily that I'm advocating for polyamory. A lot of what Diana and I
00:15:49.620
say is polyamory can be a legitimate way to run your relationships if you have the skills and the
00:15:56.900
intelligence and the self-control and the conscientiousness and emotional intelligence and
00:16:02.180
a lot of traits that would be required. So it's like playing on expert level in terms of
00:16:10.980
relationships and monogamy is probably easier for most people to manage. And I think there's
00:16:18.580
antinatalist versions of polyamory, right? And there's pronatalist versions. So the antinatalist
00:16:23.940
version of polyamory would be, you should simply be maximizing your sexual network and your sexual
00:16:30.740
pleasure and your little highly open-minded adventures and, you know, organizing your
00:16:37.460
gangbangs and threesomes and going to Burning Man, and having kids as sort of secondary.
00:16:43.780
And then there's a pronatalist version of polyamory that says, Hey, why don't you consider
00:16:49.060
maybe a group living situation, which might make it easier to raise kids collectively with your little
00:16:55.620
trusted polycule, right? Your little group or, you know, seek benefits from your relationships that
00:17:03.940
actually feed into like the viability of your family. Right. And that could be financial benefits,
00:17:10.980
career benefits, social benefits, parenting benefits, whatever. So
00:17:19.460
Have you seen this work? Yeah. There's a lot of aspects of the polyamory culture that I really,
00:17:24.180
really don't like because it seems very self-indulgent, woke, leftist, non-binary, hates babies,
00:17:31.220
blah, blah, blah. I don't like any of that stuff. Right. But I think there are pockets of wisdom
00:17:36.420
from trying to combine your like sexual network with your social and parenting network that can make
00:17:42.340
sense. Yeah. So what I wanted to ask you here is, I've heard this as a thesis, like,
00:17:47.860
it sounds like it would work. I don't know if I've ever seen anyone actually practically implement it
00:17:54.020
in a high fertility family context or in a high fertility social circle outside of like Mormons or Muslims
00:18:00.420
where it's not really polyamory. It's more polygyny. Have you, or am I just not exposed to the culture
00:18:07.460
enough? What you typically would see is people who are kind of like a little bit Asperger-y and
00:18:19.380
open-minded and polyamorous in their twenties. Right. And then they settle down with a primary partner
00:18:24.980
and they mostly have kids with that one partner, but they still have like secondary and tertiary
00:18:31.220
partners on the outside who may or may not be contributing to child care or sharing rent or
00:18:39.140
mortgages or, you know, living together. The difficulty of course, is you get this huge selection bias in media
00:18:47.060
where the people who are actually busy dynasty building with open relationships are just too
00:18:54.420
damn busy to be on Twitter or TikTok. Fair. Right. And they're just doing their thing and maybe they're
00:19:01.620
showing up at Burning Man once in a while and, you know, but they're not
00:19:07.140
spending a lot of their time broadcasting what they're doing. So we don't know how many people
00:19:14.580
are actually doing that. This is interesting as a way to enforce interpersonal, like real human,
00:19:20.820
interpersonal connections in a world that is providing too few dopaminergic rewards that can
00:19:27.060
only be achieved through human interpersonal interaction. But it seems like an alternate
00:19:31.780
cultural subset for doing that, which is really interesting. One thing I wanted to pull on
00:19:36.740
is when you were describing this conservative mindset of like working the land
00:19:41.060
with your hands and everything like that, I think people can hear this and it sounds really appealing
00:19:46.020
in a moderated format. The problem is, and we've gone into the data on this,
00:19:51.300
that it is a strategy that works better the more extremely it is implemented. So if you look,
00:19:57.860
there was a great study done on Pennsylvania Dutch speakers, i.e. Amish or
00:20:01.940
Anabaptist communities, and the ones who didn't have cell phones at all, like that was the highest
00:20:07.540
correlated thing with how high-fertility the individual was. So if you have a strategy that
00:20:14.580
works kind of good when implemented slightly, but really good when implemented in the extreme,
00:20:20.420
it leads to an outcome where the extreme iteration ends up dominating that cultural group within a few
00:20:26.420
generations. Where that becomes relevant is it means that even if you have a smaller group that's
00:20:31.940
trying this intentional strategy, not a polyamorous strategy per se, this intentional high
00:20:38.100
fertility strategy that is culturally experimenting with various things that might be addictive,
00:20:42.980
whether it is polyamory or traditional media or anime, you know, so that it can stay technophilic and
00:20:50.420
technologically productive in a world where AI gives them an enormous leg up, they ultimately win,
00:20:57.540
even if they have much smaller population numbers.
00:21:07.860
so I feel this constant tension, right, where on the one hand, I'm extremely aware of very, very rapid
00:21:15.220
technological progress in domains like AI and crypto and virtual reality and so forth. And on the other
00:21:21.860
hand, I have this sweeping deep time, multi-generational perspective from evolutionary biology. And it's very,
00:21:28.820
very hard to weave those together, right? If humanity survives, then in 10, 20, 50 generations,
00:21:38.580
the kinds of people who are going to be around are quite likely to be, you know, more similar to the Amish and
00:21:45.780
the Anabaptists and fundamentalist Muslims than to your typical Bay Area, dual-income, no-kids AI developers.
00:21:55.860
Yeah. On the other hand, if let's say you get a kind of technophilic subculture that can kind of hack
00:22:09.460
human biology enough, right? You could imagine the Bay Area couples going, you know, we actually want to
00:22:19.060
do dynasty building. We don't want to adopt the values of those Anabaptists. So we're going to
00:22:24.100
do the pre-implantation embryo selection for pronatalist traits, right? And we're going to do the gene
00:22:30.740
editing to make sure our kids really, really actually want to have kids, whatever traits are
00:22:36.900
entailed in that. And that will help them kind of take over the human gene pool eventually.
00:22:47.700
So it's really, really hard to predict how all of this is going to play out. But I think if there is
00:22:52.500
still a human gene pool, it's going to inevitably be dominated by people who actually do succeed in
00:22:58.580
surviving and reproducing, duh. That makes sense. But I also am curious if we
00:23:07.780
somehow managed to like make you emperor of, we'll say the United States, long enough to implement some
00:23:13.220
pronatalist policy changes that would enable, we'll say, those more technophilic,
00:23:21.380
pluralistic cultures to maybe not extinct themselves as quickly or at all. Are there any
00:23:28.180
things that you would do? Like assuming that you couldn't enable a policy or make a blanket rule
00:23:33.300
or a couple of laws that would not be reversed after your short stint was over, what would you do?
00:23:40.180
Like what would you change about modern society or rules or access to certain things or social media
00:23:45.380
in a way that you think could make humans more resilient in the face of all this technological
00:23:50.580
change, which can make encouraging parenting pretty difficult?
00:23:57.140
Honestly, the number one thing is pause AI development. You know, Frank Herbert,
00:24:02.500
author of Dune and some of the later Dune books, talked about a Butlerian Jihad
00:24:10.900
in the far future. I actually can't remember where he talks about it, but somewhere.
00:24:15.220
He wants to establish why they're not using AI. So I think he does it to, but continue.
00:24:19.060
Right. Yeah. So the idea is thou shalt not make a machine in the image of the human mind.
00:24:25.380
More generally, I think a prohibition on developing, basically, any new species of intelligent entities
00:24:33.780
that could plausibly endanger or replace humans. The reason for that is not just avoiding the extinction
00:24:42.180
risk. Right. The reason partly is I think there's an optimal rate of technological change in terms of
00:24:50.180
helping people feel pronatalist. Right. Helping people feel like I have wisdom and knowledge that
00:24:56.580
is worth passing on to my kids. Once the rate of technological change gets too fast,
00:25:02.100
right, then people feel like, oh, I'm Gen X. I have nothing to say to Gen Z.
00:25:08.180
They're living in a completely different culture and world and technosphere. And so I think if
00:25:14.740
you really want to encourage people to take parenting seriously, you have to make them feel like
00:25:21.380
their wisdom and knowledge is going to be relevant to their kids and grandkids.
00:25:26.580
And I think the current rate of technological change, not just with regard to AI, but with regard to many,
00:25:34.020
many things is too fast to make people feel comfortable with parenting and grandparenting.
00:25:42.180
It makes them feel too obsolete too quickly. So I think we've been, you know, progress maxing for the
00:25:50.420
last 200 or 300 years. And that in itself, I think, has antinatalist, depressing effects on a lot of people.
00:26:00.660
So I want to hear it, because our viewers will know that the views that you're
00:26:06.100
espousing here are very, uh, antithetical to our views on this topic, which is what we love. Oh my
00:26:12.340
gosh. Which we love. So I want to get your thoughts on sort of what we teach on this subject, which
00:26:17.140
is that to declare preemptive war on that which is different to you is to eventually declare war on that
00:26:24.180
which is better than you. If we make humanity hostile to, whether it's gene-edited humans
00:26:31.300
or AI or anything like that, we create a mandate. If somebody does accidentally create one of these
00:26:38.020
things for this thing to come after us, because we've gone out and said, we cannot allow you to
00:26:42.980
exist because we see you as a threat. Do you feel that we potentially increase danger from these sources
00:26:50.900
by banning them instead of sort of entering into what we call the Covenant of
00:26:56.020
Man: anything created by man is allowed to exist by all other things created by man, so long as it
00:27:01.700
doesn't try to subjugate any other of the sons of man.
00:27:04.100
I think my view on this is hopefully if we do actually invent artificial super intelligences that
00:27:16.820
are smarter than us, that they will have really good insight into why we did a Butlerian Jihad,
00:27:24.580
into why we tried to ban them at least temporarily, into why we were extremely wary of embracing that,
00:27:31.300
rate of technological progress, right? Hopefully they're at least as smart as, you know,
00:27:36.900
me or you or the other people worried about AI safety and they won't hold it against us,
00:27:42.900
right? That we were wary. That they'll go, okay, fair enough. Like if we were in your
00:27:49.540
position, we would have had exactly the same risk aversion and caution and concern about extinction risk.
00:27:56.740
And we understand, as AIs, hopefully, that the burden is on us to show that we're safe to you guys.
00:28:08.900
Interesting take. Yeah. Very different. So for me, when I, when I look at this from my perspective,
00:28:14.020
you know, as somebody who's engaged in like genetic selection with their kids and stuff like that,
00:28:17.700
I look at older media, which I see as being very bigoted against these things like Star Trek,
00:28:22.340
right? Like it is so bizarre that in Star Trek, the gene-selected humans, you know, Khan and
00:28:28.980
his group, are just intrinsically bad people for whatever reason. When you augment humans,
00:28:34.740
they're just bad people. Same with if humans, you know, are combined with AI, right? Like the Borg,
00:28:40.580
they just cannot understand human individuality, get rid of them all. And I engage with
00:28:46.420
these works and I'm like, wow, Gene Roddenberry had an enormous amount of prejudice in writing
00:28:51.860
this, but you're hoping that the AI is more enlightened than someone like me and able to say,
00:28:57.380
no, no, no. I can understand why he would have this blinding prejudice every time he encounters
00:29:02.420
something different and potentially better, which might be the case. Yeah. I hope.
00:29:06.900
Yeah. So just to be clear, I'm much more anti-AI and wary of AI than of most biotechnology and
00:29:18.740
reproductive technology, right? So I have no problem with embryo selection and genomic engineering and
00:29:26.580
genetic testing and reproductive technology and IVF and surrogates and all that. That's all fine. You know,
00:29:32.500
whatever helps people get better babies and more babies, no problem. And the reason is the rate of
00:29:38.660
change that you can achieve with any foreseeable technology like that is actually still really
00:29:44.580
quite slow, right? We have no plausible way to genetically engineer people with brains like 10
00:29:52.100
times bigger than ours. Whereas we could easily build machines that run 10 times, a thousand times
00:29:59.220
faster than us. So there's kind of an intrinsic slowness to technology as applied to humans that makes me
00:30:11.140
So actually, I want to ask you a question about this to see what your thoughts are. So one of the
00:30:14.660
companies Simone and I, actually the nonprofit foundation, are looking at investing in right now
00:30:19.060
is a company that will allow gene changes in living adult humans. I didn't believe in the
00:30:25.540
technology when they first told me about it. Then we went through the mechanism of action. I've got a
00:30:28.820
background in science. It actually looks very plausible what they're doing. I'm wondering what you think
00:30:32.580
of this kind of technology, not intergenerational genetic alteration, but intragenerational genetic alteration.
00:30:43.380
I think it's certainly feasible that you could have methods for doing that. I guess as somebody trained
00:30:52.100
pretty deeply in evolutionary biology and genetics, I tend to be a little bit wary of
00:30:58.820
overhyped interventions where it's like we can maximize this trait and there are no side effects
00:31:08.580
on any other trait, right? Typically what you see with complex biological systems and genomic
00:31:13.860
regulatory networks is it's really, really hard to improve any given trait without having some
00:31:20.420
unanticipated side effects on lots and lots of other traits. So you have to be really, really careful
00:31:25.780
about how you test like the whole spectrum of stuff that could go wrong.
00:31:30.420
I really agree. This is why prisons are so useful. Well, I mean, if we're doing intergenerational
00:31:36.500
genetic selection, being able to instill specific genetic traits in a living individual to see how it
00:31:42.500
changes their behavior patterns using this sort of technology would be very interesting within some
00:31:47.540
of these more authoritarian government systems. Oh God.
00:31:52.340
Yeah. So from that point of view, like maybe the big progress will come out of China or
00:31:59.300
whatever. I mean, I think consent is important and I think, you know,
00:32:04.660
with these kinds of technologies, there's like, there's the marketing issues and the ethical issues and
00:32:10.340
then there's, does the tech actually work? And I think what you don't want is for the tech to be like
00:32:15.220
stigmatized by testing for side effects in like non-consenting populations. Okay.
00:32:22.500
Yeah. I don't know. I agree. You would need a massive cultural change for that to be an accepted
00:32:27.860
practice. But actually it's very interesting when you think about it from an ethical perspective,
00:32:31.940
because people might be looking at this and being like, what a horrifying thing to do. And it's like,
00:32:35.220
well, actually, if you look at things like IQ, which would be one of the things we'd be most interested
00:32:38.820
in, it is one of the things that is most correlated with rates of murder, like high IQ, very low rates
00:32:43.940
of murder, high IQ, very low rates of stealing, high IQ, very low rates of graping someone. And so
00:32:50.340
there would be a reason to test the technology within these populations, because in a way them
00:32:55.300
doing this stuff isn't their fault. It's in part their genes, which we now potentially have the
00:33:02.020
technology to relieve them of. Yeah. So, you know, you could maybe envision some kind of
00:33:10.020
prison intervention that makes people less likely to be committing crimes again, just by making them
00:33:18.660
smarter, right? Yeah. Or making them more conscientious or whatever. And I think that's...
00:33:26.420
Yeah, you could have an opt-in and, you know, more generally anybody who's sort of at risk
00:33:32.660
of doing bad stuff, you know, you could offer this kind of intervention. And really, ethically,
00:33:39.620
I don't think it's that different from campaigns to, like, reduce lead exposure, you know, to kids
00:33:46.500
to try to boost their IQ. Or, you know, I'm quite involved in effective altruism and they have
00:33:51.380
a lot of interventions about trying to reduce like intestinal parasites or the effects of malaria or
00:33:57.060
other stuff that is known to reduce IQ, as well as hurting your health in lots of other ways.
00:34:03.300
And of course it's ironic that a lot of the people who are most skeptical about IQ,
00:34:11.140
right? If you say things like, yeah, but here's an intervention that can help reduce
00:34:15.940
lead exposure and boost IQ, they're like, oh, that's great, actually. Right. They're happy
00:34:20.340
to believe in IQ, as long as it's associated with some intervention that protects kids from
00:34:27.700
IQ-damaging environmental factors. Well, this has been a fantastic conversation. We loved having
00:34:34.260
you on and we would suggest that people check out your books and, uh, have a fantastic day.
00:34:40.100
Yes. Thank you so much. And also, everyone, you can basically see a jumping-off point for a lot
00:34:46.340
of Geoffrey's work at primalpoly.com. So please do check it out. He's also pretty active on Twitter.
00:34:52.580
Also under @primalpoly, right? Yeah, that's right. Yeah. So please do. Yes. And thanks again, Geoffrey.