The Tucker Carlson Show - April 13, 2026


Tucker Debates Biotech CEO on Baby Customization, Eugenics, and God’s Existence


Episode Stats


Length: 1 hour and 39 minutes

Words per minute: 200.93

Word count: 20,014

Sentence count: 1,360
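As a quick sanity check on these numbers (a minimal sketch; the duration in minutes is back-derived from the reported rate, not stated above):

```python
# Words per minute is just word count divided by duration in minutes.
word_count = 20_014
words_per_minute = 200.93  # as reported above

duration_min = word_count / words_per_minute  # back-derived estimate
print(f"{duration_min:.1f} minutes")  # ~99.6, i.e. 1 hour 39 minutes, rounded down
```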

Harmful content

Misogyny: 11 sentences flagged

Toxicity: 14 sentences flagged

Hate speech: 61 sentences flagged


Summary

Summaries generated with gmurro/bart-large-finetuned-filtered-spotify-podcast-summ.

Transcript

Transcript generated with Whisper (turbo).
Misogyny classifications generated with MilaNLProc/bert-base-uncased-ear-misogyny.
Toxicity classifications generated with s-nlp/roberta_toxicity_classifier.
Hate speech classifications generated with facebook/roberta-hate-speech-dynabench-r4-target.
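For readers curious how per-sentence flags like these could be produced, here is a minimal sketch using the Hugging Face transformers pipeline API with the model IDs credited above. This is not the site's actual code; the positive-class label names and the 0.5 threshold are assumptions, so check each model card before relying on them.

```python
# Sketch of per-sentence harmful-content flagging with the models credited above.
from transformers import pipeline

# Assumed positive-class label names; verify against each model card.
POSITIVE_LABEL = {"misogyny": "misogynist", "toxicity": "toxic", "hate speech": "hate"}

classifiers = {
    "misogyny": pipeline("text-classification", model="MilaNLProc/bert-base-uncased-ear-misogyny"),
    "toxicity": pipeline("text-classification", model="s-nlp/roberta_toxicity_classifier"),
    "hate speech": pipeline("text-classification", model="facebook/roberta-hate-speech-dynabench-r4-target"),
}

def flag(sentence: str, threshold: float = 0.5) -> dict:
    """Return {category: score} for each classifier whose positive label clears the threshold."""
    out = {}
    for name, clf in classifiers.items():
        pred = clf(sentence)[0]  # e.g. {"label": "toxic", "score": 0.97}
        if pred["label"].lower() == POSITIVE_LABEL[name] and pred["score"] >= threshold:
            out[name] = round(pred["score"], 2)
    return out

# Example: flag("a sentence from the transcript below")
```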
00:00:00.000 Square knows that in hospitality, efficiency is everything. That's why their system lets you take
00:00:07.980 payments, track sales, handle inventory, manage staff, send invoices, and keep up with finances,
00:00:13.320 all in one place. Fly through orders with zero mistakes, get the data you need, and keep everything
00:00:19.200 working together, so you're ready for whatever's next. Learn more about their customizable plans
00:00:24.240 at squareup.com.
00:00:30.000 Thanks for doing this. I appreciate it. I'll just say at the outset, which I told you off camera,
00:00:36.480 I disagree with this conceptually, I think, but I'm also completely ignorant of the details.
00:00:43.520 Yeah.
00:00:43.900 So I kind of want to know what this is before even asking you questions about whether it's
00:00:49.060 a good idea. Can you just, I'll just stand back and let you explain what you're doing.
00:00:52.900 Yeah. So first, thanks for having me on.
00:00:54.500 Of course.
00:00:54.760 So, patients,
00:00:57.420 there's one way of reproducing:
00:00:59.300 via IVF,
00:01:00.000 right?
00:01:00.360 So you can conceive naturally,
00:01:01.220 via sex,
00:01:01.740 or maybe if you're infertile,
00:01:02.940 or if you have some sort of
00:01:04.320 hereditary disease,
00:01:04.980 or for some other reason,
00:01:05.860 you do IVF.
00:01:06.680 When you do...
00:01:08.140 Yeah.
00:01:08.380 I'm sorry,
00:01:09.000 I specialize in dumb questions.
00:01:10.320 Can you just explain,
00:01:11.020 for people who don't know,
00:01:12.280 what is IVF?
00:01:12.740 Yeah, what is IVF?
00:01:13.960 IVF stands for
00:01:14.700 in vitro fertilization
00:01:15.660 so basically
00:01:16.480 imagine the egg
00:01:18.040 and the sperm
00:01:19.000 right
00:01:19.380 the foundation of life
00:01:20.260 to make an embryo
00:01:20.920 it's basically putting those things together
00:01:22.360 in a clinic
00:01:23.460 right
00:01:24.700 And then basically you take that embryo
00:01:26.660 and you transfer it into a woman
00:01:27.840 and then it would implant
00:01:28.880 and the woman's pregnant.
00:01:29.940 So conception takes place outside the womb.
00:01:31.940 Correct.
00:01:32.420 Okay.
00:01:32.860 Yeah.
00:01:33.660 And so during this process of IVF,
00:01:35.820 what you do is today,
00:01:37.080 even if nucleus didn't exist,
00:01:39.100 even if genetic optimization didn't exist,
00:01:40.720 you make several embryos.
00:01:42.160 Okay.
00:01:42.500 So in an IVF clinic,
00:01:43.640 you make several embryos.
00:01:45.520 The amount of embryos you end up making,
00:01:47.240 it varies,
00:01:47.800 but you might have four or five.
00:01:49.280 You actually do genetic testing on these embryos
00:01:51.180 to identify things like chromosome abnormalities,
00:01:53.300 like Down syndrome, for example, right? So that's very commonplace. So that's done in basically
00:01:56.920 every IVF clinic in the United States. They will actually screen embryos, the genetics of the
00:02:01.040 embryos to see if they have some sort of severe chromosomal abnormality. What we do is we basically
00:02:06.440 provide more information on embryos. So we also read the DNA, but now we give information on
00:02:11.160 things like other hereditary disease risk, also chronic diseases, things like cancers, Alzheimer's,
00:02:16.820 diabetes, also traits like IQ or height or et cetera. So to be clear, we're not changing any
00:02:24.420 DNA. There's this process in IVF where you make embryos. Already genetic testing is done in
00:02:29.120 embryos. What we do now is we provide you a little bit more information on your embryos.
00:02:33.400 So basically that information can be used to then implant whichever embryo the couple deems to be best.
00:02:39.820 So basically give more information to couples to then choose which embryo they want to implant.
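Mechanically, "more information on embryos" of this kind comes down to scoring each embryo's genotype against a table of variant effect sizes (a polygenic score). The toy sketch below illustrates only the arithmetic; the variant IDs and weights are made up, and Nucleus's actual models are not described in this episode.

```python
# Toy polygenic-score sketch: sum per-variant effect weights times risk-allele counts.
EFFECT_WEIGHTS = {"rs0001": 0.8, "rs0002": -0.3, "rs0003": 0.5}  # hypothetical

def polygenic_score(genotype: dict) -> float:
    """genotype maps variant ID -> risk-allele count (0, 1, or 2)."""
    return sum(w * genotype.get(v, 0) for v, w in EFFECT_WEIGHTS.items())

embryos = {
    "embryo_A": {"rs0001": 2, "rs0002": 0, "rs0003": 1},
    "embryo_B": {"rs0001": 0, "rs0002": 2, "rs0003": 0},
}
for name, geno in embryos.items():
    print(name, round(polygenic_score(geno), 2))  # embryo_A 2.1, embryo_B -0.6
```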
00:02:43.440 I don't want to derail this conversation two minutes in.
00:02:46.160 Okay.
00:02:46.640 But you just said we can tell the IQ of a person by the genetics?
00:02:52.700 So IQ-
00:02:53.640 I was reliably informed IQ is not real, okay, and it's not determined by genetics.
00:02:57.920 So I think it's helpful to think about all these different characteristics from diseases to traits, right?
00:03:04.380 People know intuitively something like height, for example, right?
00:03:06.940 Height, they say, oh, that's genetic or something like breast cancer, eye color, right?
00:03:12.080 These things people intuitively know are genetic.
00:03:14.360 And so you can actually basically take these different phenotypes and measure how genetic any phenotype is.
00:03:19.760 So what does it actually mean?
00:03:21.760 The most simple way of explaining it is imagine you took two identical twins.
00:03:25.600 So they have the same DNA, right?
00:03:27.260 And then basically you separated the twins.
00:03:28.800 They grew up in different environments.
00:03:30.060 Sometimes in pop culture, people hear about these different things where you actually take twins and they have, again, the same DNA.
00:03:34.180 They're identical DNA.
00:03:35.580 And then they grow up in different places for whatever reason.
00:03:37.580 So they're subject to different environments.
00:03:38.760 And you can actually measure basically how much more similar they are across all these different phenotypes to see basically how genetic something is.
00:03:46.100 Twin studies.
00:03:46.580 Yes, twin studies, yes.
00:03:48.460 And so using twin studies, you can actually get measurements of things from diseases, right, like cancers and diabetes and Alzheimer's, as mentioned, to things like height or IQ or BMI, etc.
00:03:59.400 So twin studies show that IQ specifically is about 50% genetic.
00:04:02.520 But to be clear, IQ is just one of over 2,000 factors that we actually look at, right?
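The twin-study logic he describes has a standard quantitative form: Falconer's formula estimates heritability from how much more correlated identical (MZ) twins are than fraternal (DZ) twins. A minimal sketch follows; the correlation values are illustrative placeholders, not figures from the episode.

```python
# Falconer's formula: h^2 = 2 * (r_MZ - r_DZ).
def falconer_heritability(r_mz: float, r_dz: float) -> float:
    return 2.0 * (r_mz - r_dz)

# If MZ twins correlate 0.75 on a trait and DZ twins 0.50,
# the estimated heritability is 0.50, i.e. "about 50% genetic".
print(falconer_heritability(0.75, 0.50))  # 0.5
```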
00:04:07.260 Principally, parents and patients, they come for disease. They always come for disease. And remember
00:04:11.540 that when the embryos you're picking from, the most important determinant of the genetics of
00:04:16.640 your embryos is, well, your partner, right? So you're actually not changing DNA. This is not
00:04:21.280 gene editing. You're not changing DNA. You're not making like an embryo's DNA better. You're
00:04:25.420 basically reading the embryo's DNA that you have. So when you pick your partner, you're basically
00:04:29.440 picking the kind of genetic pool, and then you can basically pick which embryo you deem to be
00:04:33.160 best based off of your preferences and values. I mean, this like, again, I just want to say
00:04:40.760 thank you for doing this. I'm not here to attack you at all. I think this is one of the most
00:04:45.620 important conversations we can have. And I agree. You're much younger than I am. So you weren't
00:04:51.380 here for the debates that took place in the early 1990s about what traits are the product of
00:04:57.720 genetics and which are the product of environment. But up until pretty recently,
00:05:02.120 the public conversation has settled on a consensus
00:05:05.600 that everything is environment
00:05:07.560 and that genetics aren't real.
00:05:09.840 And this was at the very center
00:05:11.200 of our national debate about race
00:05:12.760 and crime and educational achievement, income.
00:05:19.620 And it all grew out of or was crystallized
00:05:22.680 by a book called The Bell Curve.
00:05:23.840 Have you heard of this?
00:05:24.680 Yeah, I have, yeah.
00:05:25.180 Yeah.
00:05:26.040 So it seems like that debate is over
00:05:30.220 And I'm not, this is not an attack at all.
00:05:32.900 It's just like crazy to me
00:05:34.020 that people are just saying this out loud.
00:05:35.040 Yeah, genetics plays a big role.
00:05:36.880 Yeah, genetics plays a role.
00:05:38.320 So I think in society today,
00:05:40.420 when people think about like height or cancers,
00:05:44.300 and to be clear, I'm not talking about,
00:05:45.740 there's hereditary disease risks like PKU,
00:05:48.020 Tay-Sachs, cystic fibrosis, beta thalassemia.
00:05:50.640 These are conditions we also screen for, right?
00:05:52.560 To make sure that parents can reduce suffering
00:05:54.320 each generation.
00:05:55.180 So that's also part of what we do.
00:05:56.920 And those conditions are basically
00:05:58.500 deterministic in nature, right?
00:06:00.040 So if you have two bad copies of like cystic fibrosis, you're going to get cystic fibrosis and it's debilitating.
00:06:04.980 And so there's like policies, you know, that basically encourage, you know, Americans and people around the world to do screening, to not pass down basically an invisible genetic burden to their child.
00:06:14.960 Right.
00:06:15.580 Right.
00:06:15.820 That's like classical kind of genetics.
00:06:17.820 So I think it's interesting, because you make...
00:06:19.060 Well, it's eugenics, right?
00:06:20.760 No, no, no, not eugenics.
00:06:21.860 Eugenics.
00:06:22.040 How is it not?
00:06:22.620 It's improving the human species through breeding.
00:06:25.520 Eugenics refers to basically coercive use, coercively controlling human reproduction.
00:06:30.040 Right. Forced sterilizations, even euthanasia, controlling who can get married to who.
00:06:35.800 So no, no, no, no, no. Those are methods by which you implement eugenics, but they're
00:06:41.060 not the only ones. Eugenics simply means... there's nothing inherently... you can disagree
00:06:45.080 with the concept, but the concept is, coercive or not, the improvement of a species, in this
00:06:51.080 case, the human species, through selective breeding.
00:06:53.760 Well, but there's no selective breeding. Remember, patients choose who they marry.
00:06:57.340 And then in the embryos they have, right, you're not changing the embryos.
00:07:00.780 In the embryos they have, patients can make their own choice in which embryo they want to implant.
00:07:05.440 So juxtapose that with eugenics.
00:07:06.740 How is that non-selective breeding?
00:07:08.140 This is literally... Breeding is, by definition, the process of bringing new life into the world.
00:07:16.340 And you're deciding which of these embryos becomes a person.
00:07:19.880 And so that is breeding.
00:07:21.660 It's not choosing people's marriages.
00:07:23.320 It's not giving them forced vasectomies, but it is breeding.
00:07:28.260 That's what breeding is.
00:07:29.480 Well, I would say that in IVF clinics for the last couple of decades, there's been this
00:07:34.660 process of basically taking these embryos, getting more information on the embryos, and
00:07:37.980 then picking which embryo you want to implant, right?
00:07:41.000 Again, you're not changing DNA.
00:07:42.760 You're not controlling who can get married to who.
00:07:44.660 Like, just to be clear, if you go back, eugenics is a term that was coined in the late 19th
00:07:50.060 century by a scientist named Francis Galton, okay?
00:07:52.560 He was a British scientist. Yeah, a bunch of... Havelock Ellis, yeah. But yeah, he came up with
00:07:56.120 the term eugenics. Interestingly, the term eugenics actually came about 20 years before the term
00:08:00.660 genetics. This is really interesting. A lot of people don't know that. Yeah, this is very important.
00:08:04.900 Eugenics, um, naturally did not require genetics. Genetics, when the term was coined, it
00:08:10.840 was the science of heredity, right? Of passing down, um, information. Remember, the unit of heredity
00:08:16.160 being identified as DNA, that wasn't until the 1940s. Right, right. And then the structure
00:08:21.620 of DNA wasn't actually found until after World War II, in the 1950s. So we didn't even know. Basically,
00:08:26.180 in 1927, in, I think it was, Buck versus Bell, the US Supreme Court deemed forced sterilizations
00:08:32.320 constitutional. Okay. At that point, we had no idea that DNA was actually the genetic basis.
00:08:37.960 This is really, really important. People always get this wrong because they don't follow the
00:08:41.100 timeline. Eugenics as a coercive ideology to control populations arose
00:08:48.020 in the lack of genetics, period.
00:08:49.500 It had nothing to do with the genetics.
00:08:50.460 Why was it coercive?
00:08:52.320 Well, I think if you basically
00:08:53.820 force sterilize somebody against their will,
00:08:56.380 I mean, I think that's against
00:08:57.320 the fundamental liberty of a person.
00:08:59.500 Of course, there's no question.
00:09:00.880 I couldn't agree more.
00:09:02.560 But again, that was just one manifestation of it.
00:09:05.320 So force played no role in a lot of it.
00:09:08.380 It was steering people, giving them options,
00:09:10.500 telling them that, you know,
00:09:12.020 if you married this kind of person,
00:09:13.380 here's the outcome you're likely to get
00:09:14.900 when you have children.
00:09:17.120 Well, force did play a role. I mean, again, in 1927, the United States Supreme Court deemed forced sterilizations constitutional.
00:09:23.980 I'm just saying that, and I couldn't be more opposed to that, in fact, to the whole program.
00:09:28.620 But I just want to note as a factual matter that forced sterilizations were an incredibly ugly, evil manifestation of an idea that was not limited to forced sterilizations.
00:09:40.840 The idea is the same idea you're articulating, which is people should try to improve the human species by selective creation of children.
00:09:50.080 So, yeah, I disagree with that.
00:09:51.640 How is it different?
00:09:52.860 So, Nucleus ultimately, and what we give patients, ultimately what patients actually want, right?
00:09:56.860 Again, patients are choosing their partner.
00:09:58.620 They're choosing to do IVF.
00:10:00.440 They have basically options.
00:10:02.740 They have similar embryos.
00:10:03.940 They get information.
00:10:05.260 There's actually no best embryo, right?
00:10:08.160 So Nucleus is a company and no patient can ever say, oh, this is the best embryo because
00:10:13.100 there's no fundamental virtue rooted in biological characteristics.
00:10:18.220 So the idea that you could even have a best, for example, is misguided principally, in
00:10:23.320 my view, because something like virtue, and I think of two kinds of virtues, there's natural
00:10:27.340 virtue and then divine virtue, it's fundamentally not biological, it's not physical.
00:10:32.000 Genetics can only program for physical things, and then people can basically make their choices
00:10:35.260 within the partners that they choose
00:10:36.520 and then doing IVF
00:10:37.240 to then pick the embryo
00:10:38.280 that sets the best set
00:10:39.440 of biological characteristics to them.
00:10:40.820 But there is no virtue.
00:10:41.780 There's no morality in that decision.
00:10:43.580 Well, I've noticed.
00:10:44.280 Yeah.
00:10:44.580 But so do you think
00:10:46.360 that it's equally virtuous
00:10:47.900 to have a child,
00:10:49.640 intentionally have a child,
00:10:50.680 which we can now do
00:10:51.520 with the genetic testing
00:10:52.600 you're describing,
00:10:53.860 who has Down syndrome,
00:10:56.740 Tay-Sachs, and CF?
00:10:59.660 Is that as virtuous
00:11:00.920 as having a child
00:11:01.800 who has none of those things?
00:11:02.840 Because I thought you just said
00:11:03.820 that it's good to get rid
00:11:04.820 of those things? To be clear, virtue is independent of biological
00:11:08.760 characteristics. Parents can choose based off their preference, what they want, what is best.
00:11:13.620 So let me give an example. Let me give an example. So there was a case in reproductive medicine where
00:11:17.440 a deaf couple, they want to have a deaf child. Yep. That, to them, was what was best,
00:11:22.380 basically. Right. That term best is relative, context specific to the parent. We have patients,
00:11:29.000 for example, that might have, you know, Huntington's, which is a severe neurodegenerative
00:11:33.160 disease. Yeah, very severe. It's autosomal dominant means it's passed down, right? And by
00:11:37.580 the way, this is actually interesting. Something like Huntington's or schizophrenia, these are
00:11:41.440 exactly the kind of conditions that in the 20th century, they would say, hey, these people are
00:11:44.820 unfit, right? They should not reproduce, right? Because they have some sort of neuropsychiatric
00:11:49.160 or some sort of debilitating condition that runs in the family. Like in my case, you know,
00:11:55.080 one of the reasons why I started the business is because one of my family members, she unfortunately
00:11:58.660 She went to sleep and she passed away in her sleep.
00:12:02.000 So these things are deeply personal to people.
00:12:04.360 Is that the result of a genetic anomaly?
00:12:07.100 Yeah, a condition that can cause irregular heart beating,
00:12:09.380 can cause sudden death.
00:12:10.440 Everyone loves relaxing at home.
00:12:12.080 Cozy Earth can maximize that experience.
00:12:14.140 If you haven't tried their robes or their slippers,
00:12:16.100 you may be missing out.
00:12:17.740 Soft, breathable, lightweight, the epitome of comfort,
00:12:21.100 perfect for slow mornings, put one on after the shower,
00:12:24.100 hang out in front of the fire.
00:12:25.420 You put on the robe, you don't want to take it off.
00:12:28.220 We haven't even mentioned the slippers, which are warm and comfortable and easy to wear around the house.
00:12:32.080 By the way, at this point, you can wear them to Walmart.
00:12:34.060 No one will say anything.
00:12:35.500 With Mother's Day coming up, Cozy Earth can provide the perfect gift, something she will use and appreciate every day.
00:12:42.140 If you're nervous about making a purchase, don't worry.
00:12:45.120 Cozy Earth backs everything with a 100-night sleep trial and a 10-year warranty, all risk-free.
00:12:51.360 Visit CozyEarth.com.
00:12:52.960 Use the code Tucker for 20% off.
00:12:54.440 That's CozyEarth.com, promo code Tucker for 20% off.
00:12:57.800 We've got a post-purchase survey.
00:12:59.820 Mention you heard about CozyEarth from us on this show.
00:13:02.380 I don't want to sidetrack you, but you threw in schizophrenia, right?
00:13:06.540 Yeah.
00:13:07.260 Is there, I don't know the answer,
00:13:08.740 is there evidence that that is genetically predisposed to?
00:13:12.160 Schizophrenia is very strongly,
00:13:13.720 there's a very strong genetic basis to schizophrenia, right?
00:13:16.020 Really?
00:13:16.540 Correct, yeah, yeah.
00:13:17.240 And we know that.
00:13:18.180 Yes, that is a very well-established science.
00:13:21.100 Yeah, sorry, I'm learning.
00:13:23.700 Yeah, no, it's interesting.
00:13:25.100 So, okay, but you said a minute ago that there is a nationwide, indeed, a global effort to get rid of conditions like...
00:13:36.400 But again, deafness is a great example.
00:13:39.300 It's not for me to tell a deaf couple whether they should or shouldn't have a deaf child.
00:13:42.480 No, no, I understand.
00:13:43.220 But that can apply across everything now, right?
00:13:45.140 If somebody wants to have a child based off their extent of what they deem to be best, based off their lived experience, that's their right and that's their choice.
00:13:51.480 So I'm not saying that it's better to have a child that is not deaf, for example.
00:13:56.500 I can't do that.
00:13:57.140 I can't possibly say that; it depends.
00:13:58.960 I think that's entirely the choice of the family.
00:14:03.380 Okay, that's a consistent position.
00:14:05.160 I wonder though, but you described something that's absolutely real, which is a system globally that is designed to minimize, to reduce the incidence of certain conditions, right?
00:14:17.580 So you said that that's the policy, like you genetic test all the embryos at every IVF
00:14:23.540 clinic because you want to make sure we have less Down syndrome, for example.
00:14:27.440 But no, but again, what's important here is there's not some sort of broad centralized
00:14:31.800 body being like, oh, we need to all do this sort of testing embryos.
00:14:35.320 That decision rests in the parent's choice.
00:14:37.980 A parent can choose not to screen embryos for Down syndrome, okay?
00:14:41.380 They could make that decision.
00:14:42.900 And if they make that decision, they can then transfer that embryo and have that baby.
00:14:46.540 That's entirely their choice.
00:14:47.400 So you think there's no, and I don't, I mean, let's not be disingenuous.
00:14:53.640 There is a global effort to reduce the incidence of certain conditions.
00:14:58.040 Of course, everyone just assumes like you can't, I mean, that's why the incidence of
00:15:01.620 Down syndrome has fallen off a cliff.
00:15:02.920 There's been an elimination of Down syndrome, not entirely, but pretty much.
00:15:06.400 Those are parents making choices though.
00:15:07.880 Those are parents and couples making choices.
00:15:09.300 And so you don't think that healthcare systems steer people in certain directions or have
00:15:12.660 a preference?
00:15:13.520 I think the healthcare system, unfortunately, right now is a sick care system.
00:15:16.500 I mean, the healthcare system actually is very much not in the business of prevention.
00:15:20.580 I mean, it's interesting.
00:15:21.880 I was looking at these stats, which is the US healthcare system spends about $5 trillion,
00:15:26.780 which is a lot.
00:15:28.500 About, I think, $4 trillion goes to chronic disease treatment.
00:15:31.600 So things like cancers and diabetes and Alzheimer's.
00:15:34.240 In 2021, four times as many people died of a chronic disease than COVID.
00:15:39.500 Four times as many people died of a chronic disease than COVID at the peak of the pandemic.
00:15:42.920 So you have to ask, what is the real pandemic here?
00:15:47.360 And on that point, if you think about it, and also, by the way, of the $5 trillion, so $4 trillion, about 80% is chronic disease.
00:15:54.580 About $500 billion is about rare diseases.
00:15:57.040 So these rare genetic conditions that I outlined.
00:15:59.300 So genetics has a strong impact on both hereditary disease, like cancer, as I outlined, like chronic diseases, as well as rare disease.
00:16:06.080 So genetics can help impact $4, $4.5 trillion of healthcare expenditure.
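The spending figures he cites fit together as simple arithmetic (a quick check of the numbers as stated in the conversation):

```python
# US healthcare spend as cited: ~$5T total, ~$4T chronic disease, ~$0.5T rare disease.
total_t, chronic_t, rare_t = 5.0, 4.0, 0.5

print(chronic_t / total_t)   # 0.8 -> "about 80% is chronic disease"
print(chronic_t + rare_t)    # 4.5 -> the "$4 to $4.5 trillion" genetics could touch
```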
00:16:09.620 But, and there is a but, remember, those $4.5 trillion
00:16:11.820 dollars: somebody's making money from someone being sick. Well, yeah, and that's horrible. That's horrible.
00:16:17.120 But it's... Of course. You say of course, but I think that we can't just take that as a given, right? Like,
00:16:20.900 genetics as a science, if deployed, can be used for parents to make their own decisions to dramatically
00:16:25.240 reduce breast cancer risk, diabetes risk, if there's something in their family, schizophrenia,
00:16:28.980 Alzheimer's, help reduce that next generation. So these things can be used to basically help build
00:16:33.440 what we call generational health, effectively. Um... So: save a lot of money through improving
00:16:39.140 the species through eugenics.
00:16:40.760 People made this argument for over 100 years.
00:16:42.960 I get it.
00:16:44.200 I'm just wondering, well, I'm wondering a lot of things.
00:16:47.980 Well, one thing to say, remember, too,
00:16:49.900 that IVF is about 2% of the way babies are born in the United States.
00:16:53.480 Most babies are still born naturally conceived.
00:16:55.460 So we actually have a service for those couples as well,
00:16:59.520 where you can just basically take a cheek swab.
00:17:01.500 You can do something called procreation simulation
00:17:02.980 and simulate basically the risk for your child.
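One way to picture "procreation simulation" is Monte Carlo Mendelian sampling: at each variant, draw one allele from each parent, score the simulated child, and repeat many times to get a risk distribution. The sketch below is a toy illustration under that assumption; the variants, weights, and genotypes are hypothetical, and real tools would model linkage, phasing, and clinically validated scores.

```python
# Toy "procreation simulation": sample hypothetical children by Mendelian
# inheritance and summarize a made-up polygenic risk score.
import random

WEIGHTS = {"rs0001": 0.8, "rs0002": -0.3, "rs0003": 0.5}  # hypothetical effects

# Parent genotypes: variant -> risk-allele indicator on each of two chromosomes.
mom = {"rs0001": (1, 0), "rs0002": (1, 1), "rs0003": (0, 0)}
dad = {"rs0001": (1, 1), "rs0002": (0, 0), "rs0003": (1, 0)}

def simulate_child() -> float:
    # Child inherits one randomly chosen allele from each parent per variant.
    return sum(w * (random.choice(mom[v]) + random.choice(dad[v]))
               for v, w in WEIGHTS.items())

scores = [simulate_child() for _ in range(10_000)]
print(min(scores), sum(scores) / len(scores), max(scores))
```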
00:17:05.340 Okay.
00:17:05.460 Okay. And that is a service that can basically help any couple too. So I just want to be clear
00:17:10.340 that it's not just IVF patients as well. These are couples that then can employ the screening
00:17:14.420 and then to have a healthy baby. What about sex? What about sex?
00:17:21.540 Well, I mean, the number one thing that people have used prenatal testing for is choosing the
00:17:28.040 sex of their child. So that's what explains the demographic imbalance in China, as you know. So
00:17:33.520 So that's like the number one thing globally.
00:17:37.220 India, same.
00:17:38.020 And India actually outlawed it, to be clear, too.
00:17:39.800 So in an IVF clinic, you can't even pick sex in India,
00:17:42.040 because there's an imbalance.
00:17:43.840 Well, legally, but of course it happens all the time
00:17:45.920 because there's a global preference for sons.
00:17:48.320 And that's why you see so many more boys than girls
00:17:51.120 when in fact it's the opposite.
00:17:53.080 In the United States, actually, if you look at the IVF,
00:17:55.360 it's about 50-50.
00:17:56.860 I'm not talking about the US,
00:17:58.380 but how do you feel about that?
00:18:00.280 Would it be okay with you if someone came in and said,
00:18:02.900 get rid of the girl embryos?
00:18:05.180 So to be clear,
00:18:05.780 in the United States,
00:18:07.860 this has played out
00:18:08.560 over the last 20 years.
00:18:09.600 People have been able
00:18:10.200 to pick the sex of their child
00:18:12.220 in IVF clinics,
00:18:13.060 both in the United States
00:18:13.640 and then, again,
00:18:14.820 at some point internationally too,
00:18:16.000 but eventually became outlawed
00:18:16.940 for the reason you outlined,
00:18:17.800 which is people generally
00:18:18.820 pick slightly more boys.
00:18:21.100 I mean, it's illegal
00:18:21.920 and it's much harder
00:18:22.900 in these countries.
00:18:23.840 Okay.
00:18:24.700 In the United States, though,
00:18:25.700 if you actually played out
00:18:26.480 people making their own choices,
00:18:27.680 it ends up being about,
00:18:28.940 again, 50-50.
00:18:30.620 So this is actually interesting
00:18:31.440 because people-
00:18:31.900 But what do you think of it?
00:18:32.900 Is it valid for someone to come in and say, I mean, you said this is an ethically neutral question about whether or not to have a child with this or that genetic condition, but what about sex? Is that ethically neutral? Is it okay, in your view, for a couple to say, I don't want any girls?
00:18:50.880 In my view, that is the prerogative of the parents to pick which sex they want.
00:18:54.880 And if you play that out across many, many, many couples making their own independent choices, right, which is an embodiment of this kind of liberty and choice, you see it ends up being about 50-50, which I think actually undercuts this idea that everyone's going to pick, you know, a boy, for example, right?
00:19:08.780 There's this notion.
00:19:09.620 Well, it's culturally specific in its time, you know.
00:19:11.280 Exactly.
00:19:11.760 Of course.
00:19:12.340 But that applies across any traits then, Tucker, which is people, there's not a universal best.
00:19:17.060 It's very much case-specific to the specific family history, specific values, and culture.
00:19:22.660 Of course, of course.
00:19:23.460 But I think we're talking about two different things.
00:19:25.640 You're talking about outcomes, and I'm talking about the process and whether the process itself is valid.
00:19:31.640 And right, and I totally, I've actually seen the numbers, so I know that you are absolutely right on the question of sex selection.
00:19:38.520 But you think it's okay, there's no moral problem at all, because these are questions of life and death.
00:19:44.260 So I do think moral questions are relevant questions.
00:19:48.480 You don't think there's any moral question around choosing by sex.
00:19:52.440 To be clear, I think that there is no universal biological best period across any phenotype
00:20:00.520 because biology is inherently neutral.
00:20:02.700 Now there is universal morality, okay?
00:20:05.860 Specifically, again, two kinds.
00:20:07.580 There's natural virtue, right?
00:20:09.200 And also divine virtue.
00:20:10.520 Natural virtue can come from the cultivation of the soul,
00:20:13.700 which is independent of biology.
00:20:15.160 It's not in the physical plane.
00:20:16.640 And so I think-
00:20:17.900 How is that different from divine virtue?
00:20:19.800 Divine virtue to me is more about union with God.
00:20:23.680 So natural virtues-
00:20:25.040 If there's no God, where does the soul come from?
00:20:26.420 There's God.
00:20:27.140 There's God.
00:20:27.560 What do you mean?
00:20:28.020 Why is there no God?
00:20:28.700 Of course I agree.
00:20:29.320 But I don't know why there's a distinction
00:20:31.520 between the virtues again.
00:20:33.840 We're in the weeds.
00:20:34.040 Natural virtue, I'll tell you why.
00:20:35.540 Natural virtue can be intellectually derived.
00:20:37.580 Wisdom, courage, justice.
00:20:40.180 temperance. It's kind of classic, uh, Aristotle. Uh, and then there's things like grace and
00:20:45.420 revelation, which come from God. You can't necessarily... A human being's mind is limited,
00:20:48.980 it's finite, right? You can't necessarily grasp that. So, one kind you
00:20:53.640 can derive from, like, thinking about what leads to basic eudaimonia, human flourishing,
00:20:57.240 right? That kind of virtue, natural virtue, right, coming from Aristotle. And the other kind
00:21:01.460 is, um, thinking about, uh, divine virtue, which is what goes beyond the intellect, right? Which Thomas
00:21:05.940 Aquinas basically brought together and thought about: okay, there's this idea of natural
00:21:09.140 virtue that the Greeks came up with. And then, of course, there's this idea of divine virtue
00:21:14.100 coming from the Old and New Testament about union with God. And all religions actually talk
00:21:19.520 ultimately about surrendering. Personally, I do believe in God, just so you know if that's not
00:21:22.600 clear. Well, here's something that thieves count on. Security cameras usually stop where Wi-Fi
00:21:27.180 stops, right? Makes sense. So if you've got a barn, a job site, equipment parked outside,
00:21:32.500 long driveway, criminals know there's a good chance that nobody is watching this because
00:21:37.460 there's no Wi-Fi. And that's why we like Defend by Tactacam. It's a new sponsor of this show.
00:21:44.540 Defend's cameras don't run on Wi-Fi. They run on cellular, just like your phone. So they work
00:21:49.260 everywhere. If you've got cell signal, you've got security. Middle of nowhere, edge of your
00:21:54.360 property, construction site, wherever you need it. You don't need Wi-Fi. Big difference. And you can
00:22:01.320 see why it matters. So we use these cameras in places where Wi-Fi doesn't reach. The setup is
00:22:05.380 super simple. You mount the camera, open the Defend app, and you are live. You get clear footage,
00:22:11.040 night vision alert sent right to your phone. It's great for construction sites, ranches,
00:22:15.740 farms, or anyone with a property that stretches beyond a router. And here's something we really
00:22:21.360 appreciate. Defend does not sell your data. Not to tech companies, not to advertisers, not to China.
00:22:26.600 No one. Your footage belongs to you. And that's big. Plans start at about five bucks a month. No
00:22:32.180 contract, cancel anytime. Visit defendcellcam.com. That's Defend Cell Cam.
00:22:39.300 Visit BetMGM Casino and check out the newest exclusive, the Price is Right Fortune Pick.
00:22:45.280 BetMGM and GameSense remind you to play responsibly. 19 plus to wager. Ontario only.
00:22:50.460 Please play responsibly. If you have questions or concerns about your gambling or someone close
00:22:54.320 to you, please contact Connects Ontario at 1-866-531-2600 to speak to an advisor. Free
00:23:00.980 of charge. BetMGM operates
00:23:02.840 pursuant to an operating agreement with
00:23:04.720 iGaming Ontario.
00:23:07.720 Want to go electric
00:23:09.120 without sacrificing fun?
00:23:11.720 That's the Volkswagen
00:23:13.040 ID4. All electric
00:23:14.900 and thoughtfully designed to elevate your
00:23:16.980 modern lifestyle. The Volkswagen
00:23:18.720 ID4 is fun to drive with instant
00:23:21.020 acceleration that makes city streets
00:23:22.860 feel like open roads. Plus a
00:23:24.880 refined interior with innovative
00:23:26.700 technology always at your fingertips.
00:23:29.080 The all electric ID4.
00:23:30.920 You deserve more fun.
00:23:32.220 Visit vw.ca to learn more.
00:23:34.880 SUVW, German Engineered for All.
00:23:37.740 Dot com.
00:23:38.840 What kind of God do you believe in?
00:23:40.640 So I've meditated for about seven years.
00:23:45.140 And what I keep coming across is the best way to articulate,
00:23:48.940 I see God as an experience versus an ideology,
00:23:52.220 which is that there's a quote, it's actually from Rumi.
00:23:56.600 I think he articulates, well, Rumi's a Persian poet.
00:23:59.240 But he says, imagine you go to the ocean and you come back with a pitcher of water.
00:24:05.540 So the pitcher in my mind is the ego, is the logical mind.
00:24:08.680 And then the ocean is God, the source, the one, the divine, whatever you want to call it.
00:24:13.780 That's how I think about God.
00:24:14.720 So I think from my experience meditating and from what I've seen, the, again, human mind, the intellectual mind is limited and finite.
00:24:22.660 And there's basically this vastness.
00:24:24.760 It's hard to describe, which is why often the Sufis would use poetry to actually describe God.
00:24:29.240 Um, because it's... it's hard to. You can't describe it directly
00:24:33.220 because it's too big. Precisely. It's infinite. It's vast. That's why I like the ocean as an
00:24:36.960 example. Another way I like to think about it is like, if you're a raindrop and it's easy for us,
00:24:42.300 especially in modern society to think the raindrop is the world, but eventually you return to the
00:24:46.300 ocean and you realize it's much bigger. And so, um, so that's your conception of God.
00:24:52.020 Yes. That's my... Again, I think God is more an experience. God cannot be
00:24:56.340 conceptualized, it cannot be articulated. It's not a logical thing. You cannot use logic to articulate
00:25:00.100 God. I mean, to me, it's incompatible. Um, but so I think you can try to use metaphors, you
00:25:05.720 try to explain it. Um, I always like the Sufi poets because I feel like they do a really, really nice,
00:25:10.040 beautiful job of that. Um... Certainly of describing the vastness and fundamental incomprehensibility
00:25:15.740 of God, for sure. Oh, I couldn't agree with you more, and only poetry can capture that. But it
00:25:20.960 leaves unanswered the core question for the three Abrahamic religions, which is: what does God want
00:25:26.000 for us to do and believe?
00:25:30.180 And what's your view on that?
00:25:31.520 Well, Islam specifically,
00:25:33.500 Islam literally means
00:25:34.400 surrendering to one.
00:25:35.820 Yes.
00:25:37.440 I think that's the answer.
00:25:39.080 In other words,
00:25:40.360 Islam,
00:25:41.040 and you can,
00:25:41.500 I'm not Christian,
00:25:42.200 you're a Christian,
00:25:42.680 so you can tell me more
00:25:43.300 about the Christian's view,
00:25:44.200 but there's a concept
00:25:44.800 of surrender in Christianity.
00:25:46.260 So in Islam,
00:25:46.840 it means literally Islam
00:25:47.920 means to surrender.
00:25:49.080 Yeah, it's an experience.
00:25:50.220 It's the whole thing.
00:25:50.740 The whole thing.
00:25:51.100 Jesus literally surrendered
00:25:52.440 to being tortured to death.
00:25:53.320 Exactly.
00:25:53.700 Yeah, of course.
00:25:54.180 And then in Buddhism as well,
00:25:55.480 They call it different things in Buddhism. It's a little bit more like surrendering to the illusion of the ego, for example. But the concept of surrendering, I think, is basically universal.
00:26:07.040 There's no question.
00:26:08.120 And so, yeah.
00:26:09.460 But, right?
00:26:10.960 That's my answer.
00:26:12.160 That's the very beginning. That's the conceptual understanding of it. But then you move immediately into what does God want you to do? What powers does he have? What powers do you have? What are the things you're allowed to do? What are the things you're not allowed to do?
00:26:24.960 So, I mean, that's just a product of logic, but it's also like pretty spelled out in every one of the three religions that derive from Abraham.
00:26:34.800 So, what's your view of that?
00:26:36.540 Like, are there things that God won't allow us to do?
00:26:39.120 The way I think about this is there's sort of three different moral philosophies somebody could adopt.
00:26:44.840 There's one, this idea of consequentialism, which is basically the end justifies the means, which you see a lot of in today's culture.
00:26:50.720 I hadn't noticed that.
00:26:51.660 Yeah, unfortunately, even in Silicon Valley, which we can talk about.
00:26:54.260 Even in Silicon Valley?
00:26:55.480 Yeah, especially.
00:26:56.000 Did you just say that?
00:26:57.120 Especially.
00:26:58.240 Especially.
00:26:58.860 Even in Silicon Valley.
00:27:00.180 Especially in Silicon Valley.
00:27:01.760 Then there's a...
00:27:02.720 Sam Altman may even be doing it.
00:27:05.440 I mean, yeah, we can talk about that.
00:27:08.240 And the thing is,
00:27:09.480 when people realize or not
00:27:10.680 there are more philosophies,
00:27:11.760 they end up succumbing to one anyways
00:27:12.980 where they recognize it.
00:27:13.940 Of course.
00:27:14.080 There's consequentialism.
00:27:15.060 Everybody's religious.
00:27:16.140 Yes, and then...
00:27:16.800 Yeah, exactly.
00:27:17.540 And then there's this concept of deontology,
00:27:19.320 which is sort of like maybe, you know,
00:27:20.520 the end does not justify the means
00:27:21.940 and there's rules, right?
00:27:22.940 Murder is bad, lying is bad, and, you know, it's kind of, no matter what the specific
00:27:26.780 circumstances are, these things are wrong, right?
00:27:28.660 There's that moral philosophy, you can adopt deontology, which can be secular or non-secular,
00:27:32.400 is my understanding of it.
00:27:34.180 Then there's virtue ethics.
00:27:35.380 Not really.
00:27:36.020 Okay.
00:27:36.660 If they're rules, why are they rules rather than preferences?
00:27:41.160 If you came up with them, they're preferences.
00:27:43.440 If the power that created the universe came up with them, then they're rules, they're
00:27:47.900 laws.
00:27:48.880 So one has no meaning at all.
00:27:50.440 Yeah.
00:27:50.820 Nothing can be better than anything else.
00:27:52.560 And the other is absolute.
00:27:54.380 So like, no, there can't be a secular, sorry, Aristotle, a secular understanding of absolute value.
00:28:03.780 I think there cannot be a secular understanding of divine virtue.
00:28:07.220 We can get more into this, what I mean there.
00:28:09.980 But let me just outline this quickly and then I think I'll bring it around.
00:28:12.140 So there's consequentialism, which is most people I think in contemporary society adopt.
00:28:15.380 There's deontology, right?
00:28:16.680 Which is, as you noted, rooted in some sort of, maybe there's some universal, this is good, this is bad.
00:28:21.080 Um, then there's, you know, virtue ethics, right? Which basically, instead of saying, oh,
00:28:25.640 this action is good because the consequence was good,
00:28:30.200 or this action is good or wrong because the action is inherently good or wrong, because of some secular or
00:28:35.060 non-secular set of rules, you're saying, hey, the actual thing that you need to measure and
00:28:39.060 you need to think about is the moral character of the person doing the action. And then,
00:28:43.080 if they possess these kind of cardinal virtues, things like temperance and
00:28:47.220 justice and wisdom, for example, then it so follows that the action they do would be virtuous,
00:28:54.540 right? So you try to cultivate the soul basically, and then in cultivating the soul and cultivating
00:28:58.220 virtue confers basically virtue in the action, right? So basically the first two, in my view,
00:29:02.740 in my view, deontology and consequentialism is very much about the action, right? It's saying,
00:29:06.920 hey, is this outcome good based off some thing you're trying to maximize? And then deontology,
00:29:12.460 which is this concept of, forget about if the outcome is good or not, is this the right or
00:29:15.380 wrong thing. Then the concept of virtue ethics, which is instead of saying, you know, looking at
00:29:19.340 the action, right? Because ultimately human beings produce action. Actions, you know, aren't just
00:29:22.880 there. Human beings produce action. The quality of the action should be measured or it's deemed
00:29:27.240 virtuous if the person can strive and embody virtue, okay? And so personally, and I'm still,
00:29:34.700 by the way, talking about natural virtue right now. I'm not even talking about divine virtue.
00:29:37.020 I'm talking about in the intellectual plane, things that people can think about and reason
00:29:40.080 or argue over, things of the mind, not things that go beyond the mind, right? And so in the
00:29:44.500 context of virtue ethics, I think this is the moral philosophy we try to embody in saying,
00:29:49.980 hey, and this comes back all the way to embryonic selection, which is, hey, there is no biological
00:29:54.440 best. There is none, right? Again, the soul, which is non-physical, ultimately does not rest,
00:30:03.340 it cannot be programmed in biology. So people can have different preferences. Somebody could say,
00:30:07.300 you know, I want my son or daughter to be a lawyer. Someone else could say, you know,
00:30:10.360 athlete. Someone else could say an entrepreneur. Someone else could say an artist. These are
00:30:13.700 different outcomes that are based off people's local preferences, physical preferences, contextual
00:30:18.800 preferences. But they're smaller, right? They're smaller preferences. They're not a divine, uh,
00:30:23.780 preference. There's no such thing as that. Yeah, well, of course, I disagree that there's no divine
00:30:29.060 preference. But... There's no divine preference in biology, because the divine isn't rooted in the...
00:30:34.240 It's not, it's not... Um, well, it depends where you think biology came from, I guess. I guess that's
00:30:38.520 true. I mean, I also don't create life. No, no. So this is actually a paradox that I struggle with
00:30:46.640 too, because another thing that I think a lot about is something called panpsychism, which is
00:30:50.660 this idea that basically each object has its consciousness, even, like, a rock, right? Um, and this
00:30:56.780 might sound strange to people. It doesn't sound strange. It doesn't sound strange, okay? I don't think
00:31:00.120 you're fully off base. I don't know the answer. I don't know. Yeah, I don't know either. Crazy thing.
00:31:03.400 So this idea that, you know, rock has a consciousness, it's a being, albeit, you know, not as sophisticated as human consciousness, but it's there. And it provides this idea that consciousness is this kind of spectrum, all the way up to, let's say, humans.
00:31:18.940 and then each thing
00:31:21.320 has this consciousness
00:31:22.120 and accordingly
00:31:23.400 it's kind of made
00:31:24.620 and it's endowed
00:31:26.500 with something
00:31:27.120 that goes beyond
00:31:27.860 just kind of
00:31:28.660 its weight
00:31:29.180 or matter basically.
00:31:30.540 It's basically
00:31:30.820 very non-empiricist,
00:31:31.660 non-materialist
00:31:33.580 and it basically
00:31:35.320 believes this idea
00:31:36.020 that again
00:31:36.480 God has given
00:31:37.120 this consciousness
00:31:37.660 to everything.
00:31:39.060 And I tend to
00:31:39.920 I actually like that a lot
00:31:41.060 I actually like that a lot
00:31:42.200 for a lot of reasons.
00:31:42.740 Okay so can I just ask you
00:31:43.540 a couple fundamental questions?
00:31:44.720 Sure please.
00:31:45.480 So you just said
00:31:46.320 I think you said
00:31:47.080 um, that people cannot create life? I think nature has a greater intelligence
00:31:55.020 than human beings. Sometimes people will say we are part of nature, but we are nature.
00:32:00.480 But life, so you're in the life business, right? I mean, obviously you're-
00:32:04.980 We, what IVF does, for example, is they, they use natural laws. We didn't make these natural laws,
00:32:10.600 right? We, we use natural laws that exist. And then we, and then basically, and to be clear,
00:32:15.720 we're not an IVF clinic.
00:32:16.720 We work in IVF clinics
00:32:17.780 and IVF clinics are the ones
00:32:19.000 that are doing IVF.
00:32:19.640 We provide more information.
00:32:21.120 But in the context of IVF,
00:32:22.860 you are using natural law.
00:32:25.280 You are not making natural law.
00:32:27.080 You're not,
00:32:27.500 you can't make a baby.
00:32:28.760 This is, you know,
00:32:29.440 and I think there's a good chance
00:32:30.500 you may be violating natural law,
00:32:31.780 but I, you know, I don't know.
00:32:32.800 I'm not in charge,
00:32:33.680 but I just,
00:32:34.260 I want to get to the fundamental
00:32:35.700 question though,
00:32:36.740 which is who creates life?
00:32:41.680 I think I would say God,
00:32:44.760 but to be clear
00:32:46.480 so this is
00:32:48.380 this is complicated
00:32:49.080 but
00:32:49.500 you're not the only one
00:32:50.840 who doesn't
00:32:51.540 who isn't certain
00:32:52.340 I mean I
00:32:53.140 I don't know
00:32:53.680 obviously I don't know
00:32:54.800 but I
00:32:55.640 and I don't mean
00:32:57.020 to put you on the spot
00:32:57.700 who creates life
00:32:58.420 yeah I mean
00:32:59.140 come on
00:32:59.720 I shouldn't be even
00:33:01.300 asking questions like this
00:33:02.200 and expecting you to have
00:33:03.200 some cogent answer
00:33:03.900 because I don't think
00:33:04.560 anyone does
00:33:05.020 no
00:33:05.220 other than to say God
00:33:06.400 or to say more precisely
00:33:08.180 not us
00:33:08.920 not us
00:33:09.540 is that fair to say not us
00:33:11.220 yeah that is
00:33:11.420 that is fair to say not us
00:33:12.540 and we operate within that plan
00:33:13.500 and to be clear
00:33:14.180 the stories of sci-fi, right?
00:33:15.840 Like Frankenstein, for example,
00:33:17.400 or even Jurassic Park,
00:33:18.200 some example,
00:33:18.640 but Frankenstein,
00:33:19.300 this idea that we can make life, right?
00:33:20.460 We cannot make life.
00:33:21.660 That's the lesson of these stories.
00:33:24.080 Let me just say,
00:33:24.580 I think you've thought a lot more
00:33:25.600 about this than your average businessman.
00:33:27.700 So I'm, I was gonna,
00:33:30.160 I don't know how I was gonna handle this,
00:33:31.580 but you're a lot more thoughtful
00:33:33.320 than I expected for a young entrepreneur.
00:33:35.560 So thank you.
00:33:36.500 Thank you, Tucker.
00:33:37.020 No, I mean that totally sincerely.
00:33:38.780 You've actually thought a lot about this.
00:33:40.140 And I don't know the answers
00:33:41.480 to any of these questions really,
00:33:43.260 but giving my best shot. Um... So, but we both agree that some higher being created life. We know that
00:33:50.580 we didn't. We... So we could assign it to nature, we could assign it to God, but we don't
00:33:54.320 create life. We don't create life. We operate within nature, right? Amen. For decades, Russell Brand was
00:33:59.860 one of the most famous actors and comedians and agnostics in the world. Today, he is one of the
00:34:07.920 most sincere Christians we know, a follower of Christ. His personal transformation is remarkable.
00:34:15.620 We saw it up close. He has now recounted it in an amazing book called How to Become a Christian
00:34:21.000 in Seven Days, and it recounts what happened to him, and it makes the case to all of us for
00:34:26.820 stepping away from our secular assumptions and returning to the only thing that matters,
00:34:30.800 which is God. I've read it. It's amazing. And right now, there's only one place to get it,
00:34:34.240 tuckercarlsonbooks.com. This is the first release from our new publishing company. We created
00:34:39.640 Tucker Carlson Books to bypass the censors and bring you things that are actually worth reading
00:34:44.420 and sharing. And we're starting this venture with what matters most, and that's Russell Brand's
00:34:51.460 message of the promise of forgiveness and joy through Jesus. We're proud to launch our new
00:34:55.960 bookstore with Russell Brand's How to Become a Christian in Seven Days. It is the message this
00:35:00.600 country needs most. Find us today on Tucker Carlson Books. Some say the bubbles in an
00:35:04.760 Aero Truffle piece can take 34 seconds to melt in your mouth. Sometimes the very amount of time you're
00:35:09.820 stuck at the same red light. Rich, creamy, chocolatey Aero Truffle. Feel the Aero
00:35:15.800 bubbles melt. It's mind-bubbling. Do we have the right to take life? So, so this is, so, so, so
00:35:27.700 No, we don't. Um, now, if we talk about embryo, because I assume this was your... I'm not sure.
00:35:36.580 I mean, it has all kinds of implications, including for the Iran war, but I'm just...
00:35:40.240 It's all around us, the thoughtlessness with which we take life. It's not aimed at you, it's aimed
00:35:46.560 at everybody, everybody on the globe. But it begins with the question: do we have the right to take
00:35:51.000 life? So again, let's think about the different moral values that someone could have here. If
00:35:54.660 someone has consequentialism, they could say, hey, look, we want to, you know, commit murder
00:35:58.460 for this good. And maybe they have some good that they deem to be good. I'm highly familiar with
00:36:01.340 justifications for murder. I just want to know what you think. I'll tell you what I think,
00:36:05.560 but I just tell you that there's this kind of, it's like very pluralistic. And then somebody
00:36:08.080 could say murder is always bad, which is fine. I respect that opinion. Absolutely. And then there's
00:36:12.260 sort of this, this last bucket, which again, I'm going to keep coming back to this idea of virtue
00:36:15.040 ethics, which is what do you, like, how do you, do you, can you have a cultivation of the spirit
00:36:19.080 of the soul to think, hey, you know, what, what is right in this situation? Because society does
00:36:23.260 not have a definitive answer to this question, right? People will sometimes say, knee-jerk,
00:36:26.600 they'll say, oh, murder's always bad, but then they'll be pro the death penalty, right? Or they
00:36:30.200 pro war. People are inconsistent. There's no doubt about it. And they ignore their own
00:36:35.820 failings and highlight those of others. They've got planks in their eyes and they're picking
00:36:40.280 the speck instead of yours, famously. So I get it. People are flawed. But I do think that we can,
00:36:45.780 through a little bit of rigor, arrive at what's right or wrong. And what can we say about the
00:36:51.460 right of a person to take another person's life? Well, I don't... I personally, I don't think
00:36:56.820 there is a right. I personally don't think there's a right in any circumstance. I don't see that. I
00:37:00.960 don't see that. I mean, and of course there's a question, like, what is, you know... I don't think
00:37:06.240 there's a right, period. I just don't think so. Um... Well, I'm with you. I'm with you. Now, I think
00:37:11.280 we both understand it's hard not to want to exercise that right when you can, or someone
00:37:17.600 annoys you, or there's a country you don't like, or there's a... Okay. So then, what can we say
00:37:24.060 about an embryo in a lab? Yeah, yeah. Is that life? So going back to the panpsychic philosophy, right,
00:37:30.340 which is this idea... No, no, no. I'll give
00:37:34.760 you a proper answer, but these things are not simple. I can't be like, oh, yes.
00:37:37.480 Like, let's just bear with me for a second. There is a spectrum of consciousness. There's a
00:37:41.680 spectrum from, uh, you know, rocks, to a sentient being, all the way to a more conscious
00:37:45.940 you know, being like a human, a more complicated, evolved, fully conscious being. And the question
00:37:50.140 is, where does an embryo sit in that? That is the fundamental question. And does an embryo have a
00:37:53.580 soul, for example? That is the key question. That is the key question in my view. I totally agree.
00:37:58.840 That is the key. Like, let's just like make no mistake. Anytime somebody argues about an embryo
00:38:03.800 and IVF, and to be clear, I just want to be very clear on the purposes of our business.
00:38:06.560 We do not do IVF. We work within IVF clinics. I understand.
00:38:08.880 Right. I just want to be very clear to everyone listening.
00:38:10.240 You're just at the intersection of like every big trend.
00:38:13.020 No, we have a huge responsibility.
00:38:14.160 Right. Yeah.
00:38:14.660 And so I think it's important to, before we can even argue, oh, is embryo life? It's like, well, where does the life come from, right? Is it the physical thing, right? For me, I think about when I think about death, I think death is a doorway. That's my own personal belief. This is a vessel, right? You're not the physical. We're not the physical. We're something else. We're metaphysical. We're soul, okay?
00:38:34.780 And so then the fundamental question, um, is, okay, well, um, does an embryo have a soul? And
00:38:40.660 then I think about it... I always like to think about things, uh, inductively. So I just
00:38:43.660 don't want to think about the embryo alone, but I think about, you know, there's a huge diversity in a range of
00:38:47.060 life. And I can, in my head at least, and again, this is the feelings of the intellect, which I think
00:38:51.980 can only do so much, okay. But when I think about it, I think, okay, I think about a rock, which I think
00:38:57.000 has some kind of maybe proto-consciousness, some, like, very, very limited consciousness that we don't
00:39:01.740 understand. Maybe through some psychic or meditative work, you could try to, you know, become a rock
00:39:06.100 and try to understand its, like, more subjective experience, if it exists, right? All the way to
00:39:10.420 an embryo, to a dog, to a human. And so because of this spectrum, it comes down to this question
00:39:15.380 of at what point basically do we have this, is there a soul in an embryo? And I tend to think,
00:39:22.440 and I don't know obviously, but I tend to think, I tend to think that an embryo doesn't have a soul
00:39:29.980 now
00:39:31.120 why do you think that
00:39:32.420 well I don't know
00:39:33.920 I don't know
00:39:35.080 but why would you think
00:39:37.240 I would think that
00:39:37.980 there's a couple reasons
00:39:40.160 why
00:39:41.340 which is an embryo
00:39:42.260 so I can take a more
00:39:43.460 reductionist approach
00:39:44.120 and I could say
00:39:44.540 an embryo is principally
00:39:45.520 a cell
00:39:46.600 and when you
00:39:47.280 reproduce already
00:39:48.420 embryos actually
00:39:49.120 it's not just one cell
00:39:49.580 yeah
00:39:50.000 it divides
00:39:51.300 exactly
00:39:51.620 it divides and becomes many cells
00:39:52.580 but principally at first
00:39:53.240 it begins
00:39:53.620 just this one cell
00:39:54.780 I thought it was
00:39:55.440 it was the sperm and the egg
00:39:56.840 made the embryo
00:39:57.680 yeah
00:39:57.960 oh so by definition
00:39:59.360 yeah it's a cell
00:39:59.900 yeah sperm meets egg it's a cell and then it starts dividing um and it becomes more and more
00:40:05.580 of a uh eventually into a human um sorry i was gonna say something i just lost my train of
00:40:10.540 thought um so the question was you said you you tend to think that an embryo does not have a soul
00:40:17.580 and i asked why would you assume that yeah yeah i was articulating why um so when you when you
00:40:24.600 look at the way that, when you look at the way that actually people conceive naturally,
00:40:31.340 what ends up happening is that you have these formations of kind of small formations of
00:40:37.660 an embryo, okay, right, which is this, an egg meets the cell, and then it travels down
00:40:41.920 and tries to implant, and then many times actually naturally, it doesn't implant successfully.
00:40:46.580 So nature already has it such that, forget IVF, in natural conception, it is the
00:40:50.660 case that basically you have these embryo formations and they end up not implanting. And now the way I
00:40:58.680 see it is I see that nature wouldn't make it such that or God wouldn't make it such that an embryo
00:41:04.860 would have a soul if in natural procreation, it is the case that the embryos come and go. Because
00:41:10.160 I don't think God, in my personal belief, I don't think God would basically be getting rid of souls.
00:41:15.500 I just don't think so. Now, do I think that there's a fundamental beauty, not just, I mean,
00:41:21.060 absolutely to an embryo in that, and this is really important for me to say, because I don't
00:41:25.320 know how else to say it. I do think it is similar to like a wave that forms and then again, returns
00:41:29.840 to the ocean because everything returns to the ocean. So I don't see it as something that's like,
00:41:33.480 oh, the embryo is being discarded. I see it as returning back to the source, even if I don't
00:41:37.600 believe that it has an explicit soul. Does that make sense? So it's a little more of a nuanced
00:41:42.200 argument. It does make a kind of sense. Right. Yeah. It does make a kind of sense. I don't think
00:41:47.120 it's insane. And again, I think it's, I think you've thought about this in a way that I'm very
00:41:51.860 impressed by, even if I don't agree. And I just wish more people in your business would like
00:41:56.640 think about this because that, you know. It's important. Yeah. Right. It is. It's very
00:42:01.880 important. It may be the most important thing. It is. So I guess the difference between a wave
00:42:09.200 and IVF is the human choice involved in the latter.
00:42:16.040 And so I guess the core problem that I have with this
00:42:20.080 is that I'm not convinced that we have a right
00:42:23.320 to make certain choices.
00:42:25.380 Do people have the right to make any choice available to them?
00:42:28.660 I think people don't have the right.
00:42:31.640 In our culture, people will conflate greater performance
00:42:35.580 with being morally better, which I think is a big problem.
00:42:38.660 So there's two kinds of value. There's instrumental value and there's moral value. Instrumental value is contingent. And this is actually really important. All of biology, all of nature is contingent value. For example, you would maybe want an entrepreneur potentially to be more risk-seeking, but you wouldn't want your surgeon to be more risk-seeking, right?
00:43:01.380 In other words, the value of phenotypes
00:43:02.960 actually changes depending on the environment.
00:43:04.460 And this is obvious to say,
00:43:05.800 but it's actually, I think people miss this sometimes
00:43:07.300 because they think there's a universal best.
00:43:08.900 They'll say, hey, if you optimize for X phenotype
00:43:11.120 that I deem to be best, it will lead to a better person.
00:43:13.640 It doesn't lead to a better person.
00:43:14.900 It might lead to a more optimized outcome,
00:43:16.780 but it doesn't lead to a better person.
00:43:18.460 Dude, you're destroying your own case.
00:43:19.760 No, I'm not, though.
00:43:20.760 Yes, you are.
00:43:21.340 What you're saying is right.
00:43:23.120 No, no, no.
00:43:23.520 You're telling the truth about the way people are,
00:43:26.100 which is lacking foresight
00:43:27.840 and understanding of the holistic picture.
00:43:30.520 So if people have the choice to choose their own children,
00:43:34.100 we're going to have a nation of private equity people.
00:43:37.460 No, I'm serious.
00:43:38.360 They're going to optimize for what's good right now.
00:43:40.520 Yes, that is.
00:43:41.240 Okay, so this is actually interesting.
00:43:43.400 A couple of things.
00:43:44.080 Oh, wow.
00:43:44.460 You know I'm right.
00:43:45.100 Oh, wow.
00:43:45.540 No, no, no.
00:43:46.220 You just explained it better than I could.
00:43:47.320 Tucker, Tucker, this is so interesting
00:43:48.780 because you're making an assumption.
00:43:49.940 So there's many parts of this.
00:43:50.620 About the way people are?
00:43:51.500 Yes, I am.
00:43:52.020 There's many parts of this.
00:43:53.260 The first part is,
00:43:54.360 will people basically all choose in the same direction?
00:43:56.360 And interestingly, again,
00:43:57.680 people actually want very different things.
00:43:58.700 And we see that every day with patients, right?
00:44:00.840 Which is like, there's this idea that like rich people will come in and be like, oh,
00:44:03.440 every rich person is going to pick the same way.
00:44:04.940 As you mentioned, sex is actually a great proxy for this, right?
00:44:07.540 Sex selection in the United States is about 50-50.
00:44:10.100 And so if you think about, you know, any possible phenotype, like even when somebody comes and
00:44:14.440 says, I want to optimize for type 2 diabetes risk, someone else might want to do schizophrenia
00:44:18.320 or Alzheimer's, depending on their family history.
00:44:20.780 Somebody else might want to do height, for example, if they're both shorter parents,
00:44:23.780 they might want to have a taller kid.
00:44:24.800 To be clear, the traits always come after diseases, but nevertheless.
00:44:27.320 so what i'm saying is that there's this notion there's this idea of a universal best
00:44:32.620 biologic characteristic it doesn't exist it doesn't exist no no we're arguing two different
00:44:36.780 things i'm not saying i agree with you completely and i believe that the diversity baked into
00:44:42.420 humanity comes from god he created different tribes okay he did that on purpose yeah that's
00:44:49.160 my belief and they're different from each other by definition they're different tribes and they
00:44:53.280 have different characteristics and a lot of those as you have been brave enough to admit are genetic
00:44:57.140 and that's a fruit of the creation.
00:45:00.000 Yes.
00:45:00.220 God did that.
00:45:00.840 We didn't.
00:45:02.140 People are very different.
00:45:04.500 They demand uniformity.
00:45:05.640 And by the way,
00:45:06.440 if you think we're going to get diverse outcomes,
00:45:09.160 have you been around rich people?
00:45:10.700 They're not only very similar,
00:45:13.160 they dress the same,
00:45:14.300 they have exactly the same attitudes,
00:45:15.860 they want their kids to get into the same six schools.
00:45:18.000 I've lived in this world my whole life.
00:45:19.400 It's the opposite of what you're describing.
00:45:21.260 They will all choose the same.
00:45:22.560 Rich is the same thing.
00:45:23.660 Rich people make up a very, very small set of society.
00:45:25.980 There's a big world out there.
00:45:26.920 What set of IVF patients do they make up?
00:45:29.180 What percentage? Rich people?
00:45:30.780 About all of them.
00:45:31.840 I wouldn't say it's about all of them.
00:45:33.240 There are a lot of people that take...
00:45:34.240 People who are dialed in to this technology.
00:45:36.980 People do IVF generally,
00:45:39.160 almost always, because they can't conceive naturally,
00:45:40.900 to be clear.
00:45:41.620 What does IVF cost?
00:45:44.380 It can cost quite a bit.
00:45:46.000 I know.
00:45:47.300 I'm not attacking anyone.
00:45:48.580 I know, Tucker, but this is important to say,
00:45:50.640 which is people will conceive naturally first.
00:45:52.280 Natural conception is free, to be clear.
00:45:54.080 um but yeah that's what it costs let's assume let's actually play this out because
00:45:59.420 actually it's really really interesting and i actually think you do touch on a fundamental
00:46:02.660 uh point on the way that people tend to move together especially especially wealthy people
00:46:08.000 they tend to do the same thing they tend to... it's every group i don't mean to pick
00:46:11.720 on rich people at all i'm one of them but i just am very familiar with them and yeah but but social
00:46:18.020 societies are governed by herd instincts that's why it's a society and not just a collection of
00:46:23.740 hermits so i think there's there's a couple ways that i think about this there's the kind of on
00:46:29.300 the ground what i'm seeing which i can tell you about what i'm seeing and then i can tell you
00:46:31.820 about the more, we can talk about like more broadly how this plays out, where the fact that people are
00:46:35.680 pretty mimetic in what they pick okay on the ground what i'm seeing is i see couples again
00:46:40.680 a diverse range of couples to be clear like like this technology is going to get cheaper and cheaper
00:46:47.600 whole genome sequencing specifically this is actually interesting um the cost of reading
00:46:51.500 all of somebody's DNA, it used to be about a billion dollars, one billion, right? So the
00:46:55.320 Human Genome Project in the early 2000s, it cost a billion dollars. When I started the business
00:46:59.520 about six years ago in 2020, it was about a thousand dollars, right? So a billion dollars
00:47:06.200 to a thousand dollars, that's the kind of wonder of making things cheaper and making things more
00:47:09.520 accessible. So I do think there's a point where this technology, anyone can actually access.
00:47:13.440 That's like really important to say. And that's one of my missions is to say, hey, this shouldn't
00:47:16.960 only belong to people who have means, it should belong to everybody, right? Because ultimately,
00:47:20.680 every parent should have the right to reduce the suffering in their future child.
00:47:24.540 I mean, I just think every parent should have that right.
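A quick back-of-the-envelope check on the cost trajectory quoted above, using the speaker's own figures: roughly $1 billion for the Human Genome Project in the early 2000s and roughly $1,000 per genome around 2020. The 2001 start year is an assumption made only for the arithmetic; a minimal Python sketch:

```python
# Back-of-the-envelope check of the sequencing-cost figures quoted in
# the conversation. The dollar amounts are the speaker's; the 2001
# start year is an assumption for the arithmetic.
import math

cost_2001 = 1_000_000_000   # dollars, Human Genome Project era
cost_2020 = 1_000           # dollars, whole-genome sequencing circa 2020
years = 2020 - 2001

fold_drop = cost_2001 / cost_2020          # 1,000,000x cheaper
annual_factor = fold_drop ** (1 / years)   # average per-year reduction
halving_years = math.log(2) / math.log(annual_factor)

print(f"{fold_drop:,.0f}x cheaper over {years} years")
print(f"~{annual_factor:.1f}x cheaper per year on average")
print(f"cost halves roughly every {halving_years:.2f} years")
```

At that average rate the price halves roughly every year, which is the sense in which "anyone can actually access it" becomes plausible if the trend continues.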
00:47:26.540 I would never argue against the desire to reduce suffering, I guess.
00:47:32.560 But then you have to ask yourself, if the reduction of suffering is the most virtuous
00:47:38.960 thing you could do, why are the societies on this planet with the least suffering falling
00:47:44.780 apart the quickest?
00:47:46.000 Have you ever noticed that?
00:47:47.660 Well, I think in more contemporary society,
00:47:51.360 we've lost the concept of virtue generally, in my view.
00:47:54.700 But is there a connection between suffering and virtue?
00:47:58.280 And of course there is.
00:47:59.700 It's a one-to-one.
00:48:01.080 And there is no virtue without suffering, actually.
00:48:03.820 And suffering is...
00:48:04.840 So in other words, if you had a drug that could eliminate anxiety,
00:48:07.920 just take a pill, no more anxiety.
00:48:10.100 You could call it, I don't know, pick a name, benzodiazepines.
00:48:12.540 and all of a sudden you could just like eliminate this suffering and would there be downsides to
00:48:19.280 that oh there would be mass overdose deaths there would be the zombification of the entire
00:48:24.680 population there would be addiction physical addiction that you could die because of which
00:48:30.960 so i guess what i'm saying is i'm not making a case for anxiety which is horrible anyone who's
00:48:35.380 ever had it knows how horrible and terrifying it is i'm only saying that maybe there's a purpose
00:48:40.460 to suffering. We don't want to deal with it. None of us does. I certainly don't.
00:48:44.280 We can't transcend suffering in the same way we can't. Maybe we shouldn't.
00:48:48.140 But we can't. It's like saying, let's transcend gravity. We're in this world,
00:48:52.740 we're in this natural plane. We're trying to transcend suffering. And all I'm saying is
00:48:55.980 societies, I'm not for suffering. I'm against suffering. I hate war.
00:49:00.780 I don't like suffering at all. And I think we should try to alleviate it.
00:49:04.000 All I'm saying is maybe these aren't decisions that
00:49:07.780 are up to us and maybe there's a larger picture that we can't see and maybe we should pay close
00:49:15.520 attention to our successful attempts to eliminate suffering and assess the fruits like what happened
00:49:21.360 did it work or did it cause even more exquisite suffering more grotesque suffering i think that's
00:49:27.840 a very fair point in the context of you know there's a great example of obviously opioids
00:49:33.680 People get addicted.
00:49:34.600 They think they're getting rid of pain.
00:49:36.260 What are opioids exactly?
00:49:37.080 Yeah, in getting rid of pain,
00:49:38.400 you're actually creating more suffering.
00:49:39.480 And that's a fair point.
00:49:41.140 I think in the context of genetics,
00:49:43.120 what we're doing is,
00:49:44.580 it's actually interesting
00:49:45.260 because it's non-invasive.
00:49:47.940 Genetic, the optimization technology
00:49:50.100 costs a couple thousand dollars,
00:49:51.460 which is a lot, right?
00:49:52.480 Which is a lot.
00:49:53.040 But it's going to keep coming down.
00:49:54.020 It's going to come down.
00:49:55.200 And so suddenly now,
00:49:56.420 at the very beginning,
00:49:58.200 you have these embryos.
00:49:59.340 Eventually, you're already doing IVF.
00:50:00.640 You're already picking an embryo.
00:50:01.580 You get more information.
00:50:02.820 You can pick an embryo with a 50% reduced risk of breast cancer.
00:50:06.300 You can have an embryo without BRCA, which is a breast cancer marker.
00:50:10.160 You can, you know, schizophrenia, debilitating condition, really impacts families.
00:50:14.300 Horrible.
00:50:14.620 Horrible.
00:50:15.360 Horrible.
00:50:16.080 The worst.
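For the "50% reduction" figure quoted just above, it may help to see relative versus absolute risk side by side. The ~12% baseline lifetime breast cancer risk below is an approximate, commonly cited population figure, used purely for illustration and not a claim about any particular screening product:

```python
# Quick arithmetic on the "50% reduction" framing: a relative risk
# reduction reads differently in absolute terms. The ~12% baseline
# lifetime risk is an approximate population figure, assumed here
# only for illustration.
baseline_risk = 0.12         # approximate lifetime risk, general population
relative_reduction = 0.50    # the figure quoted in the conversation

selected_risk = baseline_risk * (1 - relative_reduction)
print(f"baseline: {baseline_risk:.0%}, selected embryo: {selected_risk:.0%}")
print(f"absolute reduction: {baseline_risk - selected_risk:.0%}")  # ~6 points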
00:50:16.440 And in fact, these are the very people who wouldn't want to have a child, who wouldn't want to.
00:50:22.560 But now because of the advent of more advanced screening, they are more comfortable having a child.
00:50:26.900 And that actually, I think, gets lost too.
00:50:28.980 I'm with you.
00:50:29.800 Pro-genetic technology is fundamentally anti-eugenic.
00:50:32.860 It's actually pro-genetic technology or pro-natalist in that way because the very people who would have been deemed unfit by some definition, right, because they have more suffering.
00:50:41.000 And to be clear, if you suffer more, you have no less moral worth, to be very clear.
00:50:45.220 We've said that already.
00:50:46.020 We've established that.
00:50:46.640 You and I agree on that.
00:50:48.100 But those are the very people that genetics is helping.
00:50:51.460 That's the very people they're helping.
00:50:52.800 The very people who would have been deemed unfit by the 20th century.
00:50:56.640 Now, through this technology,
00:50:57.900 they're actually able to have a child through IVF.
00:50:59.840 They're able to have a child
00:51:00.500 and feel comfortable doing that.
00:51:02.540 Also, there's been, you know-
00:51:03.480 Wait, no, I can't,
00:51:05.300 I'm not criticizing anything you're saying.
00:51:07.860 It's just that I'm a stickler for definitions
00:51:09.520 because it's important.
00:51:10.660 Sure, sure. 0.98
00:51:11.000 This is eugenics and it's, 0.75
00:51:14.220 I mean, if you read the early eugenicists, 0.94
00:51:16.520 some of whom were really smart.
00:51:18.360 I have.
00:51:18.540 Really smart.
00:51:19.160 Eugenics was an international movement, actually.
00:51:21.260 It spanned many, many things to your point.
00:51:22.960 Oh, I'm very aware.
00:51:23.660 and it was thoroughly discredited by the Nazis 0.88
00:51:26.500 who were the most enthusiastic eugenicists of all.
00:51:28.620 I mean, they cleared out the mental hospitals 1.00
00:51:30.260 and they cleared out the disabled. 1.00
00:51:32.120 But this is important though. 0.96
00:51:33.020 In that way, it's actually anti-eugenic 0.54
00:51:34.320 because the very people that like the Nazis,
00:51:36.240 for example, would target, right? 1.00
00:51:37.440 People who are sick, and kill and murder them. 0.97
00:51:40.520 That's kind of been forgotten to history. 1.00
00:51:42.580 Horrible.
00:51:43.080 But those very people are now
00:51:44.280 that can actually access this technology.
00:51:46.300 It's actually interesting.
00:51:47.080 Hold on, hold on, hold on. 0.95
00:51:47.860 So the point, I don't want to bring the Nazis in 0.54
00:51:50.980 because it's so emotionally fraught 0.62
00:51:52.860 and they had all kinds of other sins.
00:51:55.160 But the goal of the eugenicists was the same.
00:52:00.660 It was, let's reduce human suffering.
00:52:02.340 Let's optimize human ability.
00:52:04.780 Let's make this better by being thoughtful
00:52:07.920 about how we reproduce.
00:52:10.340 And let's bring whatever science we have,
00:52:12.700 they had much less than we have,
00:52:14.220 to bear on this question.
00:52:15.580 And they did make the argument
00:52:18.100 that Lothrop Stoddard,
00:52:20.220 who was a Harvard professor
00:52:21.360 and a brilliant, legit, brilliant guy, historian.
00:52:24.720 A lot about him was absolutely virtuous, I would say.
00:52:28.680 But he was also a wild-eyed eugenicist
00:52:32.540 because he was smart and he saw all this human suffering.
00:52:34.620 He's like, let's get rid of it.
00:52:36.040 It's nothing against people with Down syndrome, 0.68
00:52:37.620 but we don't want more of them. 1.00
00:52:39.740 That was his argument
00:52:41.580 because it will reduce human suffering.
00:52:42.980 Fewer kids with Down syndrome, less suffering.
00:52:44.680 Well, it's a moral failure
00:52:45.700 because the eugenicist, in my view,
00:52:47.920 misconstrued the idea of, again,
00:52:49.920 this idea of virtue with biology.
00:52:51.980 There is no virtue
00:52:53.760 in biological characteristics.
00:52:55.020 He wasn't making that case.
00:52:57.000 He was making the case
00:52:57.980 and the smart ones were.
00:52:59.380 Tucker, please.
00:53:00.600 Less suffering.
00:53:01.340 That's what they were saying.
00:53:02.100 Less suffering.
00:53:02.720 But less suffering
00:53:03.420 isn't more virtuous.
00:53:06.300 And that's,
00:53:06.640 it's hard for people to like,
00:53:07.480 what does he mean by that?
00:53:08.940 You know.
00:53:09.260 Well, I agree.
00:53:10.740 Just because, I mean.
00:53:12.320 I believe in a religion
00:53:13.220 with suffering at the center of it.
00:53:14.640 We've all had loved ones
00:53:16.500 that have passed away,
00:53:17.840 God forbid,
00:53:18.320 from some disease, right?
00:53:19.300 I mentioned my cousin, my grandmothers both died of cancer as well.
00:53:23.660 My uncle died of a heart attack, right?
00:53:25.700 When he was playing soccer with my dad, he was 45, he collapsed and he died from a heart
00:53:32.660 attack, which by the way is the number one killer in this country.
00:53:38.100 Just because somebody, you know, had cancer, just because somebody has heart disease, just
00:53:42.600 because somebody has a condition, schizophrenia, Alzheimer's, these conditions, again, they
00:53:45.840 impact 200 million Americans.
00:53:46.840 So this is the problem of our time, okay?
00:53:49.300 does not make them any less of a person.
00:53:51.740 And so the fundamental moral failure, 0.79
00:53:54.640 it was a moral failure of eugenics,
00:53:56.420 which was misconstruing these things, 0.89
00:53:58.180 this idea that it's better to reduce suffering.
00:54:00.620 Better, that plain term of better
00:54:02.140 doesn't come from the physical plane.
00:54:03.620 It comes from something beyond.
00:54:04.940 But I'm not even sure that we're disagreeing.
00:54:07.400 I think we're agreeing that there's no,
00:54:09.780 that your physical condition
00:54:11.100 is not a reflection of your moral value.
00:54:13.620 No, but by the way,
00:54:14.560 the eugenicists got that fundamentally wrong.
00:54:16.180 Why?
00:54:16.860 Maybe I'm sure some did, but-
00:54:18.160 They were consequentialists though.
00:54:19.300 That's actually important.
00:54:20.260 Going back to the kind of different moral philosophies, if you look through the world that way, it actually helps articulate things.
00:54:24.580 They viewed it as the end justifies the means.
00:54:27.000 We should actually do these forced sterilizations. 0.97
00:54:30.160 We should make it constitutional.
00:54:31.020 I think the ends justify the means was a much less common argument among the eugenicists than it is now among the technologists.
00:54:38.960 That's true.
00:54:39.380 That's for sure.
00:54:39.780 That's very true.
00:54:40.640 And so these attitudes not only have not been suppressed or eliminated, they've flowered into like the dominant attitude in the country.
00:54:47.920 So they won.
00:54:49.820 I'm just saying, I'm not trying to,
00:54:51.920 I'm just saying this idea that you can make people better
00:54:55.720 and in fact that you should.
00:54:57.000 No, no, but that's not what we're saying though.
00:54:58.480 Remember, no, Tucker, Tucker, this is nuanced,
00:55:00.880 but it's really important for people to understand.
00:55:02.140 You're saying people have the opportunity to do it.
00:55:04.260 But people have the opportunity.
00:55:05.140 Nucleus, we never say, hey, these are your five embryos.
00:55:07.600 This is the best embryo.
00:55:08.520 We cannot, we are not divine.
00:55:10.040 We can never do that.
00:55:11.280 I understand.
00:55:11.700 But the choices that people make
00:55:13.380 are governed by a lot of things, of course.
00:55:16.920 but one of the, you know, their intuition, their religious views.
00:55:20.260 To be clear, first and foremost, it's the direct experience of suffering.
00:55:24.160 The patients that come to us without fail, and to be clear,
00:55:27.720 they might want to optimize for a trait as well.
00:55:29.360 I'm not saying, of course they would, right?
00:55:30.540 People think about these things realistically,
00:55:31.740 but the first thing they care about is my mother had breast cancer,
00:55:35.140 you know, my dad had prostate cancer, my grandfather had Alzheimer's.
00:55:38.220 So I just think-
00:55:38.760 My sister had schizophrenia.
00:55:39.440 I get it.
00:55:39.840 Right, and yeah, right.
00:55:40.560 So you want to start with the lived experience of the patient
00:55:42.800 and then go from there.
00:55:44.120 But that's all baked in the cake.
00:55:44.680 every person has experienced suffering and every person has seen a loved one die if you live long
00:55:48.300 enough. And I just want to be totally clear so I don't seem self-righteous, which I never want to
00:55:53.520 be. If I had had the opportunity when my children were in utero or before to say no to schizophrenia,
00:55:59.840 no to the things that I really fear, schizophrenia is at the top of the list. I think it's the
00:56:04.940 cruelest thing. But also CF, which is in my family, all these things. By the way, I'm a
00:56:11.200 carrier for cystic fibrosis.
00:56:12.300 Yeah, a lot of people are.
00:56:13.360 Yeah, a lot of people are, yeah.
00:56:14.620 And I don't want my baby,
00:56:16.000 God forbid, to have that.
00:56:16.800 Of course not.
00:56:17.460 No, though, actually,
00:56:18.720 the therapies for CF have,
00:56:20.600 you know, that's a whole
00:56:21.240 separate conversation.
00:56:21.860 I don't want to be boring.
00:56:22.600 But anyway, I would just say,
00:56:24.400 like all expectant parents,
00:56:26.120 if I'd had a chance to reduce
00:56:28.180 or eliminate the risk
00:56:29.020 that my children would have
00:56:30.200 these horrible diseases or conditions,
00:56:33.080 I would have taken it.
00:56:34.260 Absolutely.
00:56:34.500 How could you not?
00:56:35.380 Absolutely.
00:56:35.840 So I'm not judging anybody.
00:56:37.480 I get it completely.
00:56:38.740 I would have done it.
00:56:39.400 But my question is, honestly, what's the effect of giving people this choice, which is, in their minds, to improve? You say you're morally neutral on it, not attaching a value to deafness or hearing, but we're not.
00:56:56.820 Okay.
00:56:57.820 But people do.
00:56:58.940 Everybody does.
00:56:59.720 Everyone other than you does.
00:57:00.780 Everyone other than you does.
00:57:01.820 No, no, no.
00:57:02.320 But to be clear, we can have more philosophy and then say, but most people will reject the idea that there's this idea of conflating reduced suffering.
00:57:08.980 and they would say that's better.
00:57:10.320 Of course.
00:57:10.580 And then we can play that out.
00:57:11.580 So let's play that out.
00:57:12.700 Let's play out how it actually is.
00:57:13.940 So you tell me what you imagine
00:57:16.160 because this is one of the biggest changes
00:57:18.840 in human history.
00:57:20.720 I will say, Tucker, I will say again
00:57:22.080 that people will make different choices.
00:57:23.900 I really want to say that.
00:57:24.680 There's actually two parts to this argument.
00:57:26.440 You're dodging, though.
00:57:27.040 No, no, I'm not.
00:57:28.080 Some people will make different choices.
00:57:32.740 It's a random distribution of choices?
00:57:34.580 Is that what you're saying?
00:57:35.320 I'm not saying that.
00:57:36.380 I'm not saying that.
00:57:37.280 Okay.
00:57:37.500 Um, what I am saying though is people will bring their, so when we think about this,
00:57:41.680 like to make it like more intuitive for people is if you think about like our, there's this
00:57:47.980 concept in cell and molecular biology.
00:57:49.760 Okay.
00:57:50.260 It's called, um, it's basically this concept called, um, uh, it's eluding me basically
00:57:56.800 that the more specialized something is, the more effective it is.
00:58:00.460 So in biology, you see things specialize all the time, right?
00:58:03.280 So for example, things begin stem cells, they become neurons, they become immune cells,
00:58:06.680 they become different parts of the body
00:58:08.640 because these bodies have
00:58:10.240 different functions and so you need different
00:58:12.580 specializations
00:58:13.280 and when you actually I'm a big believer
00:58:16.640 that like everything mirrors everything
00:58:18.400 from the molecular to the celestial
00:58:20.440 everything
00:58:21.200 and so let me keep going with this
00:58:23.940 and so
00:58:24.520 I remember what it is
00:58:28.480 specialization breeds sophistication
00:58:30.860 okay that's true
00:58:32.460 in cell and molecular biology which is specialization
00:58:34.800 breeds sophistication the more specialized something
00:58:36.440 is, the more sophisticated it is. And so in a society, if you look at people who are really
00:58:41.080 high in their craft, like Alyssa Liu figure skating versus like an Einstein versus like an
00:58:46.640 Elon versus like, I don't know, like an artist like Da Vinci, these people have very different
00:58:51.020 sets of characteristics. And the way nature works is human beings cannot defy nature. It's a seesaw.
00:58:58.940 So let me give an example. Every single time, people always say this to me, they say, oh,
00:59:03.580 people pick for IQ. Let me put aside my moral argument. Let me put aside my people won't
00:59:08.200 actually always pick for IQ, but let's actually assume that's the case. Let's assume that's the
00:59:11.740 case. Let's assume that's the case. Everyone will pick for IQ. One interesting thing about picking
00:59:15.520 for IQ genetically is that when you pick for IQ, and this is interesting because when you tell
00:59:20.540 patients this, you can see how they refactor the decisions. When you pick for IQ, you're actually
00:59:24.460 picking against conscientiousness and extroversion genetically. It's a seesaw, right? It's almost
00:59:29.080 like if you're playing like a FIFA My Player or something and you make somebody stronger,
00:59:32.960 they have less agility, right?
00:59:34.480 So what happens is,
00:59:35.880 and also you're making them,
00:59:37.040 genetically speaking,
00:59:37.980 more likely to be autistic.
00:59:39.700 So these things are genetic. 0.68
00:59:40.660 You can't defy these things, right?
00:59:44.180 So these things go in opposite directions.
00:59:45.800 So you start selecting for one,
00:59:46.960 it actually takes these things away.
00:59:48.280 So it starts becoming more of a value judgment.
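A toy way to see the "seesaw" the guest describes: if two polygenic scores are negatively correlated across sibling embryos, selecting the embryo that maximizes one score lowers the other on average. The correlation (-0.3), batch size (5 embryos), and trial count below are illustrative assumptions, not real genetic parameters:

```python
# Toy illustration of the trait "seesaw" described above: selecting
# the embryo that maximizes trait A drags down a negatively
# correlated trait B on average. All parameters are illustrative.
import math
import random
import statistics

random.seed(0)
RHO = -0.3      # assumed genetic correlation between trait A and trait B
EMBRYOS = 5     # embryos per IVF batch (illustrative)
TRIALS = 20_000

picked_b = []
for _ in range(TRIALS):
    batch = []
    for _ in range(EMBRYOS):
        a = random.gauss(0, 1)
        # b shares variance with a in proportion to RHO
        b = RHO * a + math.sqrt(1 - RHO**2) * random.gauss(0, 1)
        batch.append((a, b))
    best = max(batch, key=lambda ab: ab[0])  # select top embryo on trait A
    picked_b.append(best[1])

# Prints a clearly negative mean: maximizing A systematically costs B.
print(f"mean trait-B score of selected embryos: {statistics.mean(picked_b):+.3f}")
```

With these assumptions the selected embryos average roughly -0.35 standard deviations on the second trait, which is the "value judgment" point: the choice is a trade, not a free upgrade.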
00:59:50.580 I understand.
00:59:51.040 So wait, let me play this out.
00:59:52.100 So let's assume that, to your point,
00:59:54.800 there's a fashion of the day, right?
00:59:56.700 People are, you know,
00:59:58.780 we've seen this with fashion,
00:59:59.760 we see this in tech,
01:00:00.520 we see this, you know,
01:00:01.260 VC investors,
01:00:01.840 they all allocate toward AI. People will end up wearing the same thing in Soho and New York.
01:00:06.180 How is this possible? People will go to the same private schools. You were saying this, right?
01:00:09.260 All these things end up kind of the taste follow through. So let's assume all the rich people
01:00:12.600 basically start optimizing for IQ or everyone actually start optimizing for IQ, not just rich
01:00:17.160 people. Everyone starts optimizing for IQ. There's actually an evolutionary mechanism. It's called a
01:00:21.000 frequency dependent selection. What is frequency dependent selection? What it basically means is
01:00:25.620 that the rarer a phenotype becomes relative to the other phenotypes. So in this case, for example,
01:00:31.940 if everyone picked for IQ, extroversion and conscientiousness start decreasing, okay, in terms
01:00:37.340 of the prevalence of the population, the more valuable that phenotype becomes. In other words,
01:00:42.380 the rarer that extroversion and conscientiousness become, the more valuable they actually become
01:00:46.680 to actually flourish in a population. So you're arguing it's a self-correcting problem.
01:00:50.080 And that's the key point, which is we think as humans, we can defy nature.
01:00:59.620 We cannot defy nature.
01:01:01.140 We have to operate within nature's bounds, within evolution's bounds.
01:01:04.000 We have to operate within this framework.
01:01:06.220 So if that were true, then why did India ban sex-selective abortions?
01:01:10.940 It's interesting because India specifically was about, so let's actually walk through this.
01:01:15.280 india was about 55 45 uh males to females 55 45 right um people actually think it was higher
01:01:23.160 and by the way the natural rate of having a boy is actually slightly biologically higher than a
01:01:27.560 girl so people think it's actually 50 50 it's actually not it's actually like 52 48 so actually
01:01:32.240 to that perspective it's actually it is statistically significant but it's actually
01:01:35.220 not insanely high and on that point also which is actually interesting over a billion and a half
01:01:38.980 people it's yeah it can absolutely over generations but actually it's not i think
01:01:43.900 what's interesting here is, this is just kind of a factoid, but male babies, they tend to
01:01:50.460 actually have a higher risk of basically dying in infancy. So it ends up happening, if you look
01:01:54.720 at the general population, it's about 50-50, but actually biology has it that it slightly errs
01:01:58.540 toward males. But let's take the sex example. Let's say it plays out that, you know, over many
01:02:04.360 generations, people, let's say it wasn't outlawed or people still practice it anyways, and people
01:02:07.920 start picking across sex. It's actually the same phenomena, whereas the number of males, for example,
01:02:12.760 come down, the number of females come down, because of frequency-based selection, let's say
01:02:17.020 you're in a population, just very simply, there's 70 males, 30 females, the value of a female in that
01:02:21.480 population is much higher. And basically, you can model this and show that each successive
01:02:25.700 generation, there are certain sets of genetics that confer a slightly higher probability then
01:02:30.500 of having a female. And so, that will actually propagate such that the genes that confer a higher
01:02:35.040 chance of females would keep proliferating until the population comes back to equilibrium.
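A minimal sketch of the frequency-dependent selection argument being made here (Fisher's principle): because every child needs one mother and one father, the rarer sex reproduces more per capita, so a heritable daughter-biasing tendency spreads when females are scarce, pulling the sex ratio back toward 50/50. The two "types" (30% versus 70% chance of a daughter), the starting skew, and the population size are all illustrative assumptions:

```python
# Toy model of frequency-dependent selection on sex ratio (Fisher's
# principle): the rarer sex has higher per-capita mating success, so
# a heritable daughter-bias spreads until the ratio nears 50/50.
import random

random.seed(1)
POP = 20_000
GENERATIONS = 61

def make_person(p_daughter: float) -> tuple[str, float]:
    sex = "F" if random.random() < p_daughter else "M"
    return (sex, p_daughter)

# Start with 95% son-biased individuals: a heavily skewed population.
pop = [make_person(0.3 if random.random() < 0.95 else 0.7) for _ in range(POP)]

for gen in range(GENERATIONS):
    moms = [t for s, t in pop if s == "F"]
    dads = [t for s, t in pop if s == "M"]
    if gen % 10 == 0:
        print(f"gen {gen:2d}: {len(moms) / POP:.1%} female")
    children = []
    for _ in range(POP):
        # Uniform draws over each sex pool, so members of the rarer sex
        # are drawn more often per capita. The child inherits the
        # daughter-bias of one parent at random.
        t = random.choice([random.choice(moms), random.choice(dads)])
        children.append(make_person(t))
    pop = children
```

Run it and the female share climbs from about 32% back toward 50%; at equilibrium the two types coexist in roughly equal numbers, so the population ratio sits near 50/50 even though no individual is unbiased. This illustrates the long-run "self-correcting" claim without speaking to the short-term social harms Tucker raises next.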
01:02:39.540 So, why did they ban it? 1.00
01:02:40.500 Well, obviously, that's like a longer-term evolutionary thing, to say that things would self-correct.
01:02:44.960 So it actually wasn't self-correcting, and it was making the society unstable.
01:02:49.100 I mean, if human choice on questions of life and death and procreation at this granular level is self-correcting, and it's just inherently good and there are no downsides, then why did the biggest country in the world ban it?
01:03:00.300 To be clear, I'm not saying that there's not short-term material consequence for something like sex selection.
01:03:07.500 Of course, there's especially sex selection. I'm not saying that.
01:03:09.760 Why is that more significant than any other kind of selection?
01:03:12.540 Sorry?
01:03:13.220 Why is that unique?
01:03:15.600 Sex selection?
01:03:16.300 It's not, actually.
01:03:17.180 Well, it's unique in that...
01:03:18.080 Over IQ.
01:03:19.020 I mean, these are deep characteristics.
01:03:22.840 Defining characteristics.
01:03:24.060 It's actually an interesting point you make on sex because if you look at sex,
01:03:28.120 it's a way of kind of playing out what happens when people pick across traits.
01:03:31.700 Because sex is not a disease.
01:03:34.060 It's a choice.
01:03:35.200 Depending on what you want, people make different choices.
01:03:37.340 So it's actually a good kind of heuristic of how people will choose.
01:03:40.520 And on that point, actually, interestingly, sometimes we receive criticism from, for example, the American Society of Reproductive Medicine for saying that traits are not reproductive medicine.
01:03:49.940 However, sex is ultimately a trait that people have been picking for the last 20 years.
01:03:54.100 So there's a bit of this hypocrisy in medicine as well.
01:03:56.300 I guess what I'm trying to get to is really the core question, which is, is there a downside to playing God?
01:04:01.620 Okay, first off, we're not playing God.
01:04:02.940 Well, of course we are.
01:04:04.000 We're making choices that were not available to us until very recently that have never in human history been made by people ever.
01:04:10.260 Not one time.
01:04:10.920 We cannot play God.
01:04:12.840 God created us.
01:04:15.200 God created everything here.
01:04:16.120 We cannot play God.
01:04:18.240 Let me be more precise and use a less charged way to describe it.
01:04:22.520 We are doing things that have never been done in human history.
01:04:25.060 That's actually not true, I would argue, in this case.
01:04:26.980 Well, it's very true.
01:04:29.760 How long have test tube babies, IVF, been around?
01:04:32.000 Yeah, IVF's been around since the 1970s, so it's about 40 years, actually.
01:04:35.960 And by the way, it's not like you look around, you're like, oh, that's an IVF baby.
01:04:38.820 I'm not attacking IVF.
01:04:40.340 Yeah.
01:04:40.500 I'm certainly not attacking IVF babies or people at all.
01:04:45.120 I'm merely saying that in the scope of human history, this is brand new.
01:04:50.760 When you say this, though, what do you mean?
01:04:52.320 The ability to choose the traits of your children with this level of precision,
01:04:58.020 to get a certain number of embryos and say,
01:05:00.080 I want the ones that don't have these conditions,
01:05:02.600 that do have these traits.
01:05:04.480 That has never been tried in human history, period.
01:05:07.860 I would...
01:05:08.400 Well, there's no debating that.
01:05:09.960 I would caveat a little bit.
01:05:11.580 When you...
01:05:11.860 So remember, you're picking from...
01:05:13.220 Did the Sumerians do this?
01:05:13.960 Wait, let me just be clear. 0.81
01:05:14.980 You're picking from the pool that...
01:05:17.480 So when you pick your partner, for example,
01:05:19.160 you're setting the possible genetic pool.
01:05:21.000 So for example, two short parents...
01:05:22.780 This is what mating is.
01:05:23.760 Yeah, two short parents are not going to have a tall baby, right? 0.51
01:05:25.920 The same is actually true for genetic optimization.
01:05:27.340 You can't have two short parents have a tall child with this technology.
01:05:30.320 You can't have a taller child.
01:05:31.800 I understand, but the core point is this is something, this is an acceleration.
01:05:39.040 Look, people want this.
01:05:40.640 I wouldn't debate you there.
01:05:42.400 And people do calculate these things as they choose a mate.
01:05:45.680 Of course. 1.00
01:05:46.300 He's too dumb. 1.00
01:05:47.060 I can't marry him. 1.00
01:05:47.820 He's too short.
01:05:48.480 I can't marry him.
01:05:49.640 He's from, you know, whatever.
01:05:51.260 There are lots of genetic qualities that people don't want to pass on.
01:05:53.640 In doing that, they're actually picking, by the way, the most important set of outcomes for their child.
01:05:57.660 Because it's your partner.
01:05:58.540 It's the other side of it.
01:05:59.440 Absolutely.
01:05:59.940 Yeah.
01:06:00.440 But never with this level of precision.
01:06:03.560 Never has there been a menu where you can say, where you can identify qualities that you can't identify by smell or sight.
01:06:11.000 You can't know so much of what you've just described except through brand new science.
01:06:17.940 So I'm not even attacking that.
01:06:19.960 I'm merely asking a question that has to be asked,
01:06:22.860 which is, what are the downsides?
01:06:26.620 So, I mean, we talked about the,
01:06:29.480 I mean, you pointed out one of the downsides,
01:06:30.740 which is like, okay,
01:06:31.440 if everyone starts picking for a specific sex,
01:06:33.940 for example, right,
01:06:35.380 it can create population problems.
01:06:39.160 And even if I would argue,
01:06:40.380 and I did argue,
01:06:41.720 hey, over time, this actually is self-corrected,
01:06:43.680 which I think is true and valid.
01:06:44.720 Have you told the audience that?
01:06:46.220 So this will be self-corrected, right?
01:06:48.220 But obviously in the short term,
01:06:49.420 there's still like an acute problem, right?
01:06:51.740 But I would say actually IVF has been operating
01:06:53.440 for, again, for 40 years.
01:06:56.380 And other policies, like for example,
01:06:58.800 China's one child policy,
01:07:00.200 has led to much greater problems. 0.99
01:07:01.720 IVF is still the way
01:07:02.740 2% of babies are born.
01:07:04.420 I think your principal concern
01:07:06.440 on where this can go awry,
01:07:09.080 I mean, there's a long history
01:07:11.380 in science fiction of people thinking,
01:07:15.020 oh, you know, oh, like, you know,
01:07:17.520 I can, you know, Frankenstein,
01:07:19.260 I mentioned Frankenstein.
01:07:20.260 It's literally that.
01:07:20.820 It's somebody saying, hey, I could make life, right?
01:07:23.980 And then-
01:07:24.160 How about COVID?
01:07:25.920 Jurassic Park, actually, too, is this idea that, hey, I can do this.
01:07:28.920 And then there's negative, unforeseen consequences.
01:07:31.440 I would argue both of those were consequentialists.
01:07:32.940 I don't think that's science fiction.
01:07:34.400 I mean, hey, let's create Lyme disease.
01:07:36.880 Hey, let's create, I don't know, let's strengthen this virus.
01:07:42.200 Oh, gosh, it's out of the lab.
01:07:43.480 Intentionally or not, it doesn't matter.
01:07:44.540 You infect the world with COVID.
01:07:45.640 That just happened five years ago.
01:07:47.320 So it's like we don't need to look far to see the unintended consequences of emerging science.
01:07:53.220 I'm not blaming anyone for it.
01:07:54.900 I think people have a terrible track record for seeing the consequences of their actions.
01:07:59.280 We know that in our own sex lives, don't we?
01:08:01.500 So I think we can just say it's important with something this powerful and potentially transformative to, A, admit that there will be unintended consequences because that's 100% true always.
01:08:14.740 and think through B, what those consequences might be.
01:08:17.540 That's all I'm saying.
01:08:18.240 I agree.
01:08:18.640 I think we should be tangible with them though
01:08:20.160 and make sure people actually understand.
01:08:21.540 So like, again, IVF is the way 2% of babies are born.
01:08:25.160 IVF has been operating in the United States
01:08:27.080 for about 40 years.
01:08:29.020 This is not like-
01:08:30.320 40 years?
01:08:31.700 It's 1970s.
01:08:33.040 Oh, I was there, I remember.
01:08:33.980 Yeah, yeah.
01:08:34.500 The test tube babies on the cover of Time magazine.
01:08:36.200 It was, yeah.
01:08:37.060 I mean, people don't call it that anymore, actually.
01:08:37.880 Are there any consequences to that?
01:08:40.100 To IVF?
01:08:41.060 Yeah, have we studied the consequences?
01:08:42.280 Yeah, they've actually tracked children.
01:08:44.820 The study sizes are a little bit smaller, from when I looked into it, than one might expect.
01:08:48.960 But basically, they see no material difference, no.
01:08:51.080 Is it true?
01:08:52.760 That what?
01:08:53.420 The size is smaller than I'd expect?
01:08:54.620 There's no measurable difference at all between children born from an IVF procedure and children conceived naturally.
01:09:00.780 Obviously, there's some environmental things you're taking averages.
01:09:03.620 But yeah, when I looked into this, and I've obviously talked to a lot of scientists about this as well,
01:09:07.800 they said, yeah, there's no difference, yeah, which is pretty amazing.
01:09:10.620 amazing. But actually I think it's a testament to nature. Well, we can track it over the course of
01:09:14.560 the decades. Well, this isn't nature, of course. It's something that we are, well, it's by definition
01:09:19.640 not nature. It's something that people are doing in order to improve nature. Like nature would be
01:09:24.680 infertility. I'm against infertility, by the way. I'm not arguing for infertility, but I'm just
01:09:28.000 saying it's whatever it is, it's not nature. It's the opposite of nature. I think we are operating
01:09:31.580 within nature. So let's go into the framework of God created these natural laws. We're using
01:09:36.900 natural laws. We're not making life. We didn't go to a lab and make life. We're using the principles
01:09:42.180 of nature, using the principles of heredity, and we're applying them. It's still beautiful. It's
01:09:47.040 still very beautiful. I'm not saying- So I think we are using nature. I'm not saying it's bad or
01:09:51.400 not beautiful. I'm just saying it's not nature any more than nuclear weapons are nature. You
01:09:55.500 can say, well, they're made from atoms, the essential building block of matter. Okay. But
01:10:00.700 But we're exerting force and our will on nature to create an outcome that wouldn't occur if we didn't do that.
01:10:08.280 So it's by definition not nature.
01:10:09.880 The outcome could have actually occurred even if you didn't necessarily do it.
01:10:12.280 It could have just the baby could have happened that way.
01:10:14.680 But also I would say that remember that there's gene editing, which is much further out.
01:10:19.380 It's the idea that you can actually take an embryo and make it whatever you want, basically.
01:10:23.240 Theoretically, we can talk about that, which is very, very different.
01:10:25.880 So I think the concept of IVF clinics using this technology to give patients more information,
01:10:30.700 When they're already getting information on their embryos, now we expand the information, we can help deal with the chronic disease crisis in the United States, the rare disease crisis as well, right?
01:10:38.900 Genetics is unique.
01:10:39.700 I've seen, oh no, I appreciate the upside.
01:10:41.880 No, I agree with you on the upside.
01:10:43.320 I just want to know the downside.
01:10:44.440 Yeah.
01:10:44.880 And I don't, I don't see, hear any, there's no downside.
01:10:47.480 Of course, of course there's downside.
01:10:48.940 What do you imagine it might be?
01:10:50.340 Well, I think let's, let's play this out.
01:10:52.380 Okay.
01:10:53.480 The first thing I'd say is that with IVF at its prevalence today at 2%, I think it's, it's actually more or less fine.
01:11:00.160 and 2% is about 1 in 50 babies.
01:11:02.200 I think I'm going to outline the scenario
01:11:03.760 where I think there's a lot more risk
01:11:05.420 and where human reproduction is going to materially change.
01:11:07.980 We might argue that,
01:11:08.740 you might argue this is a material change.
01:11:10.820 I would argue IVF was the principal material change.
01:11:14.580 You're arguing that it's a material change
01:11:16.140 because you're saying that we're going to have
01:11:18.080 less chronic disease, lower healthcare costs,
01:11:20.660 less suffering, and that's all good.
01:11:22.120 Patients can choose that.
01:11:23.700 You've argued that would be the result,
01:11:25.620 and you're right.
01:11:26.380 It will be the result, and I'm for it.
01:11:28.020 I just want to say I'm for it.
01:11:28.980 I'm just saying that whenever I hear the upside,
01:11:33.200 as you would in any scenario,
01:11:34.820 including your personal family investments,
01:11:36.500 like, tell me the downside.
01:11:37.580 If someone says, well, there's no downside,
01:11:40.240 then I'm like, I don't know if I trust you anymore.
01:11:43.220 So what's the downside?
01:11:45.140 Again, I will articulate downside.
01:11:46.680 It's just, I have to explain.
01:11:48.060 No, you're going to blame some other technology.
01:11:49.660 No, I'm not going to blame some other technology.
01:11:51.440 You're saying gene editing's bad.
01:11:52.420 No.
01:11:52.640 But what about the technology that you're offering
01:11:55.540 has an upside?
01:11:57.100 I totally agree with you.
01:11:58.060 and that will be real
01:12:00.440 and I'll support it
01:12:01.700 I would support
01:12:03.760 I don't know
01:12:04.900 a lot of things
01:12:06.380 but what's the downside
01:12:08.580 like you must have thought about that
01:12:09.880 of course
01:12:10.300 I mean
01:12:11.080 fundamentally
01:12:14.340 this technology
01:12:18.480 can be exploited
01:12:19.860 by centralized bodies
01:12:21.240 to try to control reproduction
01:12:22.880 that is the downside
01:12:24.580 that is the story of the 20th century
01:12:26.320 sorry for getting emphatic
01:12:27.000 but it's just like yes
01:12:27.800 That is the downside.
01:12:28.660 We've seen the downside.
01:12:29.600 We've experienced the downside.
01:12:31.220 But to be clear, that is a moral failure.
01:12:34.860 That is not a failure of the technology.
01:12:36.480 I've established that eugenics, for example, was decades before genetics.
01:12:40.620 Yeah, it's a distinction without a difference in my view.
01:12:43.060 But what you're saying is, without saying it explicitly, that people misuse the creation.
01:12:51.360 And they use it for good, but they also use it for bad.
01:12:53.480 And that's just how people are.
01:12:54.800 And they've always been that way.
01:12:56.080 And they will always be that way.
01:12:57.800 So, with that in mind, I don't think it's just, I totally agree that, of course, centralized powers, whoever they are.
01:13:05.560 Yeah, well, yeah, yeah.
01:13:07.300 I'm not even sure who they are, but they clearly exist.
01:13:09.540 Governments, principally.
01:13:10.680 I mean, that's the 20th century.
01:13:11.740 The Epstein class that runs the governments or whoever these entities are, they, yeah, that's bad. 0.90
01:13:18.820 I totally agree.
01:13:19.700 But the experience of India shows us that given choice, people will also make the wrong decisions as individuals.
01:13:30.780 So I'm just wondering what those consequences might be.
01:13:34.420 Let me just say, I'm interested in this because I have hunting dogs and I've had them my whole life.
01:13:38.620 And hunting dogs are bred for certain qualities.
01:13:41.880 And I watch it carefully and dogs have such short life cycles relative to people that you can kind of in your lifetime watch this happen.
01:13:48.080 But they're bred for certain, I have flushing dogs, spaniels, and they're bred to work close to you, find the bird, jump the bird, retrieve the bird.
01:13:58.880 If you are not very careful about breeding them, or if you breed them only for certain specific qualities, you can wind up destroying the dog.
01:14:06.820 And this is well-known in animal husbandry, it's well-known in bird hunting, it's well-known among anybody who deals with animals.
01:14:12.760 and i don't see people as any different and i know that there are massive consequences to the
01:14:19.040 dog you get dogs that die of cancer at five you get dogs with hip dysplasia you get dogs with
01:14:23.060 unexplained rage that bite your children like we can't foresee with any precision the effects of
01:14:29.600 our tinkering with with reproduction absolutely let me actually give a real example of this so
01:14:33.840 in in china um the scientist who was known for using gene editing to uh engineer the first
01:14:40.160 babies, actually, Dr. He.
01:14:42.980 What he did was he
01:14:44.100 engineered the CCR5
01:14:45.860 gene. I believe that's what the gene was called.
01:14:48.200 And he used CRISPR. CRISPR
01:14:49.760 is a bacterial immune response
01:14:51.960 system. It stands for, you know, clustered regularly
01:14:54.040 interspaced short palindromic repeats.
01:14:56.020 Basically refers to the
01:14:57.380 set of palindromic DNA
01:14:59.760 sequences in a bacteria.
01:15:01.580 And he used that to make a gene editing device
01:15:03.660 called CRISPR. And he
01:15:05.880 basically used CRISPR. Oh, I remember very well.
01:15:08.780 And CRISPR
01:15:09.680 is composed of two things
01:15:10.960 it's composed of like a guide
01:15:12.440 like basically
01:15:13.260 imagine it takes the
01:15:14.340 device to the right part
01:15:15.580 of the DNA
01:15:15.920 which is like a scissors
01:15:17.020 and then it
01:15:18.060 excuse me
01:15:18.500 it has a guide
01:15:19.320 which takes the
01:15:20.240 CRISPR to the right part
01:15:22.040 of the DNA
01:15:22.240 and then it has an endonuclease
01:15:23.060 which basically cuts the DNA
01:15:24.080 a little bit of
01:15:26.220 technical explanation
01:15:26.720 basically you can use
01:15:28.140 a bacterial immune response system
01:15:30.160 harness it as a gene editing device
01:15:31.660 okay
01:15:32.100 and this is what the scientist did
01:15:33.760 and I'm obviously
01:15:35.660 you know about the story
01:15:36.400 and he went and he actually
01:15:37.360 engineered human embryos
01:15:38.300 okay
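A toy model of the two-part mechanism just described: a guide that locates the matching 20-nucleotide protospacer sitting next to an "NGG" PAM, and an endonuclease ("the scissors") that cuts about three base pairs upstream of the PAM. The sequences below are invented for illustration; real guide design is far more involved:

```python
# Toy model of the two-part Cas9 mechanism described above: a guide
# sequence locates a matching 20-nt protospacer next to an "NGG" PAM,
# and the endonuclease cuts ~3 bp upstream of the PAM. Sequences are
# invented for illustration.

def find_cut_site(dna: str, guide: str) -> int | None:
    """Return the index where Cas9 would cut, or None if no target."""
    assert len(guide) == 20, "Cas9 guides are ~20 nt"
    for i in range(len(dna) - len(guide) - 2):
        protospacer = dna[i : i + 20]
        pam = dna[i + 20 : i + 23]
        if protospacer == guide and pam[1:] == "GG":  # matches NGG
            return i + 17  # blunt cut 3 bp upstream of the PAM
    return None

def cut(dna: str, guide: str) -> tuple[str, str] | None:
    """Simulate the double-strand break as two fragments."""
    site = find_cut_site(dna, guide)
    return None if site is None else (dna[:site], dna[site:])

# Invented 20-nt target flanked by an AGG PAM:
guide = "GACGTTACCGGATCAGTCAA"
dna = "TTTT" + guide + "AGG" + "CCCC"
print(cut(dna, guide))  # -> ('TTTTGACGTTACCGGATCAGT', 'CAAAGGCCCC')
```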
01:15:39.780 It's going on now.
01:15:41.020 In China?
01:15:42.160 Oh, in other parts of the world too.
01:15:43.980 So basically what he did was he knocked out the CCR5 gene.
01:15:47.480 And his justification for knocking out this specific gene was that it would make the children basically resistant to HIV, AIDS.
01:15:57.740 That was what he said.
01:15:59.560 This is really interesting for a lot of reasons.
01:16:02.180 One is because you didn't need gene editing to do that.
01:16:04.600 You could have actually just done that with existing genetic technology.
01:16:07.700 that was much cheaper, much less expensive.
01:16:09.700 But even putting that aside,
01:16:10.880 getting to the fundamental thing that you're articulating,
01:16:12.700 which is the unintended consequences.
01:16:14.380 When you actually optimize for knocking out that specific gene,
01:16:18.860 you're also opening up the susceptibility of that baby
01:16:22.480 to other infectious disease.
01:16:24.320 Because what CCR5 does is it encodes for a specific immune receptor
01:16:28.280 that basically when destroyed,
01:16:30.140 it makes it easier for other pathogens to basically infect you.
01:16:34.260 In other words, there's this, the dangerous side of this, to your point, is that balance, which is in trying to do something good, what he deemed to be virtuous, if you will, it actually potentially could have had very severe consequences on the children's health.
01:16:48.740 And so I think that's a very real, tangible example that we've seen of some of the dangers and the balancing act that is nature.
01:16:56.920 And that's really important to say.
01:16:58.020 What about in your life, have you ever wound up with something that you didn't expect and maybe didn't want and found it to be a great blessing over time?
01:17:06.240 Yeah, absolutely.
01:17:07.540 I mean, meditation.
01:17:10.180 No, but that's something you presumably chose to try.
01:17:14.360 Well, that's kind of false, too.
01:17:14.720 I think, you know, sometimes you, you know, a broader force guides you to these things.
01:17:19.260 Yeah.
01:17:19.340 You know, the experience of having children is the most profound example of that.
01:17:23.300 I think if you ask any parent, um, or most parents, many parents will tell you, like,
01:17:27.840 I didn't expect this at all.
01:17:29.140 Yeah.
01:17:30.020 Um, I, I didn't grow up with girls, didn't have a mom, didn't have sisters, didn't want 0.99
01:17:34.160 girls. 1.00
01:17:34.660 I don't understand girls, like my wife, but didn't want girls. 1.00
01:17:37.080 Ended up having a ton of girls. 0.99
01:17:39.160 Never would have chosen that.
01:17:40.460 Yeah.
01:17:41.040 And really one of the great experiences of my life, truly, I mean that.
01:17:46.100 And, um, I'm not embarrassed to say this because my girls know I feel this way, but, uh, and
01:17:50.740 And I, you know, anyway, I never would have, if I'd had the choice, just like, I don't
01:17:55.060 get girls.
01:17:55.640 I can't be the father of girls. 0.79
01:17:56.900 Like what?
01:17:57.460 Yeah.
01:17:58.280 And yet that again turned out to be this great blessing.
01:18:01.400 And I, I'm really glad I didn't have the choice.
01:18:03.620 Have you ever had an experience like that?
01:18:06.880 I mean, yeah, I think some of the best things that happen in life are things that you
01:18:12.560 can't control.
01:18:13.380 It's part of the divine.
01:18:14.680 Yes.
01:18:15.040 Yes, absolutely.
01:18:16.000 Absolutely.
01:18:16.460 A hundred percent.
01:18:16.820 And sometimes there are things that, man, you don't want at all.
01:18:19.580 But it's actually good for you.
01:18:21.320 It's the best for you.
01:18:22.680 It's the best thing for you, yeah.
01:18:23.580 The thing that you want isn't the thing that you need.
01:18:25.980 So maybe if you get to be the author of your own story
01:18:30.100 and of your own children,
01:18:31.720 the more control you have,
01:18:34.140 the more you get what you want,
01:18:36.560 the more totally you're destroyed.
01:18:40.600 I don't know about that.
01:18:41.700 Maybe it's not good for you to get everything you want.
01:18:44.000 That's been my experience.
01:18:45.380 Let's remember, though.
01:18:46.080 But like genetics, obviously, is not deterministic, right?
01:18:51.880 So there's two other parts of life.
01:18:54.480 Wait, what?
01:18:55.040 You were just telling me it was-
01:18:56.120 It's not deterministic.
01:18:56.460 We can get rid of all these diseases, which I'm for.
01:18:59.080 But Tucker, a good example is like lung cancer.
01:19:01.420 You smoke, increase your risk of lung cancer.
01:19:03.120 There's some genetics component, but it can be both, right?
01:19:05.040 Also, this is your enjoyment of life.
01:19:07.220 I just want to put in a good word for smoking, if I could.
01:19:09.620 Yeah, heart disease as well, right?
01:19:11.540 Obviously, there's a family history component to it,
01:19:14.080 but there's also like what you eat,
01:19:15.260 how much you exercise.
01:19:15.860 These things. And so under that framework, you think, okay, like, what I think is really important
01:19:21.860 in life, which again goes well beyond genetics, you know, we're not genetic determinists
01:19:27.000 here, obviously, that's just not reality. Again, I will go back to the spiritual and the cultivation
01:19:32.900 of the soul. That cultivation of the soul toward, eventually, hopefully, divine virtue, union with
01:19:39.360 God, right? That is available to everyone independent of their biological characteristics.
01:19:45.380 And so I think it's important not to, again, conflate
01:19:47.120 optimizing your specific outcome.
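The gene-plus-environment picture the guest sketches here (heart disease or lung cancer as a family-history component plus diet, exercise, smoking, and plain chance) can be made concrete with a toy additive model. A minimal sketch, assuming invented weights and a normally distributed chance term; this is not Nucleus's model or any real clinical risk score:

```python
import random

def toy_risk(genetic_liability, environment, noise_sd=1.0):
    """Toy additive model: risk = genetics + environment + chance.

    The 0.4/0.4 weights and the noise scale are made up for
    illustration; real traits have trait-specific architectures.
    """
    chance = random.gauss(0, noise_sd)  # the stochasticity both speakers keep stressing
    return 0.4 * genetic_liability + 0.4 * environment + chance

# Same genotype, two environments (e.g., heavy smoker vs. never smoked):
genes = 1.0
print(toy_risk(genes, environment=2.0))   # smokes heavily
print(toy_risk(genes, environment=-1.0))  # never smoked
```

Rerunning this gives different outputs for identical inputs, which is the "not deterministic" point in miniature: the genetic term shifts the odds without fixing the outcome.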
01:19:48.000 No, you've made that point and I so appreciate it.
01:19:50.000 But that point is such, that is the point.
01:19:51.760 That is the point.
01:19:52.460 The point is that the union with God ultimately is,
01:19:55.100 that is what life is about.
01:19:56.120 So you're not actually removing,
01:19:58.300 like this idea that like you can,
01:20:00.180 like if there was a world where somehow parents
01:20:02.300 could perfectly predict the baby's going to be like
01:20:05.000 this and this and this,
01:20:05.880 you can't physically,
01:20:07.520 you can't encode the soul is what I'm saying.
01:20:10.880 It doesn't come from biology.
01:20:11.920 We know a lot.
01:20:12.280 So there's always stochasticity, is what I'm arguing.
01:20:14.140 Yeah, I mean, but you're arguing the margin.
01:20:15.700 I mean, what you're saying is right.
01:20:17.020 It's true.
01:20:17.640 There's no debating what you're saying.
01:20:19.040 It's fact.
01:20:20.100 And I appreciate that you're saying it.
01:20:21.580 Yes.
01:20:22.080 But it's equally true that we are exercising powers that we didn't have until very recently
01:20:27.200 and that we know more than we ever have.
01:20:29.960 And I just think, and I don't think we can stop it.
01:20:32.620 I don't think there's any way we can stop it.
01:20:34.440 If you weren't doing this and the gene editors weren't doing it, I mean.
01:20:37.260 I don't like that whole philosophy generally.
01:20:39.260 Maybe you're right.
01:20:40.060 I actually think people, I think people way overshoot that.
01:20:42.600 I can't stop it.
01:20:43.200 People way overshoot the idea that, oh, technology is inevitable.
01:20:46.420 Technology is not inevitable.
01:20:47.720 This drives me crazy.
01:20:49.420 People make choices that drive technology forward.
01:20:51.880 Technology does not just happen.
01:20:53.060 It's been, you know, 20 years, really 15 years probably, since, you know, some of these
01:21:02.620 more advanced screenings have existed, but they've never actually been adopted, right?
01:21:06.020 So the idea that technology naturally progresses, it's a narrative created by Silicon Valley
01:21:10.280 to try to justify raising more money.
01:21:12.020 And by the way, taking away more responsibility.
01:21:14.260 No, people make choices that drive technology forward.
01:21:16.220 I think you're to an extent right.
01:21:18.140 I mean, this is a whole separate conversation.
01:21:19.740 I don't want to bore our remaining viewers with,
01:21:21.900 but I do think we make choices.
01:21:23.940 That's absolutely right.
01:21:24.860 And it's incumbent on us to try to make the right choices
01:21:27.600 for ourselves and those around us.
01:21:29.000 Okay, all true.
01:21:30.360 Those choices matter.
01:21:31.320 Also true.
01:21:31.940 We are also products of the time in which we live
01:21:33.880 and the systems in which we operate.
01:21:35.380 So those things are equally true.
01:21:38.480 Again, I don't want to be boring,
01:21:39.780 but I agree with you.
01:21:41.180 Our choices are important. But there's also, again, a lack of respect for what we don't know,
01:21:49.560 which makes me very uncomfortable in science. And one of the reasons that I think that we should
01:21:54.920 put a lot of doctors and scientists in prison as soon as we can is because they've really hurt us
01:22:00.380 over the last, say, six years by not acknowledging what they don't know, overstating their own
01:22:06.640 foresight about things that no human being can know. Like there's no respect for the limits of
01:22:13.800 the human mind. Okay. And suddenly we have these enormous powers that are not actually matched to
01:22:21.060 our wisdom at all. And I just, I just want to say out loud, I'm really worried about it. And I think
01:22:26.640 certain individuals should be punished for doing this. Like the guys who made COVID in the lab,
01:22:30.940 They're not in jail?
01:22:32.520 Like, what?
01:22:33.900 Does that bother you?
01:22:35.640 Do you think it's a lesson?
01:22:36.580 Does that tell us anything?
01:22:37.740 Yeah, it is definitely a lesson.
01:22:39.680 We have to be responsible stewards of the technology.
01:22:43.380 Should there be punishment for people who, like, kill millions through their foolishness?
01:22:47.460 Yeah, I mean, I think the key is that, like, again, genetics can program for somebody to be smarter, but it cannot make somebody wise.
01:22:58.520 And the idea that you can genetically encode somebody's life, again, that's not true.
01:23:03.400 Like nature, like in the DNA, in the nucleus, that's not true.
01:23:07.940 So I want to be clear that you're not controlling the life outcome of your child.
01:23:12.400 You're not going to be like, okay, now the child's going to become LeBron James and they're
01:23:15.740 going to be a star.
01:23:16.520 That will come from the virtue of hard work, et cetera.
01:23:18.940 So genetics is important.
01:23:20.420 Genetics is important.
01:23:21.120 It plays a factor.
01:23:21.820 It plays a role.
01:23:22.260 But I'm not going to sit here and say, oh, genetics is everything.
01:23:25.320 It's not.
01:23:25.940 It's not.
01:23:26.460 But nobody's making the case that it is.
01:23:27.720 No, but the argument that parents can control their child's life trajectory would suggest that genetics is pretty deterministic, and I'm saying it's not.
01:23:33.480 I'm actually making the opposite argument, which is you have no freaking idea what's going to happen when you tamper with this stuff.
01:23:38.700 We actually know way less than we think we do.
01:23:40.680 We have less control than we imagine, and that we should proceed with that in mind.
01:23:44.720 That's my only argument.
01:23:46.260 But my question is much more specific.
01:23:48.480 You said the technology is not inevitable.
01:23:50.540 I kind of agree with you.
01:23:51.840 It's not inevitable, no.
01:23:52.500 We certainly have an obligation to do our best.
01:23:54.280 Yeah.
01:23:54.480 For the people who didn't do their best and who hurt others, like the whole world, like the guys who designed COVID in the Wuhan lab, which they did, we've established that, shouldn't there be some punishment for them?
01:24:08.180 And wouldn't that help future generations make wiser decisions if they saw that there were consequences to being thoughtless with technology?
01:24:14.880 Well, I think generally speaking, the kind of history, at least the modern history, of Silicon Valley has gone from, I think it had some idea of kind of virtue ethics, right? Like, you know, Google back in the day was don't be evil. If you say that today, you'll kind of be laughed at. That was like their corporate motto.
01:24:37.080 So you had, Paul Graham had his, you know, Hackers and Painters, this idea, that was kind of a beautiful early Silicon Valley spirit.
01:24:46.740 There was another case: Steve Jobs' 2005 Stanford commencement address. 0.97
01:24:55.180 He ended it by saying, stay hungry, stay foolish.
01:24:57.900 Basically, humility, have humility, open yourself up to the world, not just the natural world, but the divine world.
01:25:03.940 I think a lot of the Silicon Valley ideology has moved from sort of hackers and painters to, you know, maybe capitalists and, you know, politicians or the like.
01:25:12.880 In other words, it's moved into kind of a techno capitalism, this idea that technology is inevitable, this idea that capitalism is inherently good, like it's inherently good if something grows.
01:25:23.920 And you see that with AI companies all the time, they'll celebrate, oh, we hit 100 million ARR in, you know, two days or something.
01:25:29.600 And it fundamentally mistakes speed, the rate at which something grows, for value, right?
01:25:38.960 Cancer grows very quickly.
01:25:40.260 It's horrible. 0.79
01:25:41.120 And so I think there's this fundamental idea that, you know, this kind of grow, grow, grow, grow, that, you know, inherently the consequences, like, you know, be damned, just grow.
01:25:51.100 Growth is inherently good.
01:25:52.100 I think that fundamental philosophy is so bad.
01:25:55.120 Well, it's a self-justification.
01:25:56.680 Yes.
01:25:56.840 But I wonder where it grows from.
01:25:58.420 I think you described crisply and well the evolution of the attitudes in Silicon Valley, generally speaking, from, hey, this is going to liberate everybody, it's good, to, hey, this hikes GDP, and I've got a massive place in Atherton, therefore it's good.
01:26:15.340 And those are definitely different justifications.
01:26:19.060 And I wonder to what you attribute the change.
01:26:21.500 Like, how did that happen?
01:26:22.980 How did you go from one place to another?
01:26:24.520 And here's my thesis in one sentence.
01:26:26.540 Power.
01:26:27.640 Yeah.
01:26:27.880 When you get a lot of power, you get corrupted, exactly.
01:26:32.140 Power corrupts, yeah.
01:26:33.080 So there's no greater power than determining what kind of kids people are going to have.
01:26:38.660 So like, are you worried at all?
01:26:41.120 Again, we don't determine what kind of kids we'll have.
01:26:43.280 Yeah, you do.
01:26:43.520 We don't.
01:26:44.320 Over populations.
01:26:44.680 We don't.
01:26:45.080 No, we don't.
01:26:45.500 Because people are making their own choices.
01:26:46.520 We don't make the choice for them.
01:26:47.560 People are making their own choices.
01:26:48.680 You could easily make the choice.
01:26:49.580 No, we don't.
01:26:51.020 We don't.
01:26:51.540 But you could.
01:26:52.040 You could just say, we're only testing for these three things or whatever.
01:26:55.240 You design the screen,
01:26:57.000 but therefore you design
01:26:58.660 the outcome of a population.
01:27:00.320 No, virtue is not in biology.
01:27:03.180 Okay.
01:27:03.640 So no, we do not encode populations
01:27:05.640 because human beings can't encode.
01:27:08.380 That makes the mistake of
01:27:10.140 assuming that we are God.
01:27:11.080 We are not God.
01:27:11.820 We are not God.
01:27:12.040 It's going to affect the nature of people.
01:27:14.980 So that's an inescapable fact.
01:27:17.480 And I think it's important
01:27:18.380 to just wear the mantle.
01:27:19.760 Like this is what we're doing.
01:27:21.220 We're changing the nature of people.
01:27:23.160 We're going to try to make them better.
01:27:24.500 Nature is a very tricky word.
01:27:25.940 The nature of people
01:27:26.960 comes from God;
01:27:27.620 it doesn't come from genetics.
01:27:28.540 The substance of people:
01:27:29.480 their intelligence,
01:27:30.180 their height,
01:27:30.680 their lifespan.
01:27:31.960 That's a key distinction, though,
01:27:33.100 because ultimately
01:27:34.260 any human being
01:27:34.920 should want,
01:27:35.520 again, greater
01:27:36.500 spiritual cultivation.
01:27:37.740 Okay,
01:27:38.420 but I'm just saying,
01:27:39.500 you are part of,
01:27:40.540 not you alone,
01:27:41.300 or even substantially,
01:27:42.900 but you're part of
01:27:44.060 a trend in science
01:27:45.420 that
01:27:45.760 will
01:27:47.240 change the nature
01:27:48.560 of people.
01:27:49.580 So
01:27:50.860 I do think it's worth
01:27:52.000 just admitting that,
01:27:52.840 because then,
01:27:53.380 once you realize
01:27:54.260 the burden on your shoulders,
01:27:55.280 you can
01:27:55.760 bear up under it. Do you think? I think we, yeah, definitely. This technology, I just want to be very
01:28:02.020 careful with the word nature versus biological characteristics. I agree that we're changing
01:28:04.900 how long people are. You're changing that. So that alone is... Yes. How tall people are,
01:28:10.780 how well they do on the SAT. But again, it's not deterministic that way. It's not like you can
01:28:14.460 look at somebody's DNA and be like, oh, they're going to get a 1570 on their SAT. But I agree.
01:28:17.780 True, through that, over populations. And we're talking about populations, and you're saying, you know,
01:28:22.400 IVF is two percent or whatever. But I'm just saying, the technology, we can see where this is going.
01:28:25.960 You offer people a chance to have children who are healthier and smarter, and they're going to
01:28:30.460 take it. And I've already admitted that I would have taken it, because I love my children. Yeah. It's
01:28:35.200 that simple. So we know this is going to happen if the technology exists and it's widely available.
01:28:39.700 And so that puts you, and not just you, of course, this is hardly an attack, but it puts you in a
01:28:45.840 position of having power over the course of humanity, over the evolution of humanity. We're
01:28:53.340 watching humanity change at the individual level. And like, that's a big burden, man. That's a burden
01:28:58.940 that only God bore before, like, 20 years ago. We are not God, and we can never be God. Good. Well,
01:29:03.780 that's a good start. We are not God. We are not God. Do you see it as profound? Absolutely. Yeah. I mean,
01:29:10.180 I mean, to, to, to see patients who have had some, again, I use the Huntington's example, right? To see
01:29:22.420 a loved one die at age 25 because their brain decays, and then to never want to have a child,
01:29:28.560 Huntington's is really... And then to be able to use the technology, the emotion,
01:29:32.780 you know, the miracle that they can have a baby, basically. And that's, that's, that's amazing. It
01:29:40.160 is amazing. But I, with respect, I think, having watched, I mean, I was out in Silicon Valley in
01:29:44.500 the 90s covering this, and I knew the people, I still know some of them. They were totally fixated
01:29:50.200 on the upside. Yeah, in a good way. Yeah. They were like, this gives the Encyclopedia Britannica, you
01:29:55.740 probably know what that is, but it's a physical encyclopedia that sat on your shelf and cost
01:29:58.980 like thousands of dollars. Yeah. That's replaced by this CD-ROM, you know, this, yeah, collection of
01:30:05.680 ones and zeros, and like, it's incredible, the amount of information, people will be so much better
01:30:08.940 informed. And now you look 30 years later and that's... Like, definitely upsides to technology, but
01:30:15.120 also downsides. Well, we're, we're susceptible to the same force, because we're, we're human. Well,
01:30:20.180 that's exactly the argument I'm making. Yeah, I agree that, yeah, we are. So, sort of the same force,
01:30:23.860 it's, it's, it's how, you know, how can, how can we continue to do that spiritual work? Because it is
01:30:31.680 spiritual work, right? To cultivate the soul, to make sure we maintain these values that I'm, that I've
01:30:38.780 articulated. I totally agree. So here's my
01:30:40.820 final question. I'll stop torturing you.
01:30:42.260 I think you've done such a
01:30:44.880 great job, actually. Thanks.
01:30:46.460 It has nothing to do
01:30:48.840 with you. I'm just worried about these things, and you're smart,
01:30:51.000 and you've, again, for the third time, thought about
01:30:52.880 them to a surprising degree for a guy who's
01:30:54.800 also trying to, like, build a company. I'm impressed. Thank you.
01:30:56.820 But, um, if we're
01:31:00.520 gonna proceed,
01:31:02.900 one hopes, with this kind
01:31:04.560 of science in a way that
01:31:06.420 creates rather than destroys,
01:31:06.420 that we need to keep in mind, as you said 20 times,
01:31:09.860 the spiritual dimension.
01:31:11.000 Yes.
01:31:11.920 But the spiritual dimension is a dividing point.
01:31:16.120 Some things are good for the spirit
01:31:17.480 and some things are bad for the spirit.
01:31:19.440 Some things are consistent with virtue.
01:31:22.400 Yeah.
01:31:22.900 Some things are not.
01:31:23.860 And if we believe in God,
01:31:24.840 we believe God prefers some outcomes over others.
01:31:27.140 God has rules.
01:31:28.020 It's the nature of God.
01:31:30.300 So will there be an attempt to say,
01:31:35.080 no, these are the rules.
01:31:35.940 Like you can't test for this certain thing.
01:31:37.780 You can't make this choice.
01:31:38.980 You have to constrain people's choices at a certain point
01:31:42.040 if you're going to remain consistent with any kind of ethic.
01:31:45.840 Yeah.
01:31:46.700 No, I thought a lot about that.
01:31:48.000 It's very tricky because you need... 1.00
01:31:49.900 Just as India did. 0.86
01:31:50.720 India said, there's a billion people. 0.96
01:31:51.980 You can't make that choice.
01:31:52.780 Sorry.
01:31:52.960 No, that's very tricky.
01:31:54.760 It's very tricky and very complicated.
01:31:58.220 I think the key thing that we have to do as a business
01:32:00.700 and the moral line that people can hold us to
01:32:02.740 is nucleus has not, is not,
01:32:08.160 and will never say that one embryo
01:32:10.320 is better than another embryo.
01:32:12.180 We just won't.
01:32:13.100 Because again, we cannot mistake instrumental value
01:32:16.080 for moral value.
01:32:17.160 They're different things.
01:32:18.340 And I think in deeply recognizing that
01:32:21.440 and deeply realizing, by the way,
01:32:22.880 the indeterministic nature of genetics as well,
01:32:25.500 as I said, heart disease,
01:32:26.960 you can have a bad diet,
01:32:28.900 you can not exercise; lung cancer;
01:32:30.540 even for things like schizophrenia,
01:32:32.660 as I mentioned, strong genetic components,
01:32:34.180 but you can take, you know,
01:32:35.360 weed actually has made people
01:32:36.720 more schizophrenic, for example.
01:32:38.620 So there's environmental components as well.
01:32:40.560 And so I think
01:32:41.800 you have to have the deep humility in saying
01:32:44.160 there's no "better,"
01:32:46.120 maintain that moral philosophy,
01:32:47.520 because that is the foundation for me.
01:32:50.460 You can't say it's better to be
01:32:52.460 non-schizophrenic than schizophrenic?
01:32:55.420 I don't think it's for me to say, though.
01:32:56.940 I also,
01:32:57.780 I also don't think, though, to be clear,
01:33:00.180 when we use the term better,
01:33:02.280 we start applying moral value,
01:33:03.920 and again, I don't think
01:33:04.720 moral value lies
01:33:05.480 in the realm of
01:33:06.040 biological characteristics.
01:33:06.960 I don't think so.
01:33:08.240 So there's no moral guide at all?
01:33:10.000 No, that's not true.
01:33:10.760 There's universal morality,
01:33:11.800 which is natural law
01:33:13.540 and divine virtue.
01:33:14.320 You can't say that it's better
01:33:14.700 not to have schizophrenia
01:33:15.840 than to have schizophrenia?
01:33:16.760 Well, again, when we say better,
01:33:18.720 I think we're just, like,
01:33:19.320 defining it differently.
01:33:20.080 I think it's better
01:33:20.900 in the sense that
01:33:21.460 it reduces suffering.
01:33:22.320 Okay.
01:33:23.140 Absolutely.
01:33:23.440 If that's your measure,
01:33:24.300 then it's better.
01:33:24.880 Yeah, exactly.
01:33:25.160 But what's your measure?
01:33:26.040 Exactly.
01:33:26.620 But it's honestly better
01:33:27.480 in terms of the worst person.
01:33:28.320 So this is totally immoral.
01:33:29.660 This is literally immoral.
01:33:30.580 It has no reference.
01:33:31.140 No, it's not immoral.
01:33:31.840 No, not at all.
01:33:32.560 Because everything has a spirit, as I said.
01:33:34.220 It's just, there's the physical world,
01:33:36.220 and then each thing has a divine spirit to it, right?
01:33:39.320 So each thing has some virtue
01:33:40.960 or opposite of virtue, vice, for example, right?
01:33:43.720 That's true.
01:33:44.280 That's a true thing.
01:33:45.000 But again, these things are not actually
01:33:45.960 incompatible with each other.
01:33:46.780 They're actually compatible.
01:33:48.380 But as a company,
01:33:49.340 can you say there's anything you won't do?
01:33:52.200 On behalf of Nucleus, I think,
01:33:54.640 well, when you say anything we won't do,
01:33:55.740 I don't know.
01:33:56.480 You just said biology has no moral reference,
01:33:59.540 because everything has a spirit.
01:34:01.260 I'm just wondering,
01:34:01.860 is there like a line
01:34:02.720 where like we're not doing that,
01:34:06.820 period, because it's wrong.
01:34:08.220 We're not providing an analysis,
01:34:09.760 for example.
01:34:10.900 Like we're not providing some analysis.
01:34:12.340 That's what you mean.
01:34:12.740 We're not going to make
01:34:13.280 certain behavior easier.
01:34:15.700 When you say certain behavior,
01:34:16.820 you mean picking for a specific
01:34:17.880 like characteristic?
01:34:19.100 I don't know.
01:34:19.600 I mean, I could manufacture fentanyl
01:34:21.440 for a living and say,
01:34:22.540 I'm not forcing people to take it.
01:34:23.840 It's their choice.
01:34:24.840 But I would say
01:34:25.660 I'm not manufacturing fentanyl
01:34:26.800 because it's bad.
01:34:27.900 It's just inherently bad.
01:34:28.940 It degrades people and in some cases kills them.
01:34:31.860 So I'm not doing that.
01:34:32.720 Yeah.
01:34:33.400 So I don't know that it's enough to say,
01:34:36.320 let the people decide.
01:34:37.580 No, it's not.
01:34:38.640 You have to be careful.
01:34:39.820 Like giving IQ analysis, for example, right?
01:34:42.660 We've gone through many, many iterations
01:34:44.360 of the best way of doing it.
01:34:45.220 And we sort of slow rolled it out.
01:34:47.600 Purposely, because we didn't want people to misunderstand it.
01:34:50.280 We don't want people to think,
01:34:51.580 because again, genetically, it's just not possible,
01:34:53.560 in the same way that there's always environmental components,
01:34:55.680 to just, like, look at somebody's DNA
01:34:57.500 and guess the SAT score.
01:34:58.540 That's, like, people's very simplistic model, right?
01:35:01.520 But so I'm saying, we have a responsibility to very carefully communicate that result.
01:35:06.740 So the IVF clinic, the patient, the physician, everyone understands it.
01:35:11.360 And then, I think, when people understand it, it takes it away from the sort of sensationalist thing and just grounds it.
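The guest's caution about the IQ analysis, that you can't just read an SAT score off DNA, amounts to a claim about predictive intervals: a predictor that explains only a modest share of variance leaves a wide band around any individual forecast. A minimal sketch under assumed numbers; the R² of 0.15 and the 200-point spread for an SAT-like score are hypothetical, not published figures:

```python
import math

def prediction_interval(predicted, r_squared, trait_sd, z=1.96):
    """95% interval around one prediction under a toy normal model
    where the predictor explains only r_squared of the variance."""
    residual_sd = trait_sd * math.sqrt(1.0 - r_squared)
    return predicted - z * residual_sd, predicted + z * residual_sd

# Hypothetical DNA-based predictor of an SAT-like score:
low, high = prediction_interval(predicted=1300, r_squared=0.15, trait_sd=200)
print(f"95% interval: {low:.0f} to {high:.0f}")  # roughly 940 to 1660
```

The toy model even spills past the real 1600 ceiling, which is the point: a forecast this wide is a nudge on a distribution, not the simplistic look-at-the-DNA model the guest is warning against.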
01:35:15.860 Well, you shift the moral responsibility from yourself to your customers.
01:35:18.340 No, we're still morally responsible.
01:35:19.600 We ship a product.
01:35:20.520 In what way?
01:35:21.460 I could make a product and say, oh, this embryo is better than this embryo.
01:35:24.940 I mean, that would be principally the most immoral line that we could cross.
01:35:27.540 I could say, for example, this embryo is going to be super, super, super smart, right?
01:35:31.620 No, we're careful in the way we say things.
01:35:32.960 Well, that's just a false claim, right?
01:35:34.680 Yeah, I mean, it would be false, but also like people-
01:35:36.260 But what you're saying is that the moral decisions rest with the customers, not with you.
01:35:39.840 They decide what's better.
01:35:40.960 Is it better to have a kid with Down syndrome or not?
01:35:43.080 They decide you're not going to have any role in the moral decision making.
01:35:45.500 Patients can't, so again, there's no moral value because that comes from God, but patients
01:35:49.780 can decide instrumental value, right?
01:35:51.640 Like going back to the deaf couple, the deaf couple deemed it to be best, right, for what
01:35:57.020 they want for the outcome they're optimizing for. In this case, best means optimizing for the set of 0.92
01:36:01.200 biological characteristics for some outcome. For example, somebody might want their daughter to be
01:36:08.440 shorter to be a gymnast, for example. Somebody might want their son to be tall to be an NBA
01:36:12.020 player. Someone else might say, I don't care how athletic they are. I don't care how pretty they
01:36:15.180 are. I want them to be an academic and study really hard their entire life. Depending on those
01:36:19.760 things, as I mentioned in cell biology, specialization breeds sophistication. You realize
01:36:24.620 very quickly, very intuitively, that the value
01:36:26.720 of a phenotype is contingent on its
01:36:28.500 environment. I get it. So, I, I, this
01:36:30.580 is what it comes back to. It's like, it's up to, it's, it's up
01:36:32.640 to them, the parents, to decide what is their
01:36:34.460 instrumental value that they map to these phenotypes
01:36:36.640 and to pick. It's up to you when you want to take fentanyl.
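The guest's line that the value of a phenotype is contingent on its environment can be restated as a tiny utility function: the same predicted trait scores differently under different parent-chosen objectives. The traits, weights, and objectives below are all invented for illustration; nothing here reflects how Nucleus actually scores anything, and the guest explicitly refuses to call one embryo better than another:

```python
# Hypothetical predicted phenotypes (as z-scores) for one embryo.
phenotype = {"height": 1.5, "endurance": 0.2}

# Parent-chosen objectives: the same trait gets opposite weights.
objectives = {
    "gymnastics": {"height": -1.0, "endurance": 0.5},  # shorter helps
    "basketball": {"height": +1.0, "endurance": 0.5},  # taller helps
}

def instrumental_value(phenotype, weights):
    """Weighted sum: value is a function of the objective, not the traits alone."""
    return sum(weights[trait] * score for trait, score in phenotype.items())

for goal, weights in objectives.items():
    print(goal, round(instrumental_value(phenotype, weights), 2))
# gymnastics -1.4, basketball 1.6: identical phenotype, opposite valuations
```

Flipping the objective flips the sign of the "value," which is the instrumental-versus-moral distinction the speakers keep circling, in code form.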
01:36:38.740 No, I get it. I get it. I just,
01:36:40.580 I just hope it works. I,
01:36:42.540 I think the worst things
01:36:44.860 that I've ever done are the
01:36:46.540 things with the greatest promise.
01:36:48.740 Like the iPhone. Like I got,
01:36:50.640 I was so psyched for the iPhone. I was like, I don't
01:36:52.600 need a computer. Yeah. I can work in my living room. Yeah. Next thing you know, you can't have
01:36:56.980 a conversation with your wife. Yeah. Social media is, it's really bad. But it's bad because it's 1.00
01:37:01.740 good. Benzodiazepines are great. That's why they're terrible. Does that make sense?
01:37:10.780 Benzodiazepines are like the greatest drug. Have you ever taken a benzodiazepine? I took it one
01:37:14.340 time in high school. One of my, a kid on my hall in boarding school, his dad was a pharmacist and
01:37:18.860 he had Valium and I was like, I'll take anything. You know, when I was a child, I was an idiot. 1.00
01:37:22.020 I take this thing. I was like, that's the greatest thing I've ever taken. And it was so good. I never 0.99
01:37:26.320 took it again, because it freaked me out, because there was no downside. Literally all of your
01:37:31.400 like voices in your head, any woman listening will know what I'm talking about. Like the things
01:37:35.820 are like, whatever, going on in the background, silenced. Everything's fine. You're not like
01:37:40.540 stoned. You're not out of it. You're just like, great. You're improved. You're your best self.
01:37:46.520 And my animal sense, even in 10th grade, I was like, that's bad.
01:37:51.260 Yeah.
01:37:51.840 Super bad.
01:37:52.680 Whereas you do other drugs, you do cocaine, stay up all night doing cocaine, you suffer
01:37:56.760 the next day.
01:37:57.780 And so there's, it's really clear, this is not good, right?
01:38:00.580 Benzos are the best.
01:38:03.940 And that's why they're the most addictive, most dangerous, most society destroying product
01:38:08.840 that we make.
01:38:09.640 Yeah.
01:38:10.560 Does that make sense?
01:38:11.560 Yeah, that makes sense.
01:38:12.440 Yeah.
01:38:12.540 The badness is in direct proportion to the promise, the goodness.
01:38:17.520 Yes.
01:38:18.320 Yes.
01:38:18.960 And then there is the moral character of the person giving out that drug.
01:38:23.540 And in the social media case too, talking about moral philosophy, optimizing for clicks and dopamine, you end up following a consequentialist framework, right?
01:38:31.160 Because there's no virtue.
01:38:32.040 You end up following a consequentialist framework, and the ends justify the means, to the point that everybody's scrolling and liking and clicking all day.
01:38:37.320 100%.
01:38:37.800 Right. So the question that you're asking is, how do you... There is this problem of power, because
01:38:44.620 power corrupts. Absolutely. Yeah, absolutely. There's a problem in Silicon Valley, which is, there's a promise,
01:38:48.940 but then you underestimate the thing. It's like, how do you maintain virtue? Basically, the question is,
01:38:52.500 how do you maintain virtue? Um, how do you maintain your soul and your spirit despite these pressures?
01:38:57.300 Um, what's the answer? Well, one, it's, you know, it's really hard, I imagine. I imagine. And I'm hoping to
01:39:03.960 practice, for Nucleus and for hopefully this industry: it's praying, it's meditation,
01:39:10.560 it's deep, deep humility, with realizing, going back to what I said, there's a raindrop.
01:39:16.800 If you think that the raindrop is the entire world, you're forgetting the entire ocean.
01:39:20.560 That's where I come back to. Yeah. Well, you have a lot of authority. You have a lot of power for
01:39:25.220 a young man, much more than I ever will. And so use it wisely. And thank you for your
01:39:30.100 thoughtfulness and your willingness to have this conversation. And I'm sure it's been hellish for
01:39:33.000 you, but you've done a great job. Thank you, Tucker. Thank you. Appreciate it. Thanks.