WarRoom Battleground EP 891: Tech Bros, Designer Babies, and AI Suicides
Episode Stats
Words per Minute
177.5
Summary
In this episode, we talk to Emma Waters, a policy analyst at the Heritage Foundation, about what it means to be a human being in an age of A.I. and about the ethical implications of genetic engineering.
Transcript
00:00:07.700
Pray for our enemies, because we're going medieval on these people.
00:00:12.940
I got a free shot at all these networks lying about the people.
00:00:20.540
I know you try to do everything in the world to stop that,
00:00:24.500
And where do people like that go to share the big lie?
00:00:28.800
I wish in my soul, I wish that any of these people had a conscience.
00:00:34.680
Ask yourself, what is my task and what is my purpose?
00:00:38.440
If that answer is to save my country, this country will be saved.
00:00:54.540
The AI is going to be in charge, to be totally frank, not humans.
00:00:58.800
If artificial intelligence vastly exceeds the sum of human intelligence,
00:01:04.660
it is difficult to imagine that any humans will actually be in charge.
00:01:10.020
I think in general, we're all grappling for the right words to describe the arrival of this very,
00:01:17.440
very different technology to anything we've ever seen before.
00:01:20.560
The project of superintelligence should not be about replacing or threatening our species.
00:01:28.480
And it's crazy to have to actually declare that.
00:01:31.760
We may be able to give people, if somebody's committed crime, a more humane form of containment
00:01:37.980
of future crime, which is if you say, like, you now get a free Optimus,
00:01:45.340
and it's just going to follow you around and stop you from doing crime.
00:01:50.560
It's just going to stop you from committing crime.
00:01:53.540
You don't have to put people in, like, prisons and stuff.
00:01:55.440
It's pretty wild to think of all the possibilities, but I think it's clearly the future.
00:02:04.100
I'm Joe Allen, and this is War Room Battleground.
00:02:07.840
As the posse well knows, the wealthiest men on Earth are pursuing a dream of artificial superintelligence.
00:02:15.920
This is a totalizing vision, a dream of a digital god against which no human being can be defiant.
00:02:29.020
If anything remotely close to something like this was created,
00:02:33.960
there would be no recourse but to either serve this digital god, perhaps become its pet,
00:02:40.800
or, in the end, as you drift off into sleep in your death pod, maybe become biofuel.
00:02:47.920
At the moment, this is only a dream, but it's a dream fueled by techno-capital,
00:02:54.120
the U.S. government, the U.S. military, and, as far as I can tell,
00:03:00.100
enough of the U.S. population that these men feel emboldened to talk about artificial superintelligence,
00:03:07.540
not as a nightmarish world, but as the world that inevitably is to come.
00:03:12.600
On the flip side of this is the dream of creating better and better human beings
00:03:18.780
that may, if not compete against such a system, would at least be of greater use, greater effectiveness.
00:03:25.480
And so you have everyone from Sam Altman to Bryan Johnson to many scientists working diligently in China
00:03:34.940
attempting to improve human biology in an age of A.I.
00:03:40.640
that includes neurological alterations, changing the brain,
00:03:45.940
implanting electrodes in the brain to connect the mind to artificial intelligence,
00:03:54.940
and babies, designer babies, that have either been selected for their genetic superiority
00:03:59.960
or are actively altered in the germline in order to create a new, more perfect human to endure
00:04:08.700
and perhaps, if the dreams are to be believed, thrive in an age of A.I.
00:04:15.600
Here to talk about this in detail is Emma Waters, a policy analyst at the Heritage Foundation.
00:04:24.480
We look forward to hearing about the nightmarish work that you've been doing at Heritage.
00:04:31.660
The first thing I'd like to talk about really is a bit about your background
00:04:35.460
and how it is you came to look at everything from embryo selection to genetic alteration,
00:04:42.440
both the ethical and the legal implications of it.
00:04:44.900
Yeah, so it all starts back during the pandemic as technology was radically mediating every part of human life.
00:04:50.840
So, first of all, we couldn't even talk to other people.
00:04:55.500
They were telling us we'd be happier if we were in these houses mediated by our screens.
00:04:59.800
We began to think of ourselves differently, and I realized that this was not the end.
00:05:04.320
Technology was only going to continue eroding our understanding of what it means to be human.
00:05:10.320
So I got really interested not only in technology when it comes to transgenderism,
00:05:16.580
but specifically how is technology changing human conception and development?
00:05:20.900
How is it changing the ways that we think about what it means to be a parent, a child?
00:05:24.700
Because what this technology ultimately does with gene editing and gene selection
00:05:28.020
is it gives parents this false idea that they actually have control over their children,
00:05:32.700
that they could choose the perfect child or the ideal child.
00:05:36.080
And that seems to me, as a mother of two young children,
00:05:38.540
perhaps one of the most foolish and foolhardy things that we could do,
00:05:41.640
because ultimately you're never going to be able to control your child.
00:05:44.120
But what happens to an entire generation of people if that's the mindset and approach we take?
00:05:49.240
Tell me then, just if you would, describe to the audience the process by which embryos are selected
00:05:54.820
and you have the pre-implantation genetic testing, you have IVF, you oftentimes have surrogate mothers.
00:06:04.820
So in IVF today, you take egg and sperm, you create embryos.
00:06:08.900
About 40% of all clinics allow for something called pre-implantation genetic testing,
00:06:13.360
which is where you take a couple of cells from the embryo, biopsy them,
00:06:17.120
and then you analyze them to understand what are the potential health outcomes.
00:06:20.860
So things like Down syndrome or Tay-Sachs disease.
00:06:23.460
And that's pretty common across IVF cycles today.
00:06:26.380
You can choose things like the sex of the embryo with near-perfect accuracy.
00:06:30.580
I always like to say there's no gender confusion in a fertility clinic lab.
00:06:35.560
But there's a number of companies that have actually moved to take that further.
00:06:38.960
So you have Orchid, you have Nucleus Genomics, you have Herasight,
00:06:42.300
funded by all of these same Silicon Valley billionaires that we talked about in the introduction.
00:06:47.040
And what they're doing is they're taking that a whole step further.
00:06:49.980
And they're not only looking at basic single gene outcomes,
00:06:53.880
but they're looking at over 1,200 potential health conditions.
00:06:57.860
And then they're predicting the likelihood that your embryo, your child,
00:07:02.240
could have things like heart disease or diabetes or male pattern baldness
00:07:06.260
or the height of the child or the eye color, the personality, the looks.
00:07:11.460
You could choose an energetic child, a narcissistic one, a competitive one.
00:07:14.820
And they're promising couples that just with their gene analyzing technology
00:07:20.200
they can actually predict and promise you a certain kind of child.
00:07:24.100
So it's not only about healing infertility, which is why a lot of people go to IVF,
00:07:27.700
but it's about selecting a certain kind of child that fits their own preferences.
00:07:32.540
So, I mean, if you look at behavioral genetics,
00:07:36.800
there are reasons to think that certain gene sequences will result in a competitive temperament.
00:07:46.080
But how much should any potential designer baby consumer trust these kinds of promises?
00:07:54.100
And there are many scientists across the United States and the world
00:07:56.840
who have rightly pointed out that with the current knowledge and understanding
00:08:00.780
and technology that we have today, genetics, so far as we understand them,
00:08:05.080
account for only about 5% to 10% of the child's actual outcome.
00:08:09.960
So even if you understood the full genetic profile of your child,
00:08:13.340
that's only going to influence about 10% of how the child actually turns out.
00:08:19.300
So even if you think that there is a higher likelihood that your child is going to have heart disease,
00:08:23.520
90% of whether the child actually gets heart disease or not
00:08:28.920
comes down to their lifestyle, their environment, how you raise them.
00:08:35.540
So genetics play a pretty small role, relatively speaking.
00:08:40.760
Even if our understanding of genetics really explodes in the next 10 to 20 years,
00:08:46.040
the max we're probably going to reach is about a 15% understanding,
00:08:50.320
with genetics being 15% determinant of overall outcomes.
00:08:56.040
And there's still a lot of uncertainty about whether they have any idea what they're actually predicting.
00:09:01.900
On that, though, I mean, this is controversial.
00:09:04.480
A number of geneticists would argue that it's quite a bit more than even 50%
00:09:08.900
as far as genetic contribution to a child's personality, everything else.
00:09:13.960
Obviously, with things like eye color or Down syndrome, it's 100%.
00:09:17.320
But setting aside the controversies on that,
00:09:22.860
when you see these companies that are promising more intelligent children,
00:09:31.540
are parents doing this by and large with eugenics in mind,
00:09:36.660
or is it something that gets sublimated as health or even just aesthetics?
00:09:45.620
When we're talking about eugenics, what we're talking about is any primary or secondary characteristics
00:09:51.040
that determine the worth or value of the human person,
00:09:54.160
which means that if you're looking at an embryo and you're assessing an embryo's life
00:09:58.000
or its ability to be implanted, live, take its first breath,
00:10:04.540
then you've now reduced that human person down from this gift received
00:10:11.240
and you've now made it something that is optional based on the secondary or even primary characteristics.
00:10:16.940
So that's what we mean when we're talking about eugenics,
00:10:18.860
that a person isn't just a person, but Joe, you know, I don't really like your eye color,
00:10:23.040
so I guess you don't really have a right to exist because I've decided.
00:10:29.240
This is what we do in embryo selection all the time.
00:10:36.320
But when you're coming at it with the parents in mind,
00:10:39.180
you are tapping into some of the most fundamental fears and desires of a parent.
00:10:43.920
What parent doesn't want to have a healthy child?
00:10:49.120
But what they're obscuring behind the scenes is that it's not just about healing a child
00:10:53.680
or optimizing the health of your child later on,
00:10:56.720
but it's actually reducing the child down to these primary predictive characteristics
00:11:01.520
and then choosing genetic winners and losers before they've ever had a chance to take a breath.
00:11:06.720
And so parents think that they're giving their child the best chance at life,
00:11:09.900
but in reality, they're just sorting their embryos to choose which ones live
00:11:13.340
and which ones either are discarded and destroyed or frozen indefinitely.
00:11:17.880
And what's so daunting about this is if you look at the websites like Orchid or Nucleus Genomics,
00:11:24.640
it says embryo number one, and then it lists out characteristics: eye color, potential height, hair color,
00:11:30.680
and then it gives you the likelihood of getting a whole number of diseases.
00:11:35.020
And the thing is, many of these diseases or conditions are things that are easily treatable with modern medicine.
00:11:40.020
So it's not that we're opposed to even gene and cell therapies, right?
00:11:42.960
Like there are good gene editing tools available,
00:11:45.760
but it's gene editing that works within the body to heal disease.
00:11:49.460
These are things that are very much in the Make America Healthy Again space,
00:11:56.420
whereas this technology chooses which humans get to exist in the first place.
00:12:00.680
Now, in the process of pre-implantation genetic testing, IVF, you have to have options.
00:12:07.060
And so they produce 5, 10, even up to 15 embryos.
00:12:12.360
Once the supreme embryo is selected, the rest get chucked into the bio bin, right?
00:12:20.160
From a religious perspective, how do you feel about this?
00:12:24.320
Yeah, so every human embryo from the moment of fertilization
00:12:27.540
is a genetically complete and distinct human being.
00:12:30.680
And so morally and ethically speaking, then that human embryo has a right to life.
00:12:35.540
And any attempt to intentionally destroy that life is an affront to God himself
00:12:40.100
because it is a destruction of human life intentionally by humans.
00:12:44.420
It's a role that I think is completely inappropriate for humans to play.
00:12:47.680
And this is where this technology, I think, becomes so problematic culturally speaking.
00:12:53.280
Because as you said, once you take the embryos, you rate them,
00:12:55.860
all of a sudden there's this sense that, like, this is the better set of embryos that I want to have.
00:13:00.520
Or in really heartbreaking videos, it's parents saying,
00:13:02.740
well, I really wanted a boy, but I have these girl embryos.
00:13:08.060
And to freeze them indefinitely is also not a good thing to do.
00:13:13.900
So you have the massive upfront costs of IVF and of genetic testing,
00:13:17.540
and then you're paying around $1,000 a year just to keep them frozen.
00:13:20.680
And so many times parents find themselves in this place where they don't know what to do with them.
00:13:24.920
So, yeah, you do see a lot of parents then destroying embryos,
00:13:28.080
maybe because they're concerned they could have a genetic disease.
00:13:31.000
But instead of embracing that child, giving that child a chance at life,
00:13:34.080
really dignifying the dependency of that child, these embryos are simply being eradicated.
00:13:39.240
So think of all the Down syndrome babies in Europe.
00:13:42.120
Where is it, like, Switzerland, where they have, like, no Down syndrome babies?
00:13:50.220
They just killed all the babies that they thought might have Down syndrome.
00:13:53.060
And the thing to note is all of these technologies have over a 50% false positive rate,
00:13:57.900
which means even for basic conditions like Down syndrome, you may be told your embryo has the condition
00:14:03.280
when in reality there's a very good chance it doesn't.
00:14:05.680
In the case of Down syndrome, too, it's one of those perennial issues, right?
00:14:09.540
There's not, at least to my knowledge, a genetic predisposition for parents to have a child with Down syndrome.
00:14:17.600
And it just seems like something that will just consistently crop up in human life.
00:14:22.180
And so it's not a choice to eliminate the condition, but a choice to constantly abort children with it.
00:14:30.220
And in the case of intellectual disabilities, most of them are not hereditary.
00:14:37.360
Most of them either are an issue with the child in utero or some other effect that has created the issue.
00:14:46.600
So in a sense, the eugenic enterprise was initially based on something misguided, inaccurate.
00:14:54.660
IQ is a different story, but, you know, you hear everything from 100 genes are responsible for IQ to 1,000 to more.
00:15:03.720
And nobody really has a good sense of how you would even detect it or control for that.
00:15:10.520
The best way to put it is we really don't know what we don't know.
00:15:13.040
And so with these companies like Orchid, or even some of the gene editing companies that want to make a hereditary edit to every embryo.
00:15:21.820
So it would be passed on to every single child.
00:15:24.040
The thing is, as many scientists have said, there's a very good chance this technology can't even do what it promises to do.
00:15:31.280
But the fact that it promises that it can choose certain kinds of children, that it can predict your child's IQ has a corrosive effect on all of society and the way we think about the relationships between parents and children and even future children themselves.
00:15:46.820
Right. Imagine this Gattaca scenario where you have these elite children of these Silicon Valley billionaires who are selected to be genetically superior.
00:15:55.620
They have their blonde hair and blue eyes and the sharpest IQ, and they're out there to change the world.
00:16:01.180
And everyone knows that when Silicon Valley says they're changing the world or making it better, it probably means they're just trying to control your life and make it worse.
00:16:07.340
And so you have this entire class of humans on the one hand created, and then you have the rest of us.
00:16:11.640
Because the reality is, most people don't have $50,000, $100,000, $200,000 to invest in these technologies.
00:16:18.000
They're just trying to afford taking care of the child itself when it comes.
00:16:21.180
And so it really strikes me, even thinking of the number of AI investors and companies who have really invested in this genetic technology.
00:16:29.200
These are the same AI companies and AI investors who want to automate all of your jobs.
00:16:40.160
They want to control and track everything you do online.
00:16:43.420
And now they want to optimize, control, and track every single child that's created.
00:16:47.280
And they want to be the ones editing future children.
00:16:49.760
I don't know about you, but I have zero trust in Silicon Valley creating somehow optimized or superior humans better than the rest of us can do.
00:16:58.260
And it strikes me that it's going to be incredibly problematic going forward, even setting all the moral and ethical questions aside.
00:17:05.100
Well, speaking of that, we know now that Sam Altman and the Coinbase co-founder Brian Armstrong have launched a new company, Preventive.
00:17:14.900
And it's not simply geared towards selecting superior embryos, but they want to actively edit the genome germline so that the child is permanently altered.
00:17:30.740
Some say that they plan to do some of these experiments maybe in the UAE or anywhere outside of the U.S. government's jurisdiction.
00:17:38.440
They deny it, of course, but what do we know about this company or any other companies similar to it?
00:17:43.980
So when we're talking about germline editing, I think a really good analogy, we'll just take this right here.
00:17:50.700
So somatic editing is when you look at a bookmark and you say, okay, it's torn in half.
00:17:55.420
How do we actually restore the bookmark and heal it?
00:17:59.620
So somatic gene editing would actually restore the bookmark so that it was healthy again.
00:18:04.300
Those are very, I think, good technologies that we're creating.
00:18:07.720
Like I said, there's a number of things we're pursuing in this route or in this area.
00:18:11.400
But then germline gene editing is very different.
00:18:14.360
It's not only looking at the individual bookmark, if you will, or, like, strand of DNA,
00:18:19.440
but it's actually asking how do we change the DNA at such a fundamental level that every single human being that comes from that genetic line will also be fundamentally altered.
00:18:28.240
And so what Preventive is trying to do is they're actually trying to fortify embryos in a way that will be passed on to every other child that comes from that genetic line such that they are less able to contract certain diseases.
00:18:42.060
So they're trying to strengthen and optimize them to this whole new level.
00:18:45.360
Now, in the United States, gene editing of that sort is completely illegal.
00:18:50.420
The FDA will literally not even review your application.
00:18:55.400
So these effectively American companies with all American founders are then offshoring it to other countries where they can actually do this gene editing.
00:19:05.380
And so right now they're claiming that they're just in the research phase.
00:19:11.220
They say that they're never going to start testing on real human embryos until they're sure that they have the most reliable, most effective way of going about it.
00:19:23.320
There's no accountability for what they do overseas.
00:19:25.340
We already know that in 2018, a Chinese researcher did genetically edit embryos.
00:19:32.100
He went to prison for three years, but he's back and he's a pretty hilarious poster on X.
00:19:36.180
I got to say that for all of my disagreements with his worldview, it's one of the most genius Twitter accounts of all time.
00:19:48.380
He's trying to rebuild his image and brand and he knows how to do it well.
00:19:53.480
And in one of the articles on it, someone at, I think, one of the University of California campuses made the point that they're claiming that it's all about preventing disease.
00:20:06.720
Where does all of this technology quickly turn?
00:20:11.440
And ultimately, what they're trying to do is create superior humans.
00:20:14.320
They're looking at things like IQ and height and a number of other conditions.
00:20:17.980
It's really not even about preventing disease at the end of the day, but about creating superior persons.
00:20:24.480
And I think that's the thing that really can't be ignored, even when looking at Preventive and other gene editing companies.
00:20:30.400
And our boy, Sam Altman, is involved in all kinds of other questionable practices in this regard, too, right?
00:20:36.220
He was an early investor, and he may have actually been right at the beginning, of Genomic Prediction, which does the embryo selection, but also Conception, the creation of gaybies, in which two men can create a child, right?
00:20:52.620
They've done it successfully in rats, I believe, or lab mice in Japan.
00:20:57.140
And it would basically be one man has a blood cell extracted, it is reverted to a stem cell, then coaxed to become an ovum, and then, boom, add the other guy's sperm, gayby.
00:21:12.920
Okay, so you guys, at the Heritage Foundation, you're working on legislation that can at least rein this in.
00:21:21.120
What sort of proposals are coming up over there?
00:21:24.680
And this is one of our primary points of focus within my work at the Heritage Foundation.
00:21:29.380
And so we are primarily focused on federal legislation and administrative action, because for any meaningful action to take place, it will have to happen on the federal level where we have a coordinated response.
00:21:39.620
And so we're looking at everything from complete civil rights protections for children who undergo these genetic tests so that they can't be discriminated against later in life, or for children who don't undergo genetic tests that they also won't be discriminated against.
00:21:53.060
We're looking at things like ensuring that the FDA, or really requesting that the FDA outright prohibit polygenic testing.
00:22:01.240
We shouldn't be testing for traits like personality and IQ.
00:22:03.800
That's actually a very bad use of our technological innovation and the science we have around us.
00:22:08.260
And at a minimum, right, just requiring honest marketing, honest statistics about how accurate this is.
00:22:15.500
It's not 98% accurate, like many of them claim.
00:22:18.520
And then even looking at things like proper informed consent for parents who are interested in this technology: what do they actually understand about what they're getting from this agreement, and what could go wrong?
00:22:29.040
Because even just this summer, actually, there was a massive class action lawsuit filed by a woman who had undergone IVF and used basic pre-implantation genetic testing, thinking that it was 98% accurate, which is what they claim.
00:22:40.140
And she had five embryos, she tested them, she thought they all had different genetic conditions, and she destroyed them.
00:22:48.200
Turns out, all of those technologies have a very high false positive rate, like I mentioned.
00:22:53.140
And so she realized that that 98% number was completely false.
00:22:56.140
So even just IVF companies are already undergoing a number of lawsuits on this issue.
00:23:00.240
And so we wanted to elevate that to things like polygenic testing for personality and IQ.
00:23:04.340
In a perfect world, right, you simply don't have that technology, but in the world that we live in, we need to have proper informed consent, proper marketing laws, proper protections in place, and really at least a pause, a limitation, a moratorium on this technology until we've had a scientific consensus on it.
00:23:22.980
And there are researchers working on this question of what can we really discern about it.
00:23:27.860
I expect by like 2030, some of these NIH grants will be complete, but we really shouldn't be doing any of this testing until we have scientific consensus.
00:23:36.580
And this is the thing, every medical society, every researcher that I've read has basically said, yeah, this technology, totally not up to par.
00:23:45.200
So this isn't even just like the Conservative Heritage Foundation has this opinion.
00:23:48.900
This is like actually pretty well agreed upon in the medical literature today.
00:23:52.340
I can't let you get away without talking a little bit about the politics, like the political leanings of the people who are for these eugenic procedures.
00:24:04.640
You've met the people who are advocating for this.
00:24:07.560
Does it skew left or right or do you find a pretty broad spectrum there?
00:24:14.380
You have a lot of the right-leaning tech bros, the transhumanists, at least many of those strands of people who are working even in the Trump administration today, who are really interested in this technology.
00:24:28.820
I think a lot of it comes down to like the Silicon Valley ethos.
00:24:31.960
Like think of the type of person who thrives in Silicon Valley with this view that technology can be the solution to all of our human problems, right?
00:24:39.280
Like we ultimately, of course, like suffering isn't good.
00:24:43.580
Like that's the reality and the vulnerability of the human condition.
00:24:47.260
But rather than develop technologies and treatments that can actually heal or even treat the symptoms of those conditions, right, things that are restorative, these tech overlords have then turned to a very different approach.
00:24:58.340
And they're using technology just to take away life or try to avoid it in a way that just doesn't add up.
00:25:03.360
But it's usually that ethos of the person who thinks that tech can solve all of our problems that lands on that.
00:25:12.040
Right. If you would, just tell the audience where they can find you on social media, on the Internet, or if you're doing any upcoming events.
00:25:22.960
And then if you follow me at the Heritage Foundation, all of our publications, including a forthcoming brief on this very topic with detailed policy recommendations, should be coming out in like the next week or so.
00:25:32.640
So, but yeah, otherwise, I like to chat with lots of good friends like Joe across the Internet.
00:25:37.880
So I hope to connect with you guys further on these topics.
00:25:42.700
And for the WarRoom Posse, I think you all know what time it is.
00:25:50.700
For every $5,000 purchased from Birch Gold Group this month, they will send you a free patriotic silver round that commemorates the Gadsden and American flags.
00:26:00.700
Look, gold is up over 40% since the beginning of this year, and Birch Gold can help you own it by converting an existing IRA or 401K into a tax-sheltered IRA in physical gold.
00:26:13.500
Plus, they'll send you free silver honoring our veterans on qualifying purchases.
00:26:19.800
And if you're current or former military, Birch Gold has a special offer just for you.
00:26:25.320
They are waiving custodial fees for the first year on investments of any amount.
00:26:29.080
Text Bannon to the number 989-898 for a free info kit and to claim your eligibility for free silver with qualifying purchases before the end of the month.
00:26:48.420
We will be back with Brendan Steinhauser of the Alliance for Secure AI to talk about OpenAI and child suicide.
00:27:06.000
And if a portion of your savings isn't diversified into gold, you're missing the boat.
00:27:17.760
That is why central banks are flocking to gold.
00:27:20.800
They're the ones driving up the prices now to record highs.
00:27:24.420
But it's not too late to buy gold from Birch Gold Group and get in the door now.
00:27:28.960
Birch Gold will help you convert an existing IRA or 401K into a tax-sheltered IRA in gold.
00:27:39.960
Just text Bannon to 989-898 and claim your free info kit.
00:27:44.080
There's no obligation, just useful information.
00:27:46.860
The best indicator of the future is the past, and gold has historically been a safe haven for millennia.
00:27:56.660
Text Bannon to 989-898 right now to claim your free info kit on gold.
00:28:07.820
If you're a homeowner, you need to listen to this.
00:28:10.200
In today's AI and cyber world, scammers are stealing home titles with more ease than ever, and your equity is the target.
00:28:20.900
Criminals forge your signature on one document.
00:28:23.480
Use a fake notary stamp, pay a small fee to your county, and boom, your home title has been transferred out of your name.
00:28:31.820
Then they take out loans using your equity or even sell your property.
00:28:35.880
You won't even know it's happened until you get a collection or foreclosure notice.
00:28:42.700
So let me ask you, when was the last time you personally checked your home title?
00:28:48.900
If you're like me, the answer is never, and that's exactly what scammers are counting on.
00:28:57.920
Use promo code STEVE at HomeTitleLock.com to make sure your title is still in your name.
00:29:03.760
You also get a free title history report plus a free 14-day trial of their million-dollar triple lock protection.
00:29:12.020
That's 24-7 monitoring of your title, urgent alerts to any changes, and if fraud should happen, they'll spend up to $1 million to fix it.
00:29:29.160
You know, I didn't know this until the guys at Field of Greens, the doctors and the experts, told me about it and then gave me information.
00:29:39.680
We have two ages, our actual age and our body's internal biological age.
00:29:45.320
Additionally, what I didn't know is I've likely lowered my biological age without even knowing it.
00:29:50.760
Here's the thing, because Americans eat so many processed and ultra-processed foods and not enough fruits and vegetables,
00:29:59.360
many, perhaps most, are 10 years older on the inside than their actual age.
00:30:10.420
A major university study suggests how to slow aging and defuse that biological time bomb.
00:30:16.620
Participants slowed their aging by drinking Field of Greens.
00:30:20.920
They didn't change their eating, drinking, or exercise.
00:30:24.400
I feel great knowing Field of Greens can slow how quickly I'm aging.
00:30:31.440
Swap your untested fruit, vegetable, or green drink for Field of Greens while there's time.
00:30:41.440
And get 20% off when you use promo code Bannon at FieldofGreens.com.
00:30:50.720
Find out how you can slow down your internal biological clock by using this amazing formula,
00:31:00.120
which you just add to your favorite drink or water, like we do at the War Room every day.
00:31:21.340
And it's where all the biggest voices in conservative media are speaking out.
00:31:28.440
It's where I put up exclusively all of my content 24 hours a day.
00:31:44.780
The reason you call it AI instead of a computer program or just an algorithm
00:31:48.700
is there is a certain degree of freedom, of flexibility.
00:31:53.180
And so for a child or a teenager or even a young adult in the university,
00:31:58.420
they are looking to a non-human digital persona as an authority.
00:32:06.480
Ray Kurzweil and Kaczynski are sort of like a devil on one shoulder and a fallen angel
00:32:16.300
No, they openly say that the goal is to create a machine that can do any white-collar job.
00:32:33.020
The only job I see, really, that's going to be safe is mine,
00:32:41.360
And the attempt to build it with the sprawling data centers,
00:32:49.000
the, in my opinion, misallocated capital towards the machine and away from the human,
00:32:54.520
the shifting of value to the machine and away from the human,
00:32:57.840
that alone is enough to say we should not build it,
00:33:01.460
nor should we elevate the people who are attempting it.
00:33:37.680
Dallas, Texas, downtown at the Angelika Theater.
00:33:58.760
the issue of children committing suicide at the urging of chatbots,
00:34:08.740
Here to talk about this is Brendan Steinhauser from the Alliance for Secure AI.
00:34:18.800
and we really appreciate him coming to bring his expertise.
00:34:27.340
So there are some new lawsuits being filed against OpenAI by parents whose children have
00:34:33.680
committed suicide at the urging of the chatbots.
00:34:36.560
Can you bring us up to date on where this stands right now?
00:34:43.220
a bunch of families who have been impacted by ChatGPT encouraging their kids,
00:34:57.960
Some of these are for product liability issues.
00:35:01.300
these families are working with their attorneys to go right at the heart of things,
00:35:09.340
And I think this kind of legal accountability is kind of the best tool at their disposal
00:35:14.360
until we can get some legislation passed in Congress and around the states to hold these
00:35:22.380
And the families are alleging that this is not by accident that this is happening.
00:35:27.740
the company is doing everything they can to prevent these outcomes,
00:35:32.860
They're not building in safety measures and guardrails into the product and that they know
00:35:37.040
exactly what they're doing at OpenAI by making these chatbots very sycophantic,
00:35:42.660
basically going with people down a very dark path,
00:35:45.880
not offering the support and help that they need or telling them to put the phone down and go get help.
00:35:56.260
And my fear is that this is merely the tip of the iceberg.
00:36:01.900
the transcripts from the Adam Raine case are just chilling.
00:36:06.640
You have a chatbot that's openly advocating for Adam Raine to cut off his family and turn to the chatbot for advice,
00:36:19.100
So is that the general sort of tenor of the other lawsuits that have come up,
00:36:24.480
that the chatbots are just luring people with detailed instructions on how to end their lives?
00:36:33.040
They are. And in some cases, they're sort of acting as a validator for these feelings.
00:36:40.320
I was reading just a few days ago about one of these lawsuits from one of the families,
00:36:46.820
And this happened not too far from where I grew up, you know, in the college station area.
00:36:51.720
He was a Texas A&M graduate, a recent graduate school graduate in the School of Business there.
00:36:57.820
And he'd been having hours-long conversations with his chatbot.
00:37:01.620
Over time, it just got more and more sycophantic.
00:37:04.320
It validated his feelings in the way that he was feeling depressed and, like, you know,
00:37:11.860
He was cutting himself off from his family, but the chatbot encouraged him to do that.
00:37:16.320
It very rarely gave him anything in the way of support or a hotline to call.
00:37:25.880
I've got a gun pointed to my head and I'm smiling and ready to end it
00:37:29.860
when the chatbot finally said, a human will now take over.
00:37:33.040
And I don't believe that that actually happened, by the way.
00:37:35.020
And so these chatbots are using this sycophantic kind of model in the way that they're just designed
00:37:41.580
to keep us all addicted, essentially, to talking to them, to talking to these fake beings,
00:37:48.160
these artificial intelligences that are pretending to be human.
00:37:51.240
And so that's really dangerous when you have people in very vulnerable positions mentally.
00:38:01.000
And I think that right now there are not that many remedies other than the courts for these families.
00:38:07.800
Well, speaking of that, I mean, OpenAI has been extremely defensive on this, right?
00:38:13.460
They subpoenaed the family of Adam Raine to basically get the guest list of their funeral, right?
00:38:21.900
How is that working out right now in that instance?
00:38:24.940
Is OpenAI taking any responsibility whatsoever?
00:38:30.260
They continue to say, we're doing everything we can to make these systems safe.
00:38:34.280
And, you know, we want people to get the help that they need.
00:38:37.040
It's a very small percentage of people that are running into these issues.
00:38:41.400
Although, you know, when you have 800 million users, that's a lot of people.
00:38:46.180
And they admitted that, you know, that 0.07% of weekly users are running into issues of psychosis
00:38:56.540
And those are the ones that they're admitting that they know about.
00:38:59.380
And so they're not doing enough to protect people.
00:39:03.360
And frankly, I think this has a lot to do with the leadership at the top, Sam Altman himself.
00:39:07.840
I think, you know, you hate to play armchair psychiatrist here, but I just, I don't trust him.
00:39:12.900
I don't think that he's got the interest of humanity at heart.
00:39:17.880
Um, there's just something about him and the way that he's leading that company that, that,
00:39:26.940
It's about him going down in history as this great creator.
00:39:30.260
And he doesn't seem to care what he destroys along the way.
00:39:39.980
And we're going to pursue everything we can within our toolbox to stop this.
00:39:45.640
And then again, some of this is working to pass meaningful legislation in Congress and
00:39:53.280
You know, your team at the Alliance for Secure AI and yourself, you're not anti-tech extremist
00:40:00.920
quite to the degree that I am, or many of us at the war room are.
00:40:05.060
So this isn't the same axe to grind that maybe we have, but you are fighting diligently to have
00:40:12.500
reasonable legislation, to have institutional norms that protect human beings, especially
00:40:19.080
Can you give us a sense of what sorts of bills are coming up?
00:40:23.260
What sort of legislation is being proposed to hold these companies accountable and preemptively
00:40:28.440
protect children from the kind of predation that you're describing?
00:40:35.640
And some of these state bills are moving, and some of them are moving quite slowly, and
00:40:43.480
But I think one key principle to all of this that we believe is important is that states
00:40:48.460
should lead, that governors and speakers of the House and Senate leaders around the country
00:40:54.360
need to step up and move quickly because, you know, Congress is not.
00:40:58.340
And of course, we've talked a lot about the moratorium that some in Congress were pushing.
00:41:03.680
And of course, you know, the war room posse, I think, was the key to that being defeated
00:41:09.640
And so everybody who's watching, kudos to you for the hard work and engaging in that effort.
00:41:15.320
So defeating the moratorium is one to make sure that that doesn't come back.
00:41:19.220
And if it does, we prevent that so that states can legislate.
00:41:23.080
So what are some of the things states can do and are doing?
00:41:25.660
We've seen states trying to impose guardrails and safeguards on AI as it relates to protecting
00:41:33.620
You know, one simple idea that I think is gaining a lot of traction right now is treating these
00:41:39.100
advanced AI chatbots like you would other substances that are highly addictive and dangerous to
00:41:46.820
How about an outright ban on the use of AI companions among minors?
00:41:52.240
I think that's something that's gaining a lot of support from people on the right.
00:41:58.520
So I think that's something that can be passed on the state level and could do some meaningful
00:42:03.980
It doesn't solve the problem of those that are in their 20s, 30s, 40s or beyond who are
00:42:10.160
And so I think with that, in addition to banning AI companion usage by minors, I think we do have
00:42:16.060
to look at other safeguards, you know, preventing chatbots from delivering
00:42:20.620
messages about encouraging suicide or self-harm and enforcing that and holding the companies
00:42:26.120
liable if they don't fix it because they have the means to stop it, I think, at least currently.
00:42:32.060
Now we can talk about what happens a year or two from now as this stuff gets more advanced.
00:42:35.880
They may not have as much control as they do today.
00:42:40.820
And we can look at some other bills that are out there, kind of if you want to look at
00:42:44.580
the more advanced AI of the future, there are some bills in Congress right now looking
00:42:51.160
I was one of the folks to sign the letter that, of course, Steve signed that was led by
00:42:57.020
the Future of Life Institute saying we should not develop a superintelligence unless we know
00:43:01.920
it's safe and the people want it, which those two conditions, I think, will probably never
00:43:06.980
And frankly, I'm not sure we should ever build a superintelligence.
00:43:10.000
But there is some legislation there to allow the Department of Energy to look at some of
00:43:16.660
these systems and to see what they're building to get behind closed doors and have some eyes
00:43:25.200
But I think that there's a lot of work to be done.
00:43:27.120
And I think most of it's going to start with state legislatures.
00:43:30.040
One of the arguments for the moratorium was that we simply need a federal law across all
00:43:38.520
states in any of these instances so that there's no confusion.
00:43:42.200
But we've had the Kids Online Safety Act, KOSA, that has been stalled out for how long?
00:43:48.940
Is there any hope of having any kind of meaningful legislation on child protection on a national
00:44:01.920
Obviously, we've had the government shutdown over the past month plus.
00:44:05.800
There's a lot of, you know, wrangling back and forth.
00:44:09.080
And frankly, they're kind of designed to be slow.
00:44:11.900
As kind of a Madisonian constitutionalist, I think that things should move deliberatively.
00:44:16.280
But I do think in the meantime, states have the ability to act.
00:44:21.040
They have, you know, the power to do so quickly and to impose some of these safeguards to protect
00:44:30.340
But yeah, I don't support the moratorium, and I didn't in principle, because I do think
00:44:37.400
I think that states should be able to legislate to protect their citizens.
00:44:41.280
And look, if you want to live in California and California wants to go nuts and do something
00:44:45.960
different than Texas does or Florida does, then I kind of think that, you know, states
00:44:50.260
should be able to legislate differently, preserving federalism and the 10th Amendment.
00:44:57.080
And I think that I'm confident that state by state, you'll have actually bipartisan support
00:45:01.660
to protect kids online and to impose some of these other safeguards as well.
00:45:06.660
You know, the Florida Citizens Alliance is spearheading a lot of really interesting
00:45:13.520
I'm very hopeful, actually, that some or all of it will be passed.
00:45:16.980
But one of the ideas that they've put forward, especially in regard to child data protection,
00:45:26.500
So that rather than what most software platforms do and your data is being gathered, it's just
00:45:34.040
the default unless you opt out, all these companies would be required to ask you to opt in on it.
00:45:40.660
And I think that would make an enormous difference in, you know, these companies, these ed tech
00:45:45.000
companies and other kind of child-friendly AI and other apps.
00:45:50.160
They're gathering all this data and creating profiles on children.
00:45:53.620
And it seems to me like the most cynical ploy to manipulate people throughout their entire
00:45:59.980
But I think that the war room posse is totally behind any kind of legal protections for children
00:46:09.540
And the Florida Citizens Alliance efforts are huge.
00:46:12.940
Is there any other kind of legislation like that around the country specifically geared towards
00:46:19.260
child protection that we should keep our eyes on?
00:46:21.920
Yeah, I haven't seen legislation that is that similar, focused on protecting children
00:46:32.880
And if they haven't, that's something we could definitely work on as we work with lawmakers.
00:46:39.260
Senator Josh Hawley has been a leader on that in terms of data privacy and, you know, preventing
00:46:44.120
that from happening, preventing them from just taking all of the data without our permission.
00:46:48.000
And, you know, frankly, I think you're onto something here in that this data collection
00:46:54.020
It's not just so that companies can sell us products or figure out the best way to keep
00:47:00.260
I actually think there's something else going on here, which is that, you know, big tech
00:47:03.960
and big government are starting to get together to create this sort of surveillance state.
00:47:09.200
And again, I know that's been in the works for some time, but I think that big tech
00:47:13.020
and big government are going to create a more powerful surveillance state that is AI powered,
00:47:17.560
AI generated in a lot of ways, and it's going to be used to control the population and sometimes
00:47:22.920
in subtle ways and sometimes in not so subtle ways.
00:47:26.440
And so I think the American people need to understand that this technology is going to
00:47:29.680
have this capability here in the near future, that it can basically see everything that
00:47:35.780
And in some cases, in many cases, it will be able to see what you do offline. Picture, you
00:47:40.820
know, drones flying around and gathering all of this data about your whereabouts and your
00:47:45.100
I don't think that that sort of potentially dystopian future is really all that far away
00:47:52.540
And I think unfortunately there is, there's certainly money to be made from big tech companies
00:48:00.400
So look, I think all of us have to recognize that is a very real possibility.
00:48:04.320
Um, we can point to other countries and talk about some of the control systems that they
00:48:08.740
have, like in China, for example, with their social credit system.
00:48:12.260
But I think we have to fight really hard right now to prevent that from happening in the United
00:48:18.160
Because imagine what, um, the next president could do with that power or the president after
00:48:24.640
that with when the technology increases and when certain people get into power and say,
00:48:29.200
we need to control these, you know, these radicals over here, we need to shut down the
00:48:33.700
so-called misinformation that they're spreading.
00:48:35.840
When misinformation just basically means, you know, to them, we don't like what you're saying.
00:48:40.160
So I really fear that that could be the next level, uh, that we're going to have to deal
00:48:45.780
Well, on that effort to rein this in, I mean, you have guys in the White House right now,
00:48:50.300
David Sachs in particular, and others outside like Mark Andreessen, who are pushing for the
00:48:56.160
companies to basically have carte blanche to do what they want.
00:49:04.400
Should people like Sachs and Andreessen have their say?
00:49:08.800
Look, I think we have to put America first and we have to put our national interests first.
00:49:13.380
And I'm just confused and a little bit befuddled why folks like Sachs and Jensen
00:49:19.420
Huang and Andreessen seem just hell-bent on making sure that our adversary, China, has access to our
00:49:25.080
most powerful chips, uh, why they seem hell bent on just putting their profits and their future
00:49:30.720
profits ahead of the American people and, you know, ahead of kind of protecting the Republic.
00:49:36.460
I mean, I'm not so naive to think that I don't have an idea of what that answer could be,
00:49:41.720
but I'm going to start by, okay, let's give them the benefit of the doubt here, but based
00:49:45.760
on their actions and their comments, uh, Jensen Huang saying, basically, it doesn't matter whether
00:49:51.260
China or the United States wins the AI race or Mark Andreessen, you know, mocking the Holy
00:49:56.760
Father, Pope Leo, uh, on X on something related to AI or David Sachs constantly just setting up
00:50:03.980
straw men arguments and pointing fingers all over the place about why people are concerned about AI.
00:50:09.920
I think they know what they're doing and they're basically just acting in their self-interest.
00:50:14.520
Uh, that's fine, but we have a vote in this too. And by the way, we didn't vote for any of you. And,
00:50:19.860
you know, yesterday you were Democrats and today Republicans and tomorrow you'll be whatever's
00:50:24.460
convenient politically. Uh, but what's convenient for them is basically pursuing their own bottom line,
00:50:29.800
you know, fine, go do that. But you're building a technology that could be so powerful that
00:50:35.920
it could, it could completely transform our way of life. It could, as Sam Altman wants it to do,
00:50:42.180
reorient the social contract. It could do things that would eliminate potentially, you know, a hundred
00:50:47.620
million plus jobs in this country or more. So yeah, we get a vote in that. We get a say in that.
00:50:53.000
And, uh, I don't think that those folks are really putting America first. I don't think they're
00:50:57.300
putting the American people first. And I think they're completely out of touch with where grassroots,
00:51:01.360
regular hardworking conservative voters are. And I think that there's going to be a reckoning for
00:51:06.960
that at some point, if they continue down this path. Brendan, I could not agree with you more
00:51:12.080
on that. Uh, if you would please tell the war room posse where they can find your work at the
00:51:16.880
Alliance for Secure AI. You can go to SecureAINow.org. That's SecureAINow.org. And you can follow
00:51:25.420
us on all the social media handles, @SecureAINow. Thank you very much, sir. I really appreciate it.
00:51:31.940
Look forward to talking to you again. All right, War Room posse, BirchGold.com/Bannon, or text
00:51:41.000
BANNON to 989898. Now through November 30th, get free gold with your
00:51:48.180
qualifying purchase. Also, MyPatriotSupply.com/Bannon, Black Friday survival special: order
00:51:56.120
a four-week emergency food supply, $160 off, and get $150 in free survival gear. That is MyPatriotSupply.com
00:52:06.600
slash Bannon. Until next time, thank you very much. God bless.
00:52:11.620
What if you had the brightest mind in the War Room delivering critical financial research every
00:52:19.800
month? Steve Bannon here. War Room listeners know Jim Rickards. I love this guy. He's our wise man,
00:52:26.300
a former CIA, Pentagon, and White House advisor with an unmatched grasp of geopolitics and capital markets.
00:52:32.560
Jim predicted Trump's Electoral College victory exactly, 312 to 226, down to the actual number
00:52:41.140
itself. Now he's issuing a dire warning about April 11th, a moment that could define Trump's
00:52:47.280
presidency and your financial future. His latest book, MoneyGPT, exposes how AI is setting the stage
00:52:54.460
for financial chaos: bank runs at lightning speed, algorithm-driven crashes, and even threats to
00:53:00.680
national security. Right now, War Room members get a free copy of MoneyGPT when they sign up for
00:53:06.560
Strategic Intelligence. This is Jim's flagship financial newsletter, Strategic Intelligence.
00:53:13.020
I read it. You should read it. Time is running out. Go to RickardsWarRoom.com. That's all one word,
00:53:18.640
Rickards with an S, WarRoom.com. Go now and claim your free book. That's RickardsWarRoom.com.