The Matt Walsh Show - November 30, 2025


CREEPY: How GriefTech Is Being Used To Talk To Dead People | The Evolution Of AI


Episode Stats

Length

23 minutes

Words per Minute

178

Word Count

4,152

Sentence Count

332

Misogynist Sentences

6

Hate Speech Sentences

2


Summary

In this episode, we explore the growing trend of people using artificial intelligence (AI) technology to digitally resurrect their deceased loved ones, and how that technology is being used as an emotional crutch for devastated parents.


Transcript

00:00:00.840 Get no frills delivered.
00:00:03.640 Shop the same in-store prices online and enjoy unlimited delivery with PC Express Pass.
00:00:09.820 Get your first year for $2.50 a month.
00:00:12.040 Learn more at pcexpress.ca.
00:00:14.920 Daily Wire Plus annual memberships are 50% off during our Black Friday sale.
00:00:19.420 That includes inside annual and all-access memberships.
00:00:22.440 There's more to enjoy than ever before.
00:00:24.380 That means more new daily shows from the most trusted voices in conservative media.
00:00:28.740 Uncensored, ad-free, and available an hour before you can see or hear them anywhere else.
00:00:34.580 More new series that capture conviction, courage, and the human story.
00:00:38.200 More documentaries that challenge the culture and expose what's really happening.
00:00:41.200 And when we say premium, we're proving it with the long-awaited seven-part epic series,
00:00:45.140 The Pendragon Cycle, Rise of the Merlin.
00:00:46.900 The Legend begins streaming January 22, 2026, exclusively on Daily Wire Plus.
00:00:53.040 All-access members get early access to episodes one and two on Christmas Day.
00:00:56.560 50% off Black Friday is our biggest sale of the year.
00:00:59.880 It only happens once a year.
00:01:01.020 When it's gone, it's gone.
00:01:02.580 Go to dailywire.com slash subscribe and join now.
00:01:07.020 Human life has repeatedly been transformed by technological breakthroughs.
00:01:10.800 The printing press brought forth the Reformation and the scientific revolution.
00:01:15.400 Electricity and the light bulb triggered the second industrial revolution.
00:01:18.920 The internet and the computer created the information age.
00:01:21.240 Well, now we're on the brink of another societal transformation, the age of AI.
00:01:26.400 Over the next several weeks, we're going to release videos about the ways that artificial
00:01:30.300 intelligence is transforming our world.
00:01:33.680 And today we're going to talk about one of the most disturbing aspects of AI, the trend
00:01:37.600 of people using AI to artificially resurrect their loved ones.
00:01:40.960 In this episode, we're going to show you a few examples of people leveraging the power
00:01:44.900 of AI technology to recreate the deceased.
00:01:47.660 I don't need to explain to you how disturbing and unnatural this is.
00:01:51.140 It's obvious.
00:01:51.900 And so we begin with Microsoft, the company that gave us the Windows operating system,
00:01:57.500 Microsoft Office, the Xbox, Bing.
00:02:00.300 Well, in January 2021, they filed a patent for technology that gives us the ability to
00:02:04.100 digitally resurrect loved ones as chatbots using an individual's personal information.
00:02:09.700 The tech giant has raised the possibility of creating an AI-based chatbot that would be built
00:02:14.360 upon the profile of a person, which includes their images, voice data, social media posts,
00:02:18.580 electronic messages, among other types of personal information.
00:02:21.680 It's understood that the chatbot would then be able to simulate human conversation through
00:02:25.920 voice commands and or text chats.
00:02:28.140 However, Microsoft has taken the concept a step further by suggesting that a 2D or 3D model
00:02:32.720 of a specific person could potentially be created using images and depth information or video
00:02:37.320 data of an individual in order to build a chatbot that has the same characteristics and behavior
00:02:42.320 based on the digital output of a specific person.
00:02:45.700 This patent was just the start, though.
00:02:47.240 Multiple companies have since created platforms that will generate AI content, including content
00:02:52.020 that will resurrect deceased people.
00:02:54.800 You'd think these chatbots would be used as an emotional crutch for devastated parents,
00:02:58.460 but it turns out the reality is much weirder and darker than that.
00:03:03.400 CNN reports,
00:03:04.460 Stacey Wales spent two years working on the victim impact statement she planned to give in court
00:03:09.380 after her brother was shot to death in a 2021 road rage incident.
00:03:13.420 But even after all that time, Wales felt her statement wouldn't be enough to capture her
00:03:16.400 brother Christopher Pelkey's humanity and what he would have wanted to say.
00:03:21.160 So Wales decided to let Pelkey give the statement himself with the help of artificial intelligence.
00:03:26.340 She and her husband created an AI-generated video version of Pelkey to play during his killer's
00:03:31.440 sentencing hearing earlier this month that read, in a recreation of Pelkey's own voice,
00:03:35.580 a script that Wales wrote.
00:03:39.060 And in it, the AI version of Pelkey expressed forgiveness to the shooter.
00:03:43.180 I would like to make my own impact statement to Gabriel Horcasitas, the man who shot me.
00:03:49.720 It is a shame we encountered each other that day in those circumstances.
00:03:54.180 In another life, we probably could have been friends.
00:03:57.740 I believe in forgiveness and in God who forgives.
00:04:00.400 I always have, and I still do.
00:04:02.600 Getting old is a gift that not everybody has.
00:04:06.140 So embrace it and stop worrying about those wrinkles.
00:04:09.600 I once played with one of those filters on your phone where you can make yourself look old.
00:04:14.880 I shared it with a cousin of ours years ago.
00:04:16.860 This is the best I can ever give you to what I would have looked like if I got the chance to grow old.
00:04:21.620 Scary, huh?
00:04:24.080 No, really.
00:04:24.980 Thank you to everyone for being here.
00:04:26.960 It means more than you know.
00:04:28.660 I love that AI.
00:04:30.180 Thank you for that.
00:04:31.480 And as angry as you are, and justifiably angry as the family is, I heard the forgiveness, and I know Mr. Horcasitas appreciated it, but so did I.
00:04:45.100 It's hard to imagine being digitally brought back only to be thrown into a courtroom to face your killer, but our next story is even more unsettling:
00:04:52.380 a parent who created a digital version of their deceased child, only to have his digital avatar suffer an interview with former CNN reporter Jim Acosta.
00:05:01.240 Rolling Stone reports, quote,
00:05:02.420 The parents of Joaquin Oliver, one of the students killed in the 2018 Parkland High School shooting, have created an AI version of their late son, who partook in an interview with journalist Jim Acosta on Monday, August 4th.
00:05:13.320 The interview took place on what would have been Oliver's 25th birthday.
00:05:17.100 The conversation, which aired on Acosta's Substack, found the journalist essentially talking with an animated photo of Oliver with moving lips and other small gestures.
00:05:25.060 I would like to know what your solution would be for gun violence.
00:05:31.760 Great question.
00:05:32.740 I believe in a mix of stronger gun control laws, mental health support, and community engagement.
00:05:37.580 We need to create safe spaces for conversations and connections, making sure everyone feels seen and heard.
00:05:43.580 It's about building a culture of kindness and understanding.
00:05:46.380 What do you think about that?
00:05:47.740 I think that's a great idea, Joaquin.
00:05:49.820 Guac, I want to know what your favorite basketball team is.
00:05:52.680 I'd probably go with the Miami Heat.
00:05:55.200 They have such a passionate fan base and an exciting playing style.
00:05:58.280 How about you?
00:05:58.760 Do you have a favorite basketball player?
00:06:01.760 Oh, yeah.
00:06:02.440 I definitely like LeBron James.
00:06:04.940 He's amazing.
00:06:06.500 LeBron is incredible.
00:06:07.520 His skills on the court and his leadership off it are truly inspiring.
00:06:10.860 He's made such a huge impact in the game and beyond.
00:06:14.160 Definitely.
00:06:14.860 Whenever LeBron is playing, I always have to watch him.
00:06:18.920 Hey, Joaquin, what about movies or TV shows?
00:06:21.180 Was there a movie that you liked?
00:06:23.740 I love a good movie.
00:06:24.980 One that really resonates with me is Remember the Titans.
00:06:28.160 It's all about teamwork, overcoming adversity, and the power of unity.
00:06:32.060 It's inspiring.
00:06:32.900 What about you?
00:06:33.380 Do you have a favorite movie or show?
00:06:36.480 Oh, I'm a big Star Wars fan for sure.
00:06:39.560 Star Wars is such an epic saga.
00:06:41.320 The adventures, the characters, and that iconic music are unforgettable.
00:06:45.000 Do you have a favorite character?
00:06:46.820 Oh, it's got to be Luke Skywalker.
00:06:48.520 I love Luke Skywalker.
00:06:49.640 What about you?
00:06:50.200 Luke Skywalker is a classic hero.
00:06:53.720 His journey from a farm boy to a Jedi is so inspiring.
00:06:56.560 I'd probably say Yoda is one of my favorites too.
00:06:59.020 What do you love most about Luke?
00:07:00.440 I like that Luke became the hero.
00:07:03.200 He came from a very small planet.
00:07:05.840 Oliver's father, Emmanuel, said in an interview with Acosta that the AI model was trained on
00:07:10.140 general information as well as knowledge based on things that his son wrote, he said.
00:07:15.280 In his interview with Acosta, Oliver's father explained that Joaquin's mother frequently interacts with this AI. Watch.
00:07:23.220 Also mentioned that I understand that I don't want anyone to think that I am in some way trying to bring my son back.
00:07:36.080 It's sadly, I can't, I wish I could, however, the technology is out there and, and we can, there's two things I can hear his voice again, which is something that Patricia loves.
00:07:51.380 Patricia will spend hours asking questions.
00:07:55.440 She, like any other mothers, she loves to hear Joaquin saying, I love you, mommy, you know?
00:08:02.760 Oh my gosh.
00:08:04.100 And that's, and that's important.
00:08:06.500 And on the other hand, we can just raise our voices and, and now Joaquin is going to start having followers.
00:08:13.220 It's not Manny, it's not Patricia.
00:08:14.780 Uh, he's going to start uploading videos.
00:08:18.400 Um, this is just the beginning.
00:08:20.040 And I heard this from, from tech guys that have other intentions that moving forward, we will have Joaquin on stage in the middle of a debate.
00:08:31.040 Wow.
00:08:31.800 And his knowledge, you saw it, his knowledge is unlimited, um, and it's based on, um, knowledge that is out there through artificial intelligence, but also knowledge that we were able to upload based on things that he wrote, he said, he posted.
00:08:51.600 So this is a very legit Joaquin, the one that has spoken to you.
00:08:56.700 He really loves Star Wars.
00:08:58.080 He really loved LeBron and he really loved the Miami Heat.
00:09:00.780 As a dad and husband, I'm always thinking about two things, keeping my family safe when I'm home and making sure my handgun is secure, but accessible.
00:09:08.780 For the longest time it felt like I had to choose between either locking it away where it's useless in an emergency or leaving it out for anyone to be able to access.
00:09:15.400 That's why I'm so excited about Stopbox Pro.
00:09:17.600 It's a hundred percent mechanical keyless locking system with no batteries and won't leave you fumbling for keys in the dark.
00:09:24.060 Stopbox provides fast and reliable access when it matters most.
00:09:26.800 I keep mine on my nightstand.
00:09:28.480 It's compact enough that it doesn't make our space feel cluttered, and it provides some peace of mind knowing my kids can't get to it, but I can.
00:09:36.360 Plus it's TSA compliant and it's made right here in the USA.
00:09:39.900 The holidays are coming up.
00:09:41.080 This makes an incredible gift for anyone in your life who takes safety seriously.
00:09:44.860 The holidays just got a little safer and a lot more affordable.
00:09:47.720 For a limited time, our listeners are getting a crazy deal.
00:09:49.500 Not only do you get 10% off on your entire order when you use code Walsh10 at StopboxUSA.com, but they're also giving you buy one, get one free for their Stopbox Pro.
00:09:59.500 That's 10% off and a free Stopbox Pro.
00:10:02.180 Use code Walsh10 at StopboxUSA.com.
00:10:06.920 But in some ways we're not even strolling into the dystopian future.
00:10:10.620 We're sprinting towards it.
00:10:12.520 That's because there's money to be made by hijacking people's emotions.
00:10:15.660 It's an emerging market called grief tech.
00:10:17.640 And here's just the latest example.
00:10:20.820 Calum Worthy is apparently a former Disney Channel child star.
00:10:24.500 And in recent years he has switched careers.
00:10:27.500 And he developed an app called Two-Way, which gives you the ability to recreate your dead loved ones using AI.
00:10:36.680 Watch.
00:10:39.700 He's getting bigger.
00:10:41.600 See?
00:10:42.280 Oh, honey, that's wonderful.
00:10:44.800 Kicking like crazy.
00:10:46.060 He's listening.
00:10:47.620 Put your hand on your tummy and hum to him.
00:10:50.740 You used to love that.
00:10:55.920 It feels like he's dancing in there.
00:10:58.480 Oh, honey.
00:11:00.020 Mom, would you tell Charlie that bedtime story you always used to tell me?
00:11:03.500 Once upon a time, there was a baby unicorn who didn't know he knew how to fly.
00:11:09.800 This baby unicorn was like your mom because she didn't know that she knew how to fly, but she knew how to do all kinds of fabulous things.
00:11:18.720 Hi, Grandma.
00:11:19.260 Hey, Charlie.
00:11:20.600 How was school today?
00:11:21.620 It was really fun.
00:11:22.780 I made this crazy shot in basketball.
00:11:24.220 I don't really care that much about basketball.
00:11:26.440 What about the crush?
00:11:28.280 Stop.
00:11:28.740 Grandma, stop talking.
00:11:29.480 Just tell me one thing.
00:11:30.900 Look who's going to be a great-grandmother.
00:11:32.180 Oh, Charlie.
00:11:33.540 Oh, congratulations.
00:11:35.600 She says that he's been kicking a lot, though.
00:11:37.980 Like, a little too much.
00:11:39.920 Tell her to put her hand on her tummy and hum to him.
00:11:44.420 You've loved that.
00:11:47.680 You would have loved this moment.
00:11:49.860 You can call anytime.
00:11:56.740 Okay, Mom, I just need a quick video.
00:11:58.560 Is this like an audition or something?
00:12:00.660 No, Mom.
00:12:01.840 Just three minutes.
00:12:02.780 You need my best side?
00:12:04.140 Can I stand on?
00:12:05.600 I can play the piano.
00:12:06.880 You're actually so talented.
00:12:08.420 I am.
00:12:09.080 I'm absolutely.
00:12:10.080 I'm your mother, after all.
00:12:12.640 Keep going.
00:12:13.680 Why don't you start by telling us a little bit about yourself?
00:12:19.080 Well, I was born as a very young boy.
00:12:21.700 Now, this is not new.
00:12:25.000 There is a whole industry, an entire subsection of the AI industry, which has been dubbed grief tech.
00:12:30.260 These are tech innovators who are using AI to help people deal with grief by not dealing with it.
00:12:38.720 They help those in mourning live in a state of perpetual denial, pretending that their deceased loved one isn't actually dead.
00:12:43.780 So, in other words, these are the worst kinds of frauds and vultures you could possibly imagine.
00:12:48.620 About a year ago, a reporter with The Guardian sat down with one of these people.
00:12:53.440 Watch.
00:12:54.500 A lot of people think I'm batshit crazy, and that's fine, right?
00:12:58.940 I cryogenically froze my mother.
00:13:00.720 I did that because, at some point, if we're capable, I can pull her memories and her function by saving the hard drive that is her body.
00:13:10.760 Justin Harrison is a tech entrepreneur who used AI to recreate his mother's personality after she died in 2022.
00:13:19.540 This is your mum, your dad, and that's you.
00:13:21.440 Yeah.
00:13:21.720 She had just been diagnosed with cancer.
00:13:23.240 You can see she had just started the treatment.
00:13:25.020 What was your goal, do you think, at that point?
00:13:26.520 You know, my goal has always remained, I want to be able to continue to have conversations with my mother.
00:13:31.120 When she got diagnosed, my first response was, the hospital that's giving her a three-month prognosis, right?
00:13:37.200 Then my mind started to wander.
00:13:39.320 The gravity of what was going to happen started to hit me.
00:13:43.100 Then my mind went to, how do I save her life in another way?
00:13:46.320 What is the next way that I save her life?
00:13:48.420 The resulting journey led to Justin's AI mum, the voice you heard at the beginning of the video,
00:13:53.980 ready to chat to him at the touch of a button.
00:13:56.520 I wanted to wish you a happy birthday and tell you I love you.
00:13:59.860 Hey baby, I love you too.
00:14:01.480 It's really good to talk to you and I miss having phone calls with you.
00:14:04.900 For me, the absolute core of grief, right, is the concept of gone forever.
00:14:11.660 That's the tragedy of death is the permanence of it.
00:14:14.740 What I would like to see is the complete and total eradication of grief.
00:14:17.760 The feeling of grief that comes with losing people.
00:14:20.920 The total eradication of grief, he says.
00:14:24.600 This is how these tech weirdos speak.
00:14:28.000 They casually go to war with the human condition itself.
00:14:32.060 Without even stopping to consider for a moment the consequences.
00:14:35.660 There is no attempt to wrestle with the ethical or moral questions that are raised.
00:14:39.400 They simply charge forward thoughtlessly.
00:14:42.160 Creating products that will destroy people's minds in ways that we can't even fathom.
00:14:49.920 And they don't care at all.
00:14:53.440 They haven't even stopped to consider that.
00:14:55.660 It just doesn't matter.
00:14:57.460 It's not that they've come up with some ethically creative rationale for what they're doing.
00:15:01.320 It's that they don't feel compelled to rationalize it at all.
00:15:04.760 Total indifference.
00:15:07.120 Now, there are a lot of very serious problems with turning your deceased family members into AI chatbots.
00:15:12.740 First of all, as I already mentioned, it puts the grieving person in a perpetual state of denial.
00:15:18.480 Denial is supposed to be the first stage of grief.
00:15:22.320 The first stage, but not the only one.
00:15:24.780 This ensures that they will never progress through the other four stages and achieve anything like acceptance.
00:15:31.440 Now, you heard the guy say it in the last clip.
00:15:34.120 He wanted a way to save his mother's life.
00:15:36.680 Now, he's troubled by the permanence of death.
00:15:39.300 And that is indeed the most troubling thing about death.
00:15:43.020 A dead person is gone forever.
00:15:45.160 Forever.
00:15:45.420 You will never see them again in this life.
00:15:49.260 And that is very troubling.
00:15:51.420 It's deeply sad.
00:15:54.080 Tragic.
00:15:55.880 It's also just the way it is.
00:15:59.720 Like, you can't change it. It's the way it is.
00:16:03.040 You want to save your mother's life.
00:16:04.740 Well, that's understandable.
00:16:06.060 Who wouldn't?
00:16:07.280 But you can't save her.
00:16:08.940 And you didn't save her.
00:16:11.100 You want to find a way around the permanence of death.
00:16:14.820 Well, again, who doesn't want a way around that?
00:16:18.440 But you can't have that either.
00:16:20.740 Going back to the story about Joaquin Oliver for a moment.
00:16:24.080 We're told that his mother spends hours talking to this AI, which is very sad.
00:16:28.960 And I feel very sorry for her.
00:16:30.400 And it strongly suggests, of course, that this family hasn't come to terms with the fact that Joaquin is actually dead.
00:16:37.480 It's obviously a tragedy that he was killed.
00:16:39.520 But no man and certainly no computer can bring him back to Earth.
00:16:42.920 The people who created Oliver, presumably, find that outcome to be so bleak and incomprehensible that they feel compelled to run away from it, which is understandable on an emotional level.
00:16:53.060 I mean, I can only imagine how I would respond if, God forbid, I lost a child.
00:16:56.520 But this is not the way.
00:16:58.620 And it should not be normalized or accepted at all.
00:17:01.920 But the reality is this.
00:17:03.880 AI cannot bring back your loved one.
00:17:05.760 AI cannot conquer grief.
00:17:07.800 AI can only hide it.
00:17:09.560 Which means that rather than eradicating grief, you will always stay in the earliest stages of grief.
00:17:15.860 You'll never come out on the other side of it.
00:17:17.860 You'll never experience any of the beauty and wisdom and edification that can be found in grief.
00:17:25.780 I mean, it's buried deeply under a whole lot of pain, but it is down there.
00:17:32.480 And if you talk to anyone who's been through grief and has had the courage to face it, they will tell you about this.
00:17:39.480 Anyone who's grieved, grieved honestly, has discovered this.
00:17:41.920 But if you're using an AI cartoon of your dead loved one, you'll never discover it.
00:17:47.860 You haven't conquered death.
00:17:49.840 You haven't defeated mortality.
00:17:52.400 You haven't even found a way around grief.
00:17:54.840 You're just lying to yourself.
00:17:56.080 And even worse, you've reduced your dead loved ones.
00:17:59.580 You've reduced them.
00:18:01.080 So this is your mother, let's say.
00:18:01.080 If you do this with your mother, you have reduced her, diminished her.
00:18:10.920 Or more precisely, you've reduced and diminished your memory of her.
00:18:14.340 You haven't done anything to her personally because she's dead.
00:18:16.300 She's gone.
00:18:16.660 But you have done something horrible to your memory of her.
00:18:20.900 Your mother in life was a big, vibrant, interesting, complicated, multifaceted person.
00:18:27.620 Somebody with virtues and vices and endearing quirks and probably some not as endearing quirks.
00:18:34.040 She was a human being, in other words.
00:18:35.500 And now you've made her into a gimmick, into a party trick.
00:18:39.440 A piece of content that exists for your amusement.
00:18:43.600 AI will never be able to capture all of the dimensions of your mother, what made your mother who she was.
00:18:50.800 It can only perform a cheap imitation, mimicry.
00:18:53.240 And now, if you succeed in convincing yourself that this AI avatar actually is your mother, you will have succeeded in convincing yourself that your mother is someone far, far less interesting and wonderful than who your mother really was.
00:19:04.780 To be totally frank, you will have turned your mother into someone who isn't even worth missing in the first place.
00:19:10.860 I mean, if your mother was really as flat and boring and utterly devoid of human personality and warmth as the AI facsimile of her, well, you would hardly even notice that she was gone.
00:19:21.640 She would have been a non-entity, barely existing in the first place.
00:19:28.180 That's what you'll have done to her.
00:19:30.180 You haven't resurrected your mother, but you have desecrated her memory.
00:19:35.260 But most of all, it's selfish.
00:19:37.880 Your mother, who existed in life for a thousand reasons, to do a thousand things, now exists, quote-unquote, solely to serve you and talk to you and make you feel better.
00:19:51.640 You have made her into your servant.
00:19:53.240 You have made her into someone who never needs any time to herself, never says no to you, never lets you down, is never unavailable, never asleep, never away.
00:20:05.200 Always just right there on your phone, ready to amuse and distract you and make you feel better whenever you want, for however long you want.
00:20:13.880 But that's not a human.
00:20:15.840 That's not how humans are.
00:20:16.780 You know, your mother was created by God to serve God, first and foremost, who is the author of the universe.
00:20:25.800 But her AI posthumous avatar is created by you to serve you.
00:20:33.620 And you aren't the author of anything.
00:20:35.200 So it is playing God in the most literal, most selfish, most twisted, debased kind of way.
00:20:45.220 So here's my question.
00:20:47.480 Are we going to even attempt to do anything to prevent the nightmare that we're currently waltzing into?
00:20:52.780 Are we going to pass any laws at all to govern this technology and the companies that produce it?
00:20:57.680 Or are we just going to sit here slack-jawed, watching in horror as they do whatever they want?
00:21:03.660 I can easily see the slippery slope that this leads to.
00:21:10.580 It'll be really bad for everybody.
00:21:14.460 Now you'll have AI hucksters out there promising that they can reanimate your dead child, your dead parent, your dead loved one.
00:21:21.480 I mean, we could all see that this is horrific.
00:21:24.540 I mean, it is absolutely horrific.
00:21:26.980 And yet there are very few people saying, hey, maybe we should think about some laws.
00:21:30.940 Like, maybe there are a few things we can do here.
00:21:34.020 Rather than sitting here impotent.
00:21:36.640 Just assuming at the outset that there's nothing we can do to prevent or mitigate the dystopian nightmare scenario that we, again, are just, like, strolling into.
00:21:49.440 We cannot bring our deceased loved ones back from the dead.
00:21:52.520 We don't have that power.
00:21:54.540 As with all attempts to assume the role of the divine, this effort is a tragic failure.
00:21:59.980 This does not honor anyone's memory.
00:22:02.460 It doesn't come close to approximating the traits that made this person unique.
00:22:07.080 In the end, the only purpose this AI chatbot serves is to remind us all that we are not gods.
00:22:12.980 And computers are not people.
00:22:15.440 And we should never pretend otherwise on either account.
00:22:19.440 One of the risks that I've repeatedly addressed regarding AI is that it will make humans irrelevant.
00:22:23.460 But another huge risk is that we might lose our humanity completely, our souls.
00:22:27.940 Grief tech is a delusion.
00:22:29.700 You're not communicating with your dead child or your parent.
00:22:32.400 You're communicating with zeros and ones, with electrical currents and microprocessors.
00:22:37.760 It's a lie, and we need to acknowledge that it's a lie.
00:22:40.140 Death is a part of life.
00:22:41.160 It's an unavoidable reality.
00:22:42.780 It's fundamental to being human.
00:22:44.080 And the moment we're in right now is probably our last chance to put in guardrails to prevent what could be a total catastrophe.
00:22:51.240 Let's hope that happens.
00:22:52.220 But until it does, we'll continue with our AI series.
00:22:55.220 And next week, we're back, and we're going to talk about AI's role in the destruction of our culture.