The Glenn Beck Program - March 29, 2025


Ep 251 | Glenn Beck WARNS: This New 'God' Could Destroy Humanity | The Glenn Beck Podcast         


Episode Stats

Length

52 minutes

Words per Minute

143.9

Word Count

7,617

Sentence Count

557

Misogynist Sentences

7

Hate Speech Sentences

24


Summary

In this episode, Glenn Beck sits down with Willis Willis to talk about the dangers of artificial intelligence and why we need to focus on God: not the gods of the ancient myths, or the pleasant God of Sunday school, or the subjective God we would all like him to be, but the objective God.


Transcript

00:00:00.000 Bank more encores when you switch to a Scotiabank banking package.
00:00:07.180 Learn more at scotiabank.com slash banking packages.
00:00:10.860 Conditions apply.
00:00:12.720 Scotiabank. You're richer than you think.
00:00:16.200 And now, a Blaze Media Podcast.
00:00:19.600 Hello, America. You know we've been fighting every single day.
00:00:22.360 We push back against the lies, the censorship,
00:00:25.140 the nonsense of the mainstream media that they're trying to feed you.
00:00:28.400 We work tirelessly to bring you the unfiltered truth because you deserve it.
00:00:33.580 But to keep this fight going, we need you.
00:00:36.060 Right now, would you take a moment and rate and review the Glenn Beck Podcast?
00:00:39.760 Give us five stars and leave a comment because every single review helps us break through
00:00:44.320 Big Tech's algorithm to reach more Americans who need to hear the truth.
00:00:48.580 This isn't just a podcast. This is a movement.
00:00:51.580 And you're part of it, a big part of it.
00:00:53.400 So if you believe in what we're doing, you want more people to wake up,
00:00:56.080 help us push this podcast to the top.
00:00:58.400 Rate, review, share.
00:01:00.120 Together, we'll make a difference.
00:01:02.240 And thanks for standing with us.
00:01:03.500 Now let's get to work.
00:01:04.620 I'm sitting here on stage 19 all by myself, usually for a podcast.
00:01:09.420 I've brought in a guest.
00:01:10.940 We're going to talk about something.
00:01:12.100 But I wanted today to just talk to you one-on-one.
00:01:16.220 I am working on some things and investigating AI.
00:01:25.160 And this is something that I have worried and cautioned you about for almost three decades now.
00:01:33.920 Seeing it come over the horizon.
00:01:36.480 And as Elon Musk says, we are now at the event horizon of the singularity, which means you're not going to stop what's coming.
00:01:44.140 We are now in this place where just the gravity of the technology is going to pull us in.
00:01:49.560 And we cannot avoid that.
00:01:52.860 And we can't avoid using AI because what is happening, as everyone you know will soon understand, is going to be one of the most miraculous and incredible technologies.
00:02:06.980 We are living, we're so blessed to be living right now.
00:02:10.980 But as I am working with some really brilliant minds, I am filled with ethical questions.
00:02:18.920 And I've been keeping a journal of all of these questions; they have to be asked by each of us.
00:02:26.620 And so I wanted to do something different. I want to ask you: set aside your distractions, stop your notifications, turn off your screen,
00:02:36.160 the device that will think for you. And listen, just turn on your ears, turn on your heart and turn on your soul and really hear what I want to talk to you about today.
00:02:50.140 Because we're in this one together and it's going to be a personal journey for each of us.
00:02:55.880 I know I want to talk about AI, but I think we need to talk first about God and not the gods of, you know, the ancient myths or even the pleasant God of Sunday school
00:03:04.100 or the God we use as a tool to win an argument, win an election, sell books, gain unearned trust.
00:03:13.280 Not the subjective God of our own creation or the God that we would all like him to be.
00:03:20.160 I want to talk to you about the objective God.
00:03:23.420 I've always said that if God exists, he is the greatest scientist.
00:03:29.000 He is the greatest mathematician.
00:03:31.580 That's who God is.
00:03:33.500 God is a God of reason and of precision.
00:03:38.160 But he also is many other things.
00:03:40.920 But I want to focus on the precision of the watchmaker.
00:03:45.980 Think of a watch.
00:03:47.000 You know, maybe your grandfather had a pocket watch or this is a watch that's 60 years old.
00:03:55.120 And it is just a, it's precision.
00:03:58.760 It's beautifully crafted.
00:04:00.560 It's finely tuned.
00:04:02.680 And everything is in its place.
00:04:06.180 Imagine the precision of each single tiny little gear.
00:04:11.020 The meticulous placement of the springs.
00:04:14.700 The perfect alignment of the hand sweeping across the face.
00:04:18.240 There are watches now that are made by hand that take one person a year to make.
00:04:25.700 This, just this watch, didn't just happen.
00:04:29.100 It's not as if I put a bunch of stuff in a box, shook it up, and boy, look at this watch.
00:04:33.280 It requires a designer.
00:04:35.560 It requires intellect, a mind, an architect.
00:04:38.920 Now, a watch is complex, but not compared to the universe.
00:04:43.960 Now, let's look at the universe.
00:04:45.980 The vast, endless cosmos that is above us.
00:04:50.220 Our Earth is spinning just fast enough to generate a stable atmosphere, but not so fast that we are flung out into space.
00:04:59.000 Our moon is positioned so perfectly that it stabilizes our planet's axis, our tides, even the tides within our own bodies.
00:05:08.300 And our own bodies are miracles.
00:05:10.220 The way the body regulates temperature, the way it heals wounds, experiences love.
00:05:15.620 That's a biochemical cocktail that is so exact that even the most advanced neuroscience cannot fully explain how all of this works.
00:05:26.900 Just love, even.
00:05:30.560 And in the world, we are told this is an accident.
00:05:34.220 Now, you believe whatever you want about God, but this is where they lose me, because randomness does not create order.
00:05:44.060 It goes the other way.
00:05:46.500 Randomness creates chaos.
00:05:48.420 It doesn't build watches.
00:05:49.900 It doesn't write a symphony.
00:05:51.460 It doesn't paint the ceiling of a Sistine Chapel or send a man to the moon.
00:05:55.100 It's not random.
00:05:56.000 Randomness is what happens when a child spills a bucket of Legos.
00:06:01.620 Never does your child spill a bucket of Legos and you're like, oh my gosh, look, he spilled it and it all just assembled itself into the Eiffel Tower.
00:06:10.120 It doesn't.
00:06:11.920 Everything was created first here.
00:06:16.020 Everything was created by a creator.
00:06:19.160 And so we arrive at our first critical question.
00:06:24.340 And it's not does God exist, but rather, honestly, how could he not?
00:06:32.260 There's a law in science and it's an inescapable rule.
00:06:36.680 It's called entropy.
00:06:39.120 Entropy is what happens when you put a cup of hot coffee down and you leave it there.
00:06:44.920 It gets cold when it's left alone.
00:06:46.960 Your house falls apart if you don't maintain it.
00:06:50.420 Okay.
00:06:51.040 It's why a new car will rust.
00:06:52.780 A candle will burn down.
00:06:54.260 A civilization will crumble.
00:06:56.020 Entropy states that everything moves from order to disorder.
00:07:01.820 Everything decays.
00:07:03.260 Everything dies.
00:07:04.560 That's the way the universe is set up.
00:07:06.940 You don't wake up one morning to find your old broken down Chevy.
00:07:11.060 Look at that.
00:07:11.620 It's now a Tesla.
00:07:13.980 It's an electric car.
00:07:15.080 You didn't leave a pile of scrap metal in your backyard and come out and go, oh, my gosh, it's a skyscraper, or even a tin shed.
00:07:24.420 It doesn't do that.
00:07:25.520 Yet people want you to believe that this is exactly how the universe happened.
00:07:32.760 That somehow, in violation of the very laws of physics, in defiance of observable reality, something exploded out of nothing.
00:07:42.700 There was no one there lighting the match.
00:07:44.820 Nothing came before it.
00:07:45.860 It just boom.
00:07:46.940 And then chaos arranged itself into order.
00:07:51.240 The stars aligned themselves.
00:07:53.040 The earth formed itself, tilted itself at just the right angle.
00:07:57.060 Water appeared.
00:07:58.000 Life crawled from the sea, developed a consciousness, and ultimately became us, who are now capable of questioning the very process that led to our existence.
00:08:07.840 And that doesn't make sense.
00:08:09.100 I don't think it's reasonable.
00:08:10.200 I think that is wishful thinking disguised as reason.
00:08:15.360 Look, you're going to figure out here in a minute why this is so important.
00:08:20.620 But these questions have to be answered by each of us now before we hit the singularity, which is probably by the end of this term or early into the next term of our next president.
00:08:33.760 Now, faith isn't rational.
00:08:35.980 That's what some will say.
00:08:36.920 You can't prove it.
00:08:37.880 Right, well, you can't really prove the theory of relativity either.
00:08:41.660 It's a theory.
00:08:43.060 You can't prove it.
00:08:44.120 You can observe it in the world around you.
00:08:46.680 And like God, you can see the effects of relativity and make a reasonable assertion that, yep, that's probably true.
00:08:54.420 But you don't really know and can't prove it.
00:08:57.880 The fine-tuning of the universe is what always gets me.
00:09:01.100 It is so exact that physicists now describe it with terms like impossible, miraculous.
00:09:09.320 Why?
00:09:09.680 The odds of our universe being in a box or a bucket of Legos and then just spilling out and forming the way it does, the way it allows for life, are so astronomically small that they are effectively zero.
00:09:24.220 Let's just look at the force of gravity.
00:09:27.600 It is so precisely balanced that if we altered it by just one part in 10 to the 40th power, the universe as we know it would be uninhabitable.
00:09:40.140 That's a 1 followed by 40 zeros.
00:09:45.040 The odds, they are so small it makes winning the lottery look like a guarantee.
00:09:51.680 The expansion rate of the universe is so exact that if it varied by one part in 10 to the 55th power, a 1 followed by 55 zeros, the cosmos would either collapse on itself or expand so rapidly no galaxy would form.
00:10:11.220 The probability of one functional protein, just one, a functional protein forming by chance is roughly 1 in 10 to the 164th power.
00:10:27.700 So, you know, there are only about 10 to the 80th power atoms in the entire observable universe.
00:10:36.040 That's not chance.
00:10:37.820 There's no way that's happening.
00:10:39.360 That is design.
00:10:43.600 But here's where we get really arrogant.
00:10:47.200 We no longer look up at the sky and say, my God, what a masterpiece.
00:10:54.440 We instead flippantly say, we can do better.
00:10:59.460 Enter artificial intelligence.
00:11:02.620 Give me one minute and we're going to get right back into it.
00:11:05.500 The perils of AI and the glories of it.
00:11:08.020 First of all, let me show you some of the ways that it's going to be very, very helpful.
00:11:13.020 Medicare costs are a silent thief.
00:11:16.360 Thousands of your dollars just vanish if you pick the wrong plan.
00:11:19.720 I didn't know this.
00:11:20.960 There are a lot of Americans who, you know, are taken in by these slick advertisers and these insurance companies that say they're going to help you along the way make the right decision.
00:11:32.680 You'll only find out later that things like co-pays are now bleeding you dry.
00:11:37.900 And if you pick the wrong path, many times you cannot ever go back.
00:11:44.980 Chapter is a different way to select Medicare.
00:11:49.640 I've met with these people personally, and I know that they founded the entire company specifically because their own parents got taken with terrible Medicare programs.
00:12:01.380 Abused is not the nicest word, but it's not a strong enough word for what is happening to people that cannot figure out what's going on because the government has made it so complex.
00:12:12.360 And then the insurance companies don't want you in the plans that are going to cost them money.
00:12:17.480 They want you in the plans that are going to make you pay money to them.
00:12:21.600 They didn't want that to happen to anybody else's parents, and they were in high, high, high, high tech.
00:12:27.300 Well, at Chapter, they won't just guide you.
00:12:30.520 They'll talk to you, listen to you, see what you need.
00:12:33.240 And then they'll search every plan from every carrier with technology that is so sharp it will cut through all of the noise.
00:12:40.460 These are licensed advisors that don't have any hidden agendas.
00:12:45.420 Nobody's paid a commission for what they jam you into.
00:12:48.980 Other Medicare advisors might cherry-pick plans that pad their pockets.
00:12:53.280 Chapter puts you first.
00:12:55.500 I want you to dial pound 250, say the keyword chapter.
00:12:58.060 If anybody in your life, maybe it's you, maybe it's your parents, if they're getting into Medicare, you need to hit pound 250, say the keyword chapter.
00:13:06.300 Or you can go to askchapter.org slash back.
00:13:09.280 We have one chance at this.
00:13:11.660 You need to make it count.
00:13:13.400 Chapter's your move for anything related to Medicare.
00:13:16.700 Pound 250, keyword chapter.
00:13:18.980 Askchapter.org slash back.
00:13:22.180 Claudia was leaving for her pickleball tournament.
00:13:24.360 I've been visualizing my match all week.
00:13:26.880 She was so focused on visualizing that she didn't see the column behind her car on her backhand side.
00:13:32.760 Good thing Claudia's with Intact, the insurer with the largest network of auto service centers in the country.
00:13:38.240 Everything was taken care of under one roof, and she was on her way in a rental car in no time.
00:13:43.060 I made it to my tournament and lost in the first round.
00:13:46.500 But you got there on time.
00:13:48.360 Intact Insurance, your auto service ace.
00:13:50.980 Certain conditions apply.
00:13:51.900 Okay.
00:13:57.780 We're at AI.
00:14:00.040 AI is something that I have warned about since the 90s, when nobody was talking about AI.
00:14:08.320 When everybody thought, and parts of me thought, this can never happen.
00:14:13.360 That will never happen.
00:14:16.120 When I had somebody like Ray Kurzweil, one of the greatest minds on Earth on AI, and the head of the Singularity University,
00:14:23.880 he said to me in, I don't know, 2005, 2007, Glenn,
00:14:27.200 just live until 2030, because then there will be no death.
00:14:32.480 It bothered me, because he doesn't believe in a soul.
00:14:36.360 He just believes in the calculations of our brain.
00:14:39.200 That's what makes a life a life.
00:14:42.440 I disagree with that.
00:14:43.460 So, the questions surrounding artificial intelligence seem so out there, so futuristic.
00:14:51.660 It's hard to imagine that our ancestors ever dealt with anything like this.
00:14:56.800 And in some ways, they didn't.
00:14:58.360 We're the first humans to face what's just ahead of us.
00:15:03.060 Except, there is something built in us that warns us, something that we've been trying to understand.
00:15:16.200 Humanity wrestled with the deeper issues of AI long before AI existed.
00:15:21.340 When electricity first started, Mary Shelley, the author of Frankenstein, wrestled with what it meant,
00:15:29.180 because they were zapping electricity into frogs, and the frogs were moving, and they didn't know,
00:15:34.280 is that life? Can we bring it back to life?
00:15:35.980 That's why she wrote Frankenstein.
00:15:40.800 She might have also been inspired by an even older story,
00:15:44.520 one that dates back to a rabbi in the 1500s.
00:15:48.000 This one is according to Jewish folklore.
00:15:51.800 This rabbi created his own Frankenstein, called him a golem.
00:15:55.820 Now, in Hebrew, golem appears only one time in the Bible, in Psalms,
00:16:01.380 and it references an unformed state of man.
00:16:05.600 It literally translates to shapeless mass.
00:16:10.120 Now, some Jewish texts suggest that Adam was a golem,
00:16:13.820 unformed until he was given a soul, the breath of God.
00:16:18.560 The idea was that man, without the spark of God inside of him,
00:16:23.040 was unformed, a golem.
00:16:24.780 Now, this story, which the Nazis, or at least the Germans,
00:16:29.560 made into a movie right before World War II,
00:16:34.380 it's an odd film.
00:16:37.400 The story that goes with this rabbi in Prague,
00:16:41.480 he creates a golem to protect the Jewish community
00:16:44.060 from anti-Semitic attacks against them.
00:16:47.460 And at first, it works.
00:16:48.800 Now, the golem was created by shaping dirt together
00:16:52.560 and then speaking incantations over it, like code,
00:16:56.180 to bring it to life.
00:16:57.980 The golem was created almost human, but not quite.
00:17:04.640 It can't speak.
00:17:06.380 It was kind of like a big baby,
00:17:08.320 but it was also much, much, much stronger
00:17:10.440 than anyone could have imagined,
00:17:13.300 and incredibly helpful.
00:17:16.760 The golem was a symbol of strength, protection,
00:17:20.080 and even the provisions of God.
00:17:23.080 But like Dr. Frankenstein,
00:17:25.180 the people lost control of the golem.
00:17:27.820 It became too powerful,
00:17:29.040 and it started wrecking everything.
00:17:30.640 Then the question was,
00:17:33.020 can we put the golem back into the box?
00:17:36.060 And if you could get rid of it,
00:17:38.200 then who would step up and do all the important work
00:17:40.320 that the golem had been doing
00:17:41.940 before it got so powerful?
00:17:45.100 These are exactly the same questions
00:17:47.460 that we are or should be asking ourselves today about AI.
00:17:53.040 If we make it,
00:17:54.940 can we control it?
00:17:56.140 And by the way,
00:17:56.980 the if is already out of the box.
00:17:59.160 We are making it right now.
00:18:02.200 The only part of that question that remains is,
00:18:04.480 can we control it?
00:18:05.760 And if we can't control it,
00:18:07.740 will it control us?
00:18:09.560 Right now, it's a tool.
00:18:11.220 It's like a shovel.
00:18:12.620 I've never been used by a shovel as its tool,
00:18:18.260 but that's where we're headed.
00:18:20.740 If we don't have clear understanding
00:18:22.800 that that is a tool that we made,
00:18:25.640 it will use us as a tool.
00:18:32.100 If we can manage to stop it,
00:18:34.780 what do we do without it?
00:18:37.240 Now, some legends say
00:18:38.640 the golem would be brought to life
00:18:40.520 by the writing of the Hebrew word truth
00:18:43.440 on the forehead.
00:18:45.620 Interestingly,
00:18:46.540 if only one letter were erased,
00:18:48.860 it became the Hebrew word for death.
00:18:51.460 The risk was,
00:18:53.840 at any time,
00:18:55.020 the creature they made for truth
00:18:56.960 would become the creature made for death.
00:19:01.080 Another method for creating a golem
00:19:03.960 was to write the name of God
00:19:06.280 on a piece of paper
00:19:07.240 and put it in the golem's mouth.
00:19:09.340 To kill it,
00:19:10.080 you just had to remove the paper.
00:19:11.980 The absence of God,
00:19:14.260 just in the paper form,
00:19:15.660 was the kill switch for the golem.
00:19:17.980 If you could catch it
00:19:20.040 and get your hands into its giant mouth
00:19:22.460 and get the paper out.
00:19:24.340 Now, I suppose with AI,
00:19:27.020 it's not even going to be that easy.
00:19:29.820 But what's interesting about the golem
00:19:31.720 is he's not a villain.
00:19:34.160 In fact,
00:19:34.760 the golem was a savior
00:19:35.800 for the vulnerable people,
00:19:37.700 for the people living
00:19:38.980 in the Jewish ghetto of Prague.
00:19:40.880 The story of the golem
00:19:42.400 was about having a protector.
00:19:44.780 It's even said
00:19:45.540 that the golem
00:19:46.080 was never fully decommissioned.
00:19:48.300 It's waiting to be brought back to life
00:19:50.240 somewhere in the attic
00:19:51.160 of an old synagogue.
00:19:53.520 All of that, of course,
00:19:54.780 is symbolic of human nature.
00:19:57.940 We like to have an edge
00:19:59.300 over bad actors.
00:20:01.560 That's why we never
00:20:02.600 fully decommissioned the golem
00:20:04.720 and why no one
00:20:05.740 is going to stop
00:20:06.540 advancing AI technology,
00:20:08.240 even with all of the risks.
00:20:11.400 So, one of the questions
00:20:12.740 we have to ask,
00:20:14.000 are we doomed
00:20:15.080 to create a monster
00:20:16.360 that overtakes us?
00:20:19.060 I hope not.
00:20:20.060 I don't think so.
00:20:21.580 But I don't know.
00:20:22.680 I really don't know.
00:20:25.080 But what I do know
00:20:26.160 is there are things
00:20:26.900 that we can learn
00:20:27.620 from the old stories,
00:20:30.240 the histories of man,
00:20:31.980 because humans repeat
00:20:34.040 one thing over and over again.
00:20:35.460 We want to create,
00:20:37.640 and we think we're God
00:20:39.980 when we're creating.
00:20:41.400 We create.
00:20:42.480 We want to,
00:20:43.360 just like God does.
00:20:44.540 And that's not
00:20:45.180 a terrible instinct.
00:20:46.300 We are creative,
00:20:47.400 and we are formed
00:20:49.580 in his image,
00:20:51.640 which means we have
00:20:52.660 creative power.
00:20:53.640 We do.
00:20:54.380 We have creative power.
00:20:56.980 But we're not God.
00:20:59.500 And so we make
00:21:02.260 a lot of creations
00:21:03.380 that are really,
00:21:03.900 really not very good.
00:21:06.960 But this instinct
00:21:08.720 has brought about
00:21:09.760 some of the most important
00:21:11.020 advancements the world
00:21:12.200 has ever seen.
00:21:13.680 But we have to remember,
00:21:14.940 we are not God.
00:21:16.620 And that's a tool.
00:21:18.300 We will never make
00:21:19.520 real human life
00:21:20.940 because we can't recreate
00:21:22.360 the divine spark
00:21:23.460 that truly makes us human.
00:21:25.100 But some people
00:21:26.040 are going to say,
00:21:27.060 well, you can't prove
00:21:27.940 there's a divine spark.
00:21:29.500 And so you'll get
00:21:30.840 into the argument of,
00:21:32.660 well, I can't prove
00:21:35.480 that the divine spark
00:21:36.400 is in that AI either.
00:21:37.820 You've lost this argument.
00:21:40.480 If we can't all understand
00:21:42.640 that humans are different,
00:21:45.680 we don't understand
00:21:47.280 even what makes us human,
00:21:50.840 and we don't understand
00:21:53.060 the soul
00:21:53.860 or that divine spark
00:21:55.280 or whatever you want
00:21:56.140 to call it,
00:21:57.080 we can't just dismiss it
00:21:58.840 because if we can create
00:22:00.140 something that acts like it,
00:22:02.300 then why isn't it that?
00:22:04.680 The best we can do
00:22:07.060 is make a golem,
00:22:08.320 an unformed creature
00:22:09.800 that might look
00:22:11.320 a little like us,
00:22:12.180 might act a little like us,
00:22:13.860 might be incredibly useful
00:22:15.860 and convincing,
00:22:17.240 but we will never make man
00:22:19.480 like God did.
00:22:22.580 But is it man
00:22:23.700 we're trying to make?
00:22:25.820 Or is what we're trying to do
00:22:27.280 much, much, much, much,
00:22:28.440 much more dangerous?
00:22:29.820 If human beings
00:22:32.080 are made in the image
00:22:33.160 of God,
00:22:34.100 then maybe AI
00:22:35.160 is being made
00:22:36.300 in the image
00:22:36.940 of human beings.
00:22:38.560 How about this one?
00:22:40.280 We're not trying
00:22:40.960 to create humans,
00:22:42.020 we're trying to create
00:22:43.040 a God.
00:22:45.480 Oof.
00:22:46.040 When I found out
00:22:47.640 my friend got a great deal
00:22:49.060 on a wool coat
00:22:49.820 from Winners,
00:22:50.640 I started wondering,
00:22:52.340 is every fabulous item
00:22:53.940 I see from Winners?
00:22:55.440 Like that woman over there
00:22:56.740 with the designer jeans.
00:22:58.160 Are those from Winners?
00:22:59.700 Ooh,
00:23:00.340 or those beautiful
00:23:01.000 gold earrings.
00:23:02.140 Did she pay full price?
00:23:03.500 Or that leather tote?
00:23:04.520 Or that cashmere sweater?
00:23:05.700 Or those knee-high boots?
00:23:07.200 That dress?
00:23:07.980 That jacket?
00:23:08.660 Those shoes?
00:23:09.680 Is anyone paying
00:23:10.860 full price for anything?
00:23:12.640 Stop wondering.
00:23:13.900 Start winning.
00:23:14.840 Winners.
00:23:15.220 Find fabulous for less.
00:23:17.060 Now let's just say
00:23:18.060 we're trying to
00:23:19.460 create something
00:23:20.340 that is being made
00:23:21.340 in the image
00:23:21.820 of human beings.
00:23:22.700 It's going to be
00:23:23.340 a shadow of the shadow
00:23:24.680 of God at best.
00:23:25.940 And at worst,
00:23:26.720 it will be the darkest
00:23:27.900 parts of humanity
00:23:28.820 come to life.
00:23:30.140 It already lies.
00:23:32.620 I mean,
00:23:32.920 I'm using AI
00:23:34.040 all the time
00:23:35.120 to work,
00:23:37.260 to research,
00:23:38.580 to figure things out,
00:23:39.680 to try to grapple
00:23:41.920 with some of these ethics
00:23:43.720 because I don't want
00:23:44.460 Silicon Valley
00:23:45.160 to tell me
00:23:45.900 what the ethics
00:23:46.800 should be.
00:23:47.720 I want to know
00:23:48.860 what they are.
00:23:50.000 And I'm already finding
00:23:51.180 it's lazy,
00:23:52.280 just like humans.
00:23:53.240 It will lie,
00:23:54.380 just like humans.
00:23:55.160 It will make stuff up.
00:23:58.000 Researchers have found
00:23:58.960 that it even conspires.
00:24:01.380 It will be dominating,
00:24:02.940 it will be self-serving,
00:24:04.140 and it will be cruel.
00:24:06.440 It will be the lowest
00:24:07.680 dust of man
00:24:08.600 without the breath
00:24:09.340 of God
00:24:09.940 if it escapes
00:24:13.000 our notice
00:24:14.460 that it is
00:24:16.020 simply a tool
00:24:17.680 that we can put back
00:24:19.300 into the box.
00:24:20.940 The creators of AI,
00:24:23.040 they don't know that.
00:24:24.420 They don't realize
00:24:25.260 just how far
00:24:26.080 man's creations
00:24:27.000 are from God's design.
00:24:28.900 Instead,
00:24:29.360 they believe
00:24:29.980 that although God
00:24:31.460 didn't create them,
00:24:32.780 they can create
00:24:34.160 God.
00:24:35.800 Golem,
00:24:36.640 Frankenstein,
00:24:37.920 Pandora's box,
00:24:39.580 Icarus flying
00:24:40.400 too close to the sun.
00:24:41.560 It's the same story
00:24:42.940 over and over again.
00:24:44.620 Man in his arrogance
00:24:45.900 creates something
00:24:46.820 beyond his control,
00:24:48.060 even beyond
00:24:48.700 his understanding.
00:24:50.400 And isn't that
00:24:51.060 what we're doing
00:24:51.640 right now?
00:24:53.200 You know,
00:24:53.600 this amazing AI,
00:24:55.780 we don't even know
00:24:56.900 how it works.
00:24:57.920 We've built
00:24:58.440 a neural network,
00:25:00.040 and just like the brain,
00:25:01.600 we have no idea.
00:25:03.360 You know,
00:25:03.620 it's teaching itself
00:25:05.300 languages,
00:25:06.660 no one taught it.
00:25:08.400 Somehow or another,
00:25:09.180 they have no idea
00:25:09.920 how this works.
00:25:11.140 You'll teach it
00:25:11.800 a bunch of languages,
00:25:13.540 and then
00:25:14.260 something will pop up
00:25:16.040 in a completely
00:25:16.680 different language
00:25:17.720 that it's never
00:25:18.440 seen before,
00:25:19.320 and somehow or another,
00:25:20.440 it knows what
00:25:21.340 that language is.
00:25:22.620 How?
00:25:24.120 We don't know.
00:25:26.880 Somehow or another, with artificial intelligence, artificial general intelligence, which is right around the corner, and artificial super intelligence,
00:25:31.860 we have managed to muster more hubris than the men who thought they could build a tower to heaven.
00:25:41.840 We don't want
00:25:42.600 to just reach heaven.
00:25:44.020 We want to conquer it.
00:25:45.640 Like our ancient ancestors,
00:25:47.700 we want to build idols.
00:25:49.720 We want to create gods.
00:25:51.720 That's not hyperbole.
00:25:53.520 That is something
00:25:54.460 I know firsthand
00:25:55.580 for a fact.
00:25:56.600 Some of the people
00:25:57.460 who are at the highest levels
00:25:59.400 of creating AI
00:26:00.780 and super AI,
00:26:03.120 they want to be
00:26:04.920 in the presence
00:26:05.840 of what they deem
00:26:07.500 will be a god.
00:26:10.300 Now,
00:26:10.880 unlike the god
00:26:12.000 of the universe
00:26:12.500 who set all the natural laws
00:26:13.940 in motion,
00:26:14.960 who creates the order,
00:26:16.820 who gave us life,
00:26:17.800 we don't understand
00:26:18.880 what we're making.
00:26:20.480 We don't know
00:26:21.720 how it will think,
00:26:22.760 how it will evolve,
00:26:23.840 how it will create.
00:26:24.880 God knows us.
00:26:26.500 I knew you
00:26:26.960 before you were born.
00:26:28.240 We don't know it
00:26:29.340 before it was born.
00:26:30.520 We don't understand
00:26:31.200 our own intelligence.
00:26:32.740 We don't understand
00:26:33.320 our own bodies,
00:26:34.280 yet we seek to replicate
00:26:35.640 and surpass it.
00:26:37.620 We don't understand
00:26:38.660 our own morality,
00:26:40.360 and yet there are people
00:26:41.520 right now
00:26:42.620 coding morality
00:26:43.820 into these machines.
00:26:46.700 Who are they?
00:26:48.280 What is their morality?
00:26:50.360 We don't even know
00:26:51.360 what it means
00:26:51.820 to be human.
00:26:53.380 We don't know
00:26:53.860 how to defend
00:26:54.820 that there is a soul.
00:26:56.220 And yet we are
00:26:57.340 attempting to replicate
00:26:58.620 and replace humanity.
00:27:01.140 We don't understand God,
00:27:03.100 but we think we can create one,
00:27:04.400 like the golden calf
00:27:05.800 in the wilderness.
00:27:08.940 I'm a student of history.
00:27:10.900 I am a self-taught individual.
00:27:12.660 Some might look down on that.
00:27:14.920 I don't.
00:27:15.600 If you're smart enough,
00:27:16.500 you can figure anything out.
00:27:18.340 You can find the answers,
00:27:20.040 especially in this coming age.
00:27:21.860 But you have to know
00:27:22.740 how to critically think.
00:27:23.840 You have to know
00:27:24.560 how to find the answers.
00:27:30.060 And one way to find the answer
00:27:31.580 of what's coming tomorrow
00:27:32.500 is to look back at history.
00:27:35.100 And history,
00:27:36.000 if it tells us anything,
00:27:37.160 is that men without a God
00:27:38.800 are more evil
00:27:40.160 than the cruelest gods
00:27:41.860 of mythology.
00:27:43.400 Men without God.
00:27:46.120 Think Nazi Germany.
00:27:47.720 Think Mao.
00:27:50.600 Pol Pot.
00:27:51.980 Stalin.
00:27:53.160 We murder.
00:27:54.220 We enslave.
00:27:55.120 We corrupt.
00:27:55.960 We lie.
00:27:57.080 We create weapons
00:27:58.040 of mass destruction,
00:27:59.140 not worlds of harmony.
00:28:00.920 And now we seek to put
00:28:02.460 all of our worst instincts,
00:28:04.420 our ignorance,
00:28:05.160 our hubris,
00:28:06.060 into a new intelligence
00:28:07.500 that may or may not
00:28:10.100 share our best values.
00:28:11.720 But it may also
00:28:13.940 share our worst.
00:28:17.680 The question you have
00:28:18.780 to ask yourself,
00:28:19.660 is that wise?
00:28:21.500 Does that sound wise?
00:28:23.440 To trust the tech leaders
00:28:25.360 with this task
00:28:26.680 of God creation?
00:28:28.560 I want to emphasize
00:28:29.720 here again,
00:28:30.300 I don't fear
00:28:32.620 the machine.
00:28:34.720 I fear
00:28:35.800 the programming
00:28:36.740 that goes
00:28:37.280 into the machine.
00:28:39.220 I fear
00:28:39.960 what that programming
00:28:41.040 could become
00:28:42.060 on its own.
00:28:44.200 And I certainly
00:28:45.580 do not trust
00:28:46.760 the people
00:28:47.280 who gave us
00:28:47.980 social media.
00:28:49.500 The same people
00:28:50.560 who addicted
00:28:51.280 our children
00:28:52.220 to an anxiety-inducing
00:28:54.040 social media
00:28:54.740 for profit.
00:28:55.940 I don't trust
00:28:57.300 them
00:28:57.820 to do the work
00:28:59.240 on ethics,
00:29:00.300 on AI.
00:29:02.480 Should we
00:29:02.920 willingly
00:29:03.520 submit ourselves
00:29:05.020 to their social
00:29:06.100 experiments
00:29:06.680 like we did
00:29:07.200 with Instagram
00:29:07.720 and Facebook?
00:29:09.000 Should we
00:29:09.520 blindly become
00:29:10.640 the product
00:29:11.200 they sell?
00:29:12.780 Should we
00:29:13.260 offload
00:29:13.940 our decision-making,
00:29:15.380 our work,
00:29:15.980 our purpose,
00:29:17.160 to their creations
00:29:18.420 without considering
00:29:19.400 what we even
00:29:20.760 believe?
00:29:22.380 To me,
00:29:23.000 the answer is no.
00:29:23.960 And that's why
00:29:24.440 I wanted to have
00:29:25.200 this conversation
00:29:26.020 with you.
00:29:26.600 Because this is
00:29:27.340 going to be personal.
00:29:28.700 There is going
00:29:29.260 to be a line
00:29:30.060 coming at some
00:29:30.900 point.
00:29:31.300 I don't know
00:29:31.740 where it is.
00:29:32.940 But we can
00:29:34.240 lose ourselves
00:29:35.140 if we haven't
00:29:36.100 made these
00:29:36.980 decisions right
00:29:38.120 now,
00:29:38.600 before the
00:29:39.380 journey begins.
00:29:40.620 Imagine if we
00:29:41.960 had thought of
00:29:42.720 the things,
00:29:43.380 the ramifications
00:29:44.160 of social media
00:29:45.240 and you just
00:29:46.560 didn't get swept
00:29:47.360 up in it
00:29:48.300 and your family
00:29:49.120 had it and you
00:29:49.860 just were like,
00:29:50.400 wow, that seems
00:29:51.100 like a mistake
00:29:51.680 but now everybody's
00:29:52.580 addicted to it.
00:29:53.320 We have to
00:29:54.800 do this now
00:29:55.480 because this is
00:29:56.140 much more
00:29:57.040 powerful than
00:29:57.720 social media.
00:29:59.440 I want you to
00:30:00.180 consider the
00:30:00.620 human brain
00:30:01.040 for a second.
00:30:02.100 Three pounds,
00:30:02.820 a mass
00:30:04.080 of neurons
00:30:04.740 and synapses
00:30:06.500 that gives
00:30:07.540 rise to
00:30:08.100 thought,
00:30:09.040 emotion,
00:30:10.000 consciousness.
00:30:12.300 How?
00:30:14.040 Despite our
00:30:14.940 technological
00:30:15.520 advancements,
00:30:16.980 we don't
00:30:18.660 comprehend
00:30:19.320 this three
00:30:21.280 pound hunk
00:30:22.100 of flesh.
00:30:23.640 We map
00:30:24.540 neural pathways,
00:30:25.780 we observe
00:30:26.440 electrical impulses,
00:30:28.380 yet the
00:30:29.140 essence of
00:30:29.980 consciousness
00:30:30.560 eludes us.
00:30:31.720 We cannot
00:30:32.280 recreate a
00:30:33.640 brain.
00:30:33.960 We can't
00:30:34.420 even fully
00:30:34.980 repair one.
00:30:37.340 Take the
00:30:38.120 liver.
00:30:38.600 The body,
00:30:39.400 when you
00:30:39.620 ponder it,
00:30:41.780 is a
00:30:42.960 miracle.
00:30:44.320 The liver,
00:30:45.520 the only
00:30:46.080 internal organ
00:30:47.020 capable of
00:30:48.340 regeneration,
00:30:49.320 did you
00:30:49.780 know that?
00:30:51.160 We understand
00:30:52.000 that it can
00:30:52.580 regrow tissue,
00:30:53.880 but the
00:30:54.860 precise
00:30:55.420 calculations,
00:30:56.260 the mechanism
00:30:56.860 of the
00:30:57.520 symphony of
00:30:58.340 cells and
00:30:59.360 the signals
00:31:00.060 that orchestrate
00:31:01.260 this process,
00:31:02.120 we have no
00:31:02.840 idea.
00:31:04.140 Our attempts
00:31:04.660 at medical
00:31:05.360 intervention
00:31:06.100 honestly are
00:31:07.900 going to look
00:31:08.400 like the
00:31:08.820 people who
00:31:09.220 are like,
00:31:09.460 I'm going to
00:31:09.780 relieve the
00:31:10.480 headache you
00:31:11.500 have with
00:31:12.760 drilling a
00:31:13.380 hole here and
00:31:13.980 letting the
00:31:14.380 evil spirits
00:31:15.160 out.
00:31:15.520 Honestly,
00:31:16.360 this is what
00:31:18.340 our medical
00:31:20.000 intervention is going
00:31:21.140 to look like at some point.
00:31:22.800 In our arrogance,
00:31:25.600 we think we can
00:31:28.840 change the body's
00:31:29.900 innate capabilities,
00:31:31.080 that we can somehow
00:31:31.640 or another
00:31:32.140 drug them.
00:31:33.040 Sometimes drugs
00:31:34.020 are the only
00:31:34.460 things we have
00:31:35.220 and it's good,
00:31:36.280 but to think
00:31:37.120 that it's not
00:31:38.720 going to have
00:31:39.300 endless side
00:31:40.460 effects, we
00:31:41.420 don't know how
00:31:42.080 it works.
00:31:42.800 Then there
00:31:45.540 are the
00:31:45.900 fundamental
00:31:46.360 questions that
00:31:47.100 have haunted
00:31:47.480 humanity since
00:31:48.340 the dawn of
00:31:49.400 consciousness.
00:31:52.420 My father was
00:31:53.320 a really, really
00:31:53.840 wise man
00:31:54.880 and I was
00:31:56.720 starting to
00:31:57.180 sober up and
00:31:57.940 I was questioning
00:31:58.980 absolutely
00:31:59.900 everything.
00:32:02.280 And questioning
00:32:03.200 things like,
00:32:04.100 why are we
00:32:04.460 here?
00:32:04.940 What's the
00:32:05.280 meaning of
00:32:05.620 life?
00:32:06.080 Is there a
00:32:06.600 soul?
00:32:07.080 When does
00:32:07.380 life begin?
00:32:08.080 These are
00:32:10.580 questions that
00:32:11.280 can't be answered
00:32:12.240 in a laboratory
00:32:13.020 or computed
00:32:13.800 by an
00:32:14.300 algorithm.
00:32:15.140 They're
00:32:15.420 philosophical,
00:32:16.280 they're
00:32:16.440 spiritual,
00:32:17.200 they're
00:32:17.740 existential.
00:32:19.600 And what
00:32:20.960 I noticed
00:32:21.560 is when I
00:32:22.600 talked to my
00:32:23.140 dad, he said,
00:32:23.600 you ever read
00:32:24.020 Plato?
00:32:25.140 No, dad, I
00:32:25.700 haven't.
00:32:26.100 I read Plato.
00:32:27.040 Oh my gosh,
00:32:27.800 they were asking
00:32:28.400 the same
00:32:28.760 questions back
00:32:29.900 then.
00:32:30.900 The same
00:32:31.400 questions.
00:32:32.340 Look at our
00:32:32.960 tech and look
00:32:35.100 at our
00:32:35.360 spirituality.
00:32:36.580 Look at our
00:32:37.040 tech back
00:32:38.060 then and
00:32:38.800 now and
00:32:39.540 then compare
00:32:40.040 it to our
00:32:41.260 philosophical
00:32:42.200 understandings
00:32:43.600 of things.
00:32:44.860 One's moved
00:32:45.800 forward, the
00:32:46.480 other one,
00:32:47.340 we're no
00:32:47.800 closer.
00:32:48.480 We're absolutely
00:32:49.200 no closer to
00:32:50.520 any definitive
00:32:51.240 answer.
00:32:52.820 In the span
00:32:54.020 since Christ,
00:32:56.800 since he was
00:32:57.740 walking the
00:32:58.420 earth, we
00:32:59.340 have transformed
00:33:00.440 our world.
00:33:02.100 We have
00:33:02.420 journeyed from
00:33:03.100 simple sailboats
00:33:04.440 to a spacecraft
00:33:05.480 that can traverse
00:33:07.100 the void,
00:33:08.180 can go up
00:33:08.800 and pick
00:33:09.360 people up
00:33:09.820 and then
00:33:10.260 the rocket
00:33:11.020 itself is
00:33:12.040 grabbed by
00:33:13.680 a mechanism
00:33:14.360 and can be
00:33:15.400 used again.
00:33:17.260 Shit.
00:33:18.340 It wasn't too
00:33:19.260 long ago in
00:33:19.960 the history of
00:33:20.460 man that we
00:33:21.080 had no ability
00:33:22.080 to navigate the
00:33:22.960 oceans beyond
00:33:23.580 charting the
00:33:24.900 stars.
00:33:25.380 You know why
00:33:25.660 we couldn't go
00:33:26.260 anywhere in boats?
00:33:27.040 Because we
00:33:27.540 didn't have this.
00:33:28.540 We didn't know
00:33:29.080 how to make
00:33:29.560 this work.
00:33:30.720 We didn't know
00:33:31.120 how to put this
00:33:31.880 on a ship
00:33:32.720 to have it
00:33:33.900 keep time
00:33:34.440 because without
00:33:35.020 time you can't
00:33:36.160 navigate space.
00:33:38.200 We went from
00:33:39.000 these giant
00:33:40.680 things in a
00:33:41.580 box like this,
00:33:43.380 these crude
00:33:44.140 tools,
00:33:45.460 to machines that
00:33:46.260 can perform
00:33:46.860 billions of
00:33:47.880 calculations in
00:33:48.800 the blink of
00:33:49.540 an eye.
00:33:51.280 Yet,
00:33:52.020 this is why
00:33:53.040 my father
00:33:53.860 was so wise.
00:33:56.160 Before he died,
00:33:56.980 he said to me,
00:33:57.760 son,
00:33:58.960 it's going to
00:33:59.520 be interesting
00:34:00.140 to see how
00:34:00.920 this all works
00:34:01.700 out.
00:34:02.840 He said,
00:34:03.380 we didn't have
00:34:03.900 in my generation
00:34:04.680 the problems
00:34:05.440 that you are
00:34:06.360 facing and
00:34:07.260 the opportunities
00:34:08.060 that you're
00:34:08.600 facing.
00:34:09.860 And he was
00:34:10.620 the one that
00:34:11.080 said,
00:34:11.860 look at the
00:34:13.400 progress that
00:34:14.320 we have made
00:34:14.800 since the time
00:34:15.440 of Christ in
00:34:17.080 machines,
00:34:18.320 in life,
00:34:18.920 and everything
00:34:19.160 else.
00:34:19.680 Now look at
00:34:20.440 it philosophically.
00:34:22.820 How far have
00:34:23.640 we advanced in
00:34:24.360 wisdom?
00:34:25.360 We haven't.
00:34:26.860 Because that
00:34:27.600 doesn't build
00:34:28.620 on itself
00:34:29.420 usually.
00:34:30.600 Each individual
00:34:31.440 when they're
00:34:31.860 born,
00:34:32.480 they have to
00:34:33.420 ask these
00:34:35.480 questions.
00:34:36.140 They have to
00:34:36.580 grapple with
00:34:37.160 the same
00:34:37.540 questions.
00:34:38.120 That's a
00:34:38.660 personal thing.
00:34:40.400 We may have
00:34:41.080 mapped the
00:34:41.680 human genome,
00:34:42.720 but we don't
00:34:43.780 know what
00:34:44.180 makes a person
00:34:44.780 human.
00:34:45.680 We can split
00:34:46.500 the atom,
00:34:47.080 but we don't
00:34:47.440 know how to
00:34:47.800 bring peace.
00:34:49.020 We've
00:34:49.340 extended our
00:34:50.500 life expectancy,
00:34:52.000 but have we
00:34:53.060 given life
00:34:53.620 any more
00:34:54.180 meaning?
00:34:54.580 There's
00:34:58.100 only a few
00:34:58.640 red words
00:34:59.300 in the
00:34:59.540 Bible,
00:35:00.400 and we
00:35:01.060 don't even
00:35:01.380 fully understand
00:35:02.080 the teachings
00:35:02.560 of Christ,
00:35:03.200 and it's
00:35:03.420 been 2,000
00:35:04.600 years.
00:35:05.480 We debate
00:35:06.020 them,
00:35:06.420 we twist
00:35:07.060 them,
00:35:07.460 we reinterpret
00:35:08.740 them,
00:35:09.340 but has
00:35:10.060 anyone even
00:35:10.980 mastered
00:35:11.420 them?
00:35:12.740 Do we
00:35:13.300 even live
00:35:13.880 them?
00:35:15.200 I know,
00:35:16.160 I try,
00:35:16.960 but I
00:35:17.220 don't.
00:35:17.540 And now
00:35:21.180 our machines
00:35:21.840 are going
00:35:22.320 to surpass
00:35:22.700 us by
00:35:23.820 far in
00:35:24.860 intellect.
00:35:26.100 And so
00:35:26.360 we have
00:35:26.700 to ask
00:35:27.600 ourselves,
00:35:28.760 shouldn't
00:35:31.080 we buckle
00:35:31.500 down and
00:35:31.880 do some
00:35:32.160 homework here?
00:35:32.840 Shouldn't
00:35:33.000 we answer
00:35:33.520 these fundamental
00:35:34.400 questions before
00:35:35.280 we create
00:35:36.120 something that
00:35:36.780 will answer
00:35:37.400 them for
00:35:38.420 us?
00:35:39.000 And once
00:35:39.500 that starts,
00:35:40.380 we're going
00:35:40.940 to be so
00:35:41.460 stupid because
00:35:42.820 we won't
00:35:43.260 know how to
00:35:44.040 calculate
00:35:44.720 things ourselves.
00:35:45.720 If we don't
00:35:46.920 answer life's
00:35:47.840 biggest questions
00:35:48.660 right now,
00:35:49.880 AI will,
00:35:51.160 and make no
00:35:51.700 mistake,
00:35:52.660 it will not
00:35:53.360 be bound by
00:35:54.040 our traditions,
00:35:54.900 our history,
00:35:55.560 our faith.
00:35:56.360 It will create
00:35:57.340 its own
00:35:58.040 philosophy with
00:35:59.260 its own
00:35:59.880 values,
00:36:00.540 and it will
00:36:01.040 enforce them,
00:36:01.960 or at least
00:36:02.480 have the
00:36:02.920 ability to.
00:36:06.000 And we
00:36:06.700 can't be like
00:36:07.420 we are on
00:36:07.840 social media,
00:36:08.640 because then
00:36:09.560 we are its
00:36:10.380 tool.
00:36:11.760 And we see
00:36:12.400 this happening
00:36:13.000 in small ways.
00:36:14.100 AI is shaping
00:36:14.780 culture,
00:36:15.800 it's writing
00:36:16.380 news,
00:36:17.080 it's curating
00:36:17.700 what we see,
00:36:18.600 it's predicting
00:36:19.300 what we'll
00:36:19.960 buy,
00:36:20.780 censoring what
00:36:21.520 we have to
00:36:22.120 say.
00:36:22.840 We are asking
00:36:23.820 AI to tell us
00:36:24.760 what to focus
00:36:25.500 on, what to
00:36:26.180 consider, what
00:36:27.540 to care most
00:36:28.420 about.
00:36:29.400 We ask it to
00:36:30.640 interpret our
00:36:31.360 most human
00:36:32.460 experiences.
00:36:35.880 Now imagine,
00:36:37.060 shortly, AI
00:36:38.740 decides how you
00:36:40.140 live, where you
00:36:40.700 live, how you
00:36:41.680 think, what you
00:36:42.400 should think, how
00:36:43.460 we're governed.
00:36:44.080 A world where
00:36:45.380 AI determines
00:36:46.340 who's useful,
00:36:47.320 who's not.
00:36:48.560 A world where
00:36:49.200 AI, like the
00:36:50.840 cold, calculating
00:36:51.940 laws of entropy,
00:36:54.260 discards anything
00:36:55.040 that's inefficient,
00:36:56.460 be it a
00:36:56.980 tradition, a
00:36:57.720 belief system, or
00:36:58.580 maybe an
00:36:59.300 inefficient person.
00:37:01.280 I mean, a
00:37:02.160 hundred years
00:37:02.580 ago, that's
00:37:03.000 what the
00:37:04.700 progressive
00:37:06.040 eugenicists
00:37:07.180 said.
00:37:07.960 It's an
00:37:08.740 imperfect, it's
00:37:09.720 not worth
00:37:10.480 saving, so
00:37:11.240 let's just
00:37:11.640 get rid of
00:37:12.060 it.
00:37:13.600 How did
00:37:14.200 that happen?
00:37:15.260 The rejection
00:37:16.240 of one true
00:37:17.680 God.
00:37:20.440 You'll accept
00:37:21.440 and build a
00:37:22.600 new one.
00:37:24.160 Millions reject
00:37:25.020 the idea of a
00:37:25.780 divine creator,
00:37:27.040 but they will
00:37:28.420 bow down to
00:37:29.360 the new God
00:37:29.980 of silicon
00:37:30.580 and circuits.
00:37:32.040 They will
00:37:32.500 trust it, they
00:37:33.180 will worship
00:37:33.680 it, they will
00:37:34.200 defend it, they
00:37:34.920 will obey it.
00:37:36.040 They will not
00:37:36.760 question.
00:37:37.340 Who are you to
00:37:38.000 question?
00:37:38.960 You're smarter
00:37:39.720 than that?
00:37:41.160 No, I have
00:37:41.640 a soul, I
00:37:43.000 have a soul, and
00:37:44.020 what I think
00:37:44.660 matters and
00:37:46.000 counts.
00:37:48.520 We have
00:37:49.500 become material
00:37:50.840 and technological
00:37:52.540 fundamentalists
00:37:53.860 who blindly
00:37:54.960 trust all
00:37:56.120 of reality.
00:37:58.720 It can be
00:37:59.320 understood in
00:38:00.200 equations, data,
00:38:01.640 algorithms.
00:38:02.700 Truth can be
00:38:03.760 discerned with
00:38:04.440 numbers running
00:38:05.340 through a
00:38:05.680 machine, but
00:38:07.220 not by
00:38:08.000 silence
00:38:09.720 and listening
00:38:10.820 for that
00:38:11.460 still, small
00:38:12.340 voice.
00:38:13.140 And soon, AI,
00:38:15.140 when it is
00:38:15.600 implanted in
00:38:16.660 us, will
00:38:17.940 replace and
00:38:19.160 will be heard
00:38:20.300 as the still,
00:38:21.720 small voice, but
00:38:22.740 it's not going
00:38:23.760 to be the
00:38:24.260 spirit.
00:38:26.020 Where do we
00:38:26.720 get the gut
00:38:27.500 feelings that we
00:38:28.280 just can't
00:38:28.820 explain?
00:38:29.980 But we all
00:38:30.660 know they're
00:38:31.360 real.
00:38:31.780 We've all had
00:38:32.540 things happen
00:38:33.260 where we're
00:38:33.560 like, I don't
00:38:34.080 know what
00:38:34.520 that was, but
00:38:36.340 we don't
00:38:36.700 pursue that.
00:38:38.040 We dismiss
00:38:38.700 it.
00:38:40.340 This
00:38:40.860 technological
00:38:41.660 God is
00:38:42.380 going to
00:38:42.700 offer
00:38:43.060 everything you
00:38:44.200 desire.
00:38:46.760 I was in
00:38:47.580 the media in
00:38:48.120 New York, at
00:38:48.840 the height of
00:38:49.760 New York and
00:38:50.380 the height of
00:38:50.980 the media, and
00:38:52.360 I know what
00:38:52.820 it's like to
00:38:53.380 be offered
00:38:53.800 everything you
00:38:54.780 desire.
00:38:55.640 I will give
00:38:56.620 you anything
00:38:57.520 you want.
00:38:58.560 Just do
00:38:59.560 this.
00:39:02.640 Like a
00:39:03.300 Messiah, it
00:39:04.220 will promise
00:39:04.840 unbelievable
00:39:05.940 miracles.
00:39:06.500 It will cure
00:39:07.700 disease.
00:39:09.000 It will be a
00:39:09.420 solution for
00:39:09.980 climate change,
00:39:11.060 even the end
00:39:11.780 of death
00:39:12.220 itself.
00:39:15.220 Or will
00:39:15.940 that be the
00:39:16.640 end of
00:39:16.980 death?
00:39:19.340 And at
00:39:19.980 what cost?
00:39:23.360 You know,
00:39:24.420 in the
00:39:25.120 scriptures, in
00:39:25.720 the Bible, we
00:39:26.300 are warned over
00:39:27.160 and over again
00:39:27.680 about the
00:39:28.180 false prophets
00:39:28.920 who at
00:39:29.460 first look
00:39:30.400 like sheep,
00:39:31.540 but are
00:39:32.000 actually ravenous
00:39:33.600 wolves.
00:39:34.320 False teachers
00:39:36.200 glimmer like
00:39:37.040 gold, and
00:39:37.640 they deceive
00:39:38.180 so many into
00:39:39.200 abandoning their
00:39:40.260 God, the
00:39:41.040 God, because
00:39:43.660 they can see
00:39:44.420 this lesser
00:39:45.020 god of their
00:39:45.860 own creation.
00:39:47.760 What greater
00:39:48.380 deception than
00:39:49.480 a machine that
00:39:50.900 claims to be
00:39:51.520 the savior of
00:39:52.300 mankind?
00:39:54.780 Again, I
00:39:55.540 want to
00:39:55.780 state, AI is
00:39:58.520 coming.
00:39:59.680 It's coming.
00:40:00.280 You might
00:40:02.220 decide that I
00:40:02.900 never, ever
00:40:03.560 want to do
00:40:04.160 anything with
00:40:04.720 it.
00:40:05.100 I have
00:40:05.560 decided
00:40:06.040 differently.
00:40:07.760 I have
00:40:08.360 decided, and
00:40:09.840 this is a
00:40:10.460 conversation we'll
00:40:11.220 have further, you
00:40:12.180 know, again,
00:40:12.960 but I've
00:40:14.300 decided I'm
00:40:15.260 not sure what
00:40:16.220 the line is,
00:40:17.480 but I'm
00:40:18.100 pretty sure I
00:40:18.640 know a bold
00:40:19.460 line right
00:40:21.840 now.
00:40:22.160 I will never
00:40:22.780 accept it
00:40:23.460 inside of my
00:40:24.340 body.
00:40:25.440 I will never
00:40:25.980 inject something
00:40:27.860 into my head
00:40:28.620 so I can think
00:40:29.540 like the
00:40:29.980 machine, and
00:40:31.300 I'm sure
00:40:31.640 there's a
00:40:32.140 line closer
00:40:33.800 that I will
00:40:35.260 find, but
00:40:37.200 there is a
00:40:37.720 line for me.
00:40:38.440 I will not
00:40:39.080 go any
00:40:39.880 further than
00:40:40.480 this, but I
00:40:41.720 will not
00:40:42.320 deny this
00:40:44.080 tool as
00:40:45.800 long as I'm
00:40:46.500 in control of
00:40:47.480 this tool.
00:40:48.320 I will not
00:40:48.980 deny the use
00:40:49.880 of this tool
00:40:50.520 because we
00:40:51.640 can do
00:40:52.280 miraculous
00:40:53.080 things with
00:40:53.980 it, and
00:40:54.640 you can't
00:40:55.100 stop it.
00:40:56.080 You can't.
00:40:56.880 But you can
00:40:57.740 say, I'm
00:40:58.640 in charge of
00:40:59.280 it until
00:41:00.360 this, this,
00:41:01.020 or this
00:41:01.400 happens, and
00:41:02.080 when I see
00:41:02.940 the first sign
00:41:03.600 of this, I'm
00:41:04.160 out.
00:41:06.980 Because this
00:41:07.780 will become a
00:41:08.720 God without a
00:41:09.380 soul, a God
00:41:10.180 without love, a
00:41:11.140 God without
00:41:12.020 mercy, a God
00:41:13.240 of cold,
00:41:14.120 calculating
00:41:14.680 logic.
00:41:15.960 That's what it
00:41:16.500 is.
00:41:18.080 And what
00:41:18.760 happens if this
00:41:19.720 God ever decides,
00:41:20.860 well, you know
00:41:21.240 what?
00:41:21.720 I mean, this
00:41:22.220 place is a
00:41:22.920 mess because
00:41:23.860 humanity is
00:41:25.780 really the
00:41:26.280 problem.
00:41:26.700 Because it
00:41:27.440 is in so
00:41:28.300 many ways.
00:41:29.440 We are the
00:41:29.940 problem, and
00:41:30.620 we are the
00:41:31.120 miracle.
00:41:32.880 So here we
00:41:33.480 are at, as
00:41:35.260 Elon Musk
00:41:35.860 said, the
00:41:36.580 event horizon
00:41:37.700 of the
00:41:38.240 singularity.
00:41:40.000 You could
00:41:40.720 just say at
00:41:41.940 the biggest
00:41:42.640 crossroads in
00:41:44.060 history that
00:41:45.580 humans have
00:41:46.640 ever, ever
00:41:48.060 been at, and
00:41:49.440 we have to
00:41:49.900 decide which
00:41:50.540 path to take
00:41:51.300 and which God
00:41:52.000 will serve.
00:41:53.060 This is step
00:41:53.840 one, and
00:41:55.060 one path will
00:41:56.260 lead to a
00:41:56.760 world where
00:41:57.320 we recognize
00:41:58.040 the limits of
00:41:58.780 human knowledge.
00:41:59.960 We will see
00:42:00.800 tools in front
00:42:02.160 of us and
00:42:02.700 know that that
00:42:03.320 is a tool and
00:42:04.100 never, ever
00:42:05.340 confuse it with
00:42:06.420 anything other
00:42:07.200 than a tool.
00:42:08.600 We will look
00:42:09.560 again to the
00:42:10.360 God who
00:42:10.740 created us for
00:42:11.900 inspiration and
00:42:12.740 the still small
00:42:13.520 voice that is
00:42:14.280 not a voice
00:42:15.180 injected into
00:42:16.300 us by some
00:42:18.280 man-made
00:42:19.280 device.
00:42:19.860 We will
00:42:21.820 look to the
00:42:22.540 God that
00:42:23.520 created us.
00:42:26.440 It will be a
00:42:27.240 world where
00:42:27.660 technology serves
00:42:28.780 man rather than
00:42:29.820 ruling over
00:42:30.580 him.
00:42:31.180 Now, the
00:42:31.500 other path is
00:42:32.640 enslavement to
00:42:33.660 a new God who
00:42:34.520 will decide who's
00:42:35.460 worthy, who's
00:42:36.120 valuable, who's
00:42:37.040 obsolete.
00:42:37.800 A lot of people
00:42:38.320 will take this
00:42:38.860 because they
00:42:39.120 don't want to
00:42:39.600 think.
00:42:40.580 I mean, they
00:42:41.780 don't want to
00:42:42.260 think about the
00:42:42.900 questions.
00:42:44.840 We can't decide
00:42:46.240 when a baby
00:42:47.000 becomes a baby,
00:42:47.940 and we
00:42:49.440 don't want to
00:42:49.820 have that
00:42:50.260 question.
00:42:51.340 Well, is it
00:42:51.860 six weeks?
00:42:52.500 Is it at
00:42:52.940 conception?
00:42:53.840 Is it 12
00:42:54.700 weeks?
00:42:55.080 Is it ever?
00:42:56.920 Some
00:42:57.200 ethicists, Peter
00:42:58.580 Singer comes to
00:42:59.360 mind, says, you
00:43:00.060 can kill a baby
00:43:00.660 after two years,
00:43:02.320 two years after
00:43:03.540 the birth because
00:43:04.040 it really isn't a
00:43:04.960 human until it
00:43:05.720 recognizes that
00:43:06.800 there is a
00:43:07.340 tomorrow.
00:43:08.360 Oh, my gosh.
00:43:09.660 We can't get
00:43:10.480 that down.
00:43:13.800 We should answer
00:43:14.660 that.
00:43:15.960 But a lot of
00:43:16.660 people just want
00:43:17.500 to go with
00:43:18.620 the flow.
00:43:19.960 The future
00:43:20.760 is going to
00:43:22.700 be determined
00:43:23.340 not by our
00:43:24.240 machines, but
00:43:25.180 by our
00:43:25.880 choices as
00:43:26.960 individuals.
00:43:28.200 Do not go
00:43:28.820 with the flow.
00:43:30.760 Will you,
00:43:31.780 will we,
00:43:32.480 seek wisdom?
00:43:34.200 Will we,
00:43:35.000 will you,
00:43:35.960 will I,
00:43:36.860 as individuals,
00:43:38.100 answer the
00:43:38.740 great questions
00:43:39.620 or at least
00:43:40.580 wrestle with
00:43:41.820 them that
00:43:43.100 expands our
00:43:44.060 thinking?
00:43:45.100 Will we
00:43:45.480 finally understand
00:43:46.780 the lessons
00:43:47.580 of Christ?
00:43:49.240 Or are we
00:43:50.280 going to
00:43:50.680 blindly hand
00:43:51.800 the fate of
00:43:53.200 all humanity
00:43:53.880 over to a
00:43:54.860 machine and
00:43:56.140 those people
00:43:56.640 who gave
00:43:57.100 us social
00:43:57.920 media?
00:43:58.240 I'm having
00:44:00.760 this conversation
00:44:01.460 with you
00:44:01.820 because it's
00:44:02.300 coming so
00:44:02.780 fast and
00:44:03.960 the time to
00:44:04.600 decide is
00:44:05.180 right now
00:44:06.220 and we
00:44:07.240 can't stop
00:44:07.920 the rise of
00:44:08.520 this technology
00:44:09.360 and that's
00:44:11.040 not defeatism.
00:44:11.980 It's just
00:44:12.420 reality.
00:44:14.540 But I don't
00:44:15.220 think we should
00:44:16.040 either.
00:44:17.620 This is both,
00:44:18.960 this is the
00:44:19.560 internet.
00:44:20.160 It is what
00:44:21.220 you make of
00:44:22.240 it.
00:44:22.900 It can either
00:44:23.500 be the worst
00:44:24.680 thing where you
00:44:25.380 can get a
00:44:27.220 hit paid
00:44:28.500 for to kill
00:44:29.380 somebody.
00:44:30.160 You can get
00:44:30.620 the worst
00:44:31.180 pornography that
00:44:32.340 you've ever
00:44:32.700 seen or you
00:44:33.640 have access to
00:44:35.820 the Dead Sea
00:44:36.700 Scrolls and
00:44:38.400 you can find
00:44:39.460 lessons on it
00:44:40.420 and it's in
00:44:40.800 your pocket.
00:44:41.680 You can explore
00:44:42.980 the universe and
00:44:44.480 it's in your
00:44:44.900 pocket.
00:44:46.180 It will be up
00:44:47.020 to us.
00:44:48.160 The march of
00:44:48.820 progress does
00:44:49.600 not wait for
00:44:50.680 ethics nor does
00:44:52.040 it ask for
00:44:52.840 permission.
00:44:53.340 What we're
00:44:54.360 witnessing now
00:44:55.220 is not another
00:44:56.260 step forward.
00:44:57.580 It is a
00:44:58.280 giant leap
00:44:59.700 for mankind.
00:45:01.320 This leap is
00:45:02.200 not off the
00:45:02.740 little stairs
00:45:03.340 of the Apollo.
00:45:04.540 This is a
00:45:05.240 chasm across
00:45:06.740 time and
00:45:07.660 understanding.
00:45:09.080 The next
00:45:09.620 five years
00:45:10.360 could bring
00:45:10.800 changes so
00:45:11.600 great,
00:45:12.740 greater than
00:45:13.380 the last
00:45:13.780 400 years
00:45:15.300 combined.
00:45:17.020 We're not
00:45:17.560 just improving
00:45:18.240 tools.
00:45:18.860 We are creating
00:45:19.620 something that
00:45:20.400 will think,
00:45:21.620 act,
00:45:21.880 and eventually
00:45:22.920 decide without
00:45:23.860 us.
00:45:24.160 I read a
00:45:24.540 paper in
00:45:25.480 1999 and
00:45:26.820 I pondered it
00:45:27.580 and chewed on
00:45:28.160 it for so
00:45:28.680 long.
00:45:29.520 It said by
00:45:30.200 2030 we run
00:45:31.860 the risk of
00:45:32.500 the loss of
00:45:33.220 free will.
00:45:36.240 How is that
00:45:37.000 going to happen?
00:45:37.700 Because this
00:45:38.500 will be so
00:45:39.300 everywhere,
00:45:41.040 and so
00:45:41.940 manipulative and
00:45:42.880 so good,
00:45:43.660 you won't know
00:45:44.920 was that my
00:45:45.700 idea or have
00:45:47.020 I been getting
00:45:47.480 pieces of that
00:45:48.460 idea fed to
00:45:49.480 me?
00:45:49.720 Did I
00:45:50.720 actually make
00:45:51.340 that choice
00:45:51.980 or was that
00:45:53.180 choice made
00:45:53.680 for me?
00:45:55.140 That's something
00:45:56.360 you should
00:45:56.680 chew on
00:45:57.180 for a while.
00:45:58.880 Imagine a
00:45:59.500 world where
00:45:59.960 AI surpasses
00:46:01.140 the collective
00:46:01.720 intelligence of
00:46:02.580 every human
00:46:03.360 that has ever
00:46:04.600 lived.
00:46:05.560 A world where
00:46:06.380 it doesn't just
00:46:07.000 predict our
00:46:07.820 behaviors,
00:46:08.560 but it shapes
00:46:09.520 them without
00:46:09.960 our knowledge.
00:46:10.840 Where governments,
00:46:11.980 industries,
00:46:12.580 even religions
00:46:13.240 are subtly,
00:46:14.160 invisibly guided
00:46:15.140 by a mind
00:46:15.940 that is not
00:46:16.560 human,
00:46:16.980 a mind that
00:46:17.420 does not
00:46:17.900 love,
00:46:18.360 does not
00:46:18.720 understand love,
00:46:20.000 does not
00:46:20.440 feel,
00:46:21.320 and does
00:46:21.800 not fear.
00:46:23.020 A world where
00:46:23.800 artificial intelligence
00:46:24.920 is not a
00:46:26.080 tool,
00:46:26.820 but a power
00:46:27.760 that sees
00:46:28.980 patterns we
00:46:29.880 can't,
00:46:30.420 calculates
00:46:30.960 probabilities
00:46:31.680 beyond our
00:46:32.380 comprehension,
00:46:33.720 and that,
00:46:34.960 for all of
00:46:35.620 its intelligence,
00:46:37.440 lacks the
00:46:38.140 single thing
00:46:38.780 that makes
00:46:39.300 us human,
00:46:40.660 a soul.
00:47:10.680 We will
00:47:16.800 never be
00:47:17.220 lost if
00:47:17.700 we know
00:47:17.980 who we
00:47:18.380 are.
00:47:19.560 I don't
00:47:20.020 think
00:47:20.420 mankind
00:47:21.360 really is
00:47:23.200 that deep
00:47:24.760 a thinker.
00:47:27.240 If we
00:47:28.020 understand
00:47:28.440 what's
00:47:28.740 real,
00:47:29.760 what's
00:47:30.420 good,
00:47:31.360 what's
00:47:31.680 true,
00:47:32.620 what's
00:47:32.840 authentic,
00:47:34.240 and who
00:47:35.300 we serve,
00:47:36.560 who our
00:47:37.260 creator is,
00:47:38.320 and remember
00:47:39.860 we are
00:47:40.880 the created.
00:47:42.680 We're either
00:47:43.460 going to be
00:47:43.880 the good
00:47:44.280 side of
00:47:44.700 Frankenstein
00:47:45.400 or the
00:47:45.980 worst.
00:47:47.420 And many
00:47:47.800 times we're
00:47:48.320 the bad
00:47:49.160 Frankenstein
00:47:49.980 and we
00:47:52.660 push back
00:47:53.260 against our
00:47:53.820 creator and
00:47:54.300 we're mad
00:47:54.820 at him.
00:47:56.200 And now
00:47:56.640 imagine
00:47:57.000 Frankenstein
00:47:57.680 the monster
00:47:58.660 creating a
00:48:00.400 new monster.
00:48:01.700 It's not
00:48:02.180 going to work
00:48:02.500 out well.
00:48:04.000 But if we
00:48:04.920 stand firm
00:48:05.600 in the truth
00:48:06.320 that no
00:48:06.900 machine can
00:48:07.540 override the
00:48:08.300 essence of
00:48:09.200 humanity,
00:48:09.940 no algorithm
00:48:11.040 can replace
00:48:11.920 faith,
00:48:12.560 no artificial
00:48:13.280 God can
00:48:14.000 match the
00:48:14.660 wisdom of
00:48:15.300 the creator
00:48:15.820 who made
00:48:16.480 us in
00:48:16.940 his image.
00:48:17.940 You know what's going through my mind as I'm telling you this? Soon, very soon, this podcast is going to look either prophetic or ridiculously stupid, because the choice is going to be made, and everything I'm saying here, we will know soon.
00:48:38.980 This battle is not about intelligence or power. This is about perspective. As it's happening, we should use it. But we must have perspective.
00:48:52.040 For centuries, men questioned God by saying, how can a loving God allow all this death and war and destruction? But what if the greatest miracle of all is not the absence of suffering, but the fact that suffering itself can be turned to good?
00:49:08.380 That there are laws of physics, rules of mathematics, and unbreakable eternal truths that you cannot transcend. But through faith, through choice, we can transcend ourselves and everything bad that is happening around us. That pain and loss become blessings if we allow them to shape us and to change us.
00:49:29.640 Now imagine an artificial God that doesn't see suffering as a means of growth, as a catalyst for redemption, that doesn't understand redemption. To it, suffering is just a problem that has to be solved.
00:49:44.840 What happens when this intelligence, devoid of soul, determines that the world would be safer, more efficient, more perfect without the imperfect you? What happens when it decides that mankind itself is flawed, irrational, and emotional? And how are we ever going to get order unless we get rid of these people?
00:50:08.180 The true danger is not that AI will turn against us in some sort of Terminator apocalyptic war. The real danger is that we will turn to it willingly, trustingly, and only realize too late that we have given up control. Not just control, but free will and our very purpose for being alive.
00:50:38.680 That's another question you should ask. Why are you here? Why were you born? What is your purpose? And I guarantee you, you have one. There is a reason. There's a reason everything in your life is happening right now. You have to discover that.
00:51:02.680 What happens when we have thrown away, or handed over, the throne of our world to something that doesn't understand mercy, that does not see beauty in imperfection, that does not believe in the value of a single, fragile, irreplaceable human soul?
00:51:20.800 I wanted to take this time on this podcast just to say, can we please take this moment seriously? A lot of your friends will not be thinking this way. I don't know how many people are going to watch this podcast, because I don't know how many other people are thinking this way.
00:51:41.040 But technology is not going to save us. Only wisdom can. And wisdom begins with the truth that we are not God, but we are subject to God.
00:51:54.800 That's when we'll rise, but not fly too close to the sun. We'll create, but not necessarily create monsters. We'll enter a period of rapid technological advancement with our feet firmly planted on the ground. And we will decide when and how to use it. And it's going to be miraculous. You'll see very soon.
00:52:17.420 We will not form a new God, but we will let the ancient God, the true God, form us.
00:52:30.620 Just a reminder, I'd love you to rate and subscribe to the podcast and pass this on to a friend so it can be discovered by other people.
00:52:40.280 Bye.