The Auron MacIntyre Show - May 28, 2024


Is the Future Transhuman? | Guest: Joe Allen | 5/28/24


Episode Stats

Length

1 hour and 19 minutes

Words per Minute

165.7

Word Count

13,214

Sentence Count

262

Misogynist Sentences

6

Hate Speech Sentences

26


Summary

Joe Allen, author of the new book, Dark Aeon, joins me to discuss the concept of transhumanism and how it relates to our understanding of what it means to be a human being. We discuss the role of technology in shaping our perception of who we are and how we are shaped by it.


Transcript

00:00:30.000 Hey everybody, how's it going?
00:00:32.160 Thanks for joining me this afternoon.
00:00:33.780 I've got a great stream with a great guest
00:00:35.600 that I think you're really going to enjoy.
00:00:38.700 So as we rush to the future,
00:00:41.380 we often, I think, don't really evaluate the path
00:00:44.680 that we're taking.
00:00:46.240 The world changes, technology seems to go ahead and evolve,
00:00:50.280 and people tend to evolve with it,
00:00:52.700 but we're not very intentional about the path that we're on.
00:00:56.980 And so I think it's really important for us
00:00:58.720 to stop and think about the way that we are either embracing
00:01:02.240 or rejecting or developing alongside technology,
00:01:07.100 and whether it's good for us,
00:01:08.700 whether it's something that might even be inevitable,
00:01:11.100 and we simply don't have control over parts of it.
00:01:14.560 Thinking about these issues is, of course,
00:01:16.900 the transhumanism editor over at Steve Bannon's War Room.
00:01:20.720 He's the author of the book Dark Aeon, addressing this idea.
00:01:26.040 Joe Allen, thanks for joining me, man.
00:01:28.180 All right, and very good to be here.
00:01:29.320 Thank you.
00:01:30.660 Absolutely.
00:01:31.220 So let's start from the beginning, I guess.
00:01:33.500 When we talk about transhumanism,
00:01:35.400 I guess that can mean a lot of things to a lot of people,
00:01:38.440 not to hit too much on the obvious at the outset.
00:01:41.720 But when we talk about transhumanism,
00:01:43.620 are we just talking about the embrace of,
00:01:46.560 I don't know, wiring a microchip into the human brain?
00:01:50.560 Is this a wider process?
00:01:52.260 Is it only technological,
00:01:53.980 or does it involve the spiritual or the psychological as well?
00:01:57.860 How would you define the transhuman project?
00:02:02.120 You know, given the range of angles that you just presented,
00:02:06.340 I would say that you've captured most, if not all of them.
00:02:09.520 You know, it begins, I think, at a spiritual level,
00:02:12.420 the conception of what it is to be human.
00:02:14.380 And moving into the psychological,
00:02:16.560 the more immediate sort of perception
00:02:18.400 of what it is to be human.
00:02:19.540 And then, of course, technology,
00:02:21.040 how that affects what it is to be human.
00:02:24.220 You know, the term itself was initially coined
00:02:27.540 by Julian Huxley back in the mid-50s
00:02:30.420 and was really brought to the forefront
00:02:32.800 and kind of simultaneously or independently coined
00:02:37.020 by Max More, recoined, you could say,
00:02:40.060 in the early 80s.
00:02:42.400 For Max More, the idea is just simply using science
00:02:46.460 and reason to, in essence, enhance the human being.
00:02:50.900 And, of course, technology is a large part of that.
00:02:53.280 But the worldview, the scientific worldview,
00:02:55.920 is at the core of it.
00:02:57.480 And so it would be a mistake to say transhumanism
00:02:59.940 is just about chipping the brain or injecting nanobots,
00:03:03.000 but that is definitely an element.
00:03:05.300 And, you know, I try in the book
00:03:07.360 to show how both that narrow definition
00:03:10.480 and the kind of broader way that people think
00:03:12.660 of transhumanism are both applicable,
00:03:16.260 but many people who would never accept
00:03:18.420 the label of transhumanist are indeed transhumanist.
00:03:23.140 That would be anything from, you know,
00:03:24.900 more naturalistic or, you know, even right-wing thinkers
00:03:30.240 and even religious, a lot of religious thinkers
00:03:33.440 and leaders, the embrace of technology,
00:03:37.560 the use of science to alter the human being.
00:03:40.120 It's just part and parcel of the social
00:03:43.660 and technological milieu we're in.
00:03:46.940 So when do you feel like this shift really takes place?
00:03:50.460 Because, of course, technology has always,
00:03:52.860 to some degree, been a part of the story of humanity.
00:03:56.280 As soon as we start using tools,
00:03:58.300 we start using technique, it starts to shape us.
00:04:01.580 And, of course, history is filled with moments
00:04:05.180 where technology shifted the way
00:04:06.900 that our social arrangements existed.
00:04:09.740 Of course, the creation of the longbow
00:04:11.620 and then the crossbow radically reshapes
00:04:14.980 the way that European nobility is able to go ahead
00:04:18.460 and hold their social order together.
00:04:21.240 The Pope recognized this to the point
00:04:22.760 where he tried to ban the crossbow at one point.
00:04:25.140 The technology has always had some role
00:04:28.640 in shaping who we are as human beings,
00:04:30.860 how we interact with the other,
00:04:32.140 how we order our society.
00:04:33.860 But is there a definitive moment
00:04:35.620 where you feel like we move to a trans-human moment
00:04:39.320 where it's about transcending the ability of the humans
00:04:43.020 not by just augmenting it with something like a spear
00:04:46.600 or a crossbow, but altering the human themselves
00:04:49.720 in a way that is irreversible or fundamental?
00:04:52.240 I don't think there's any single point
00:04:56.600 that you could, that I've identified.
00:04:58.520 I think it's more just touchstones along the way
00:05:01.320 or mile markers.
00:05:02.960 You know, arguably, and many make this argument,
00:05:05.660 you know, it began with the transition of humanity
00:05:09.180 out of, you know, a kind of early hominid
00:05:12.660 or Australopithecine state into hunter-gatherers
00:05:17.520 or maybe the rise of agriculture
00:05:19.180 and the way that shifted everything
00:05:20.680 or the Industrial Revolution
00:05:22.440 or the scientific revolution that preceded it.
00:05:25.940 I think that maybe a point that is really important
00:05:29.540 is the Second World War
00:05:31.620 and the aftermath of the Second World War.
00:05:34.080 You know, it was in World War II
00:05:36.280 that you had the first computer systems
00:05:38.620 being designed and employed
00:05:41.300 mainly for code-breaking.
00:05:44.100 And then in the aftermath of the war,
00:05:47.000 you had people like Norbert Wiener,
00:05:49.720 Alan Turing, Claude Shannon,
00:05:52.120 and they're conceptualizing
00:05:53.940 what computer systems will be
00:05:57.280 and they're beginning to create
00:05:58.880 the first kind of primitive computers.
00:06:01.660 And even in those early days,
00:06:04.420 the imagination of what all this would become
00:06:07.500 really laid the foundation for what has occurred
00:06:11.280 in the last 70 years or so.
00:06:16.000 You had artificial intelligence, of course,
00:06:17.980 the term was coined in the mid-50s
00:06:20.720 around the same time that transhumanism was coined.
00:06:24.100 You had the analysis of Watson and Crick
00:06:28.620 of the helical structure of DNA,
00:06:31.180 and it really opened the doors
00:06:33.040 for what then became the genetic manipulations
00:06:36.360 driving everything from just the bacterial engineering
00:06:40.900 to the mRNA vaccines
00:06:42.580 that were forced upon most people on Earth very recently.
00:06:47.900 So I think that that post-war period
00:06:50.740 really did mark a point
00:06:53.680 that the acceleration of the process
00:06:56.960 can be apprehended pretty clearly.
00:07:00.600 But then, you know, in the development,
00:07:02.600 I think that the last five
00:07:04.780 or arguably 10 years
00:07:06.420 that the understanding of CRISPR
00:07:09.880 and the development of everything
00:07:13.520 from large language models
00:07:14.820 to maybe the less well-known systems
00:07:18.640 that are accelerating right now,
00:07:22.340 what you've seen in the last five years,
00:07:23.980 what I've seen in the last five years
00:07:25.440 convinced me that the predictions
00:07:28.020 of people like Ray Kurzweil,
00:07:30.100 while they're not perfect,
00:07:31.780 you can't look at Ray Kurzweil's predictions
00:07:34.140 and see just one-to-one correspondence
00:07:36.320 with the history that succeeded them.
00:07:40.460 I do think that when I first got turned on to Ray Kurzweil,
00:07:44.240 it seemed fairly imaginative and ridiculous,
00:07:47.360 terrifying if it were to become real,
00:07:49.760 but mainly I just kind of dismissed it
00:07:52.320 as a strange, quirky way of thinking.
00:07:55.000 In the last five years,
00:07:57.000 I've really started to change
00:07:58.700 the way I've thought about that.
00:07:59.840 It is not, I don't think, going to become,
00:08:02.520 I don't think our world
00:08:03.500 is going to be flooded by swarms of nanobots
00:08:05.980 that shapeshift and augment our bodies at will,
00:08:09.460 at least not anytime soon.
00:08:11.560 But I do think that the overall vision
00:08:14.020 that he captured and projected onto the world
00:08:17.000 is in fact coming into reality in some form.
00:08:20.720 Interesting.
00:08:21.920 So I guess I want to address
00:08:23.800 a couple aspects of this
00:08:25.700 because it is so multifaceted.
00:08:27.940 The first one I want to look at,
00:08:30.160 I guess, is
00:08:31.080 the thing that a lot of people
00:08:33.100 are talking about right now,
00:08:34.740 which is the AI component.
00:08:37.340 Whenever I ask this question,
00:08:39.040 I always get a different response
00:08:40.440 and usually from people I really respect.
00:08:42.620 So I'm always interested in asking it
00:08:44.000 because the fact that so many people
00:08:45.780 who are so thoughtful and experts on this issue
00:08:48.300 have such wildly conflicting ideas
00:08:50.840 about kind of the future AI
00:08:52.600 always makes me,
00:08:53.520 you know,
00:08:54.300 as someone who is not as aware of it
00:08:56.240 or as schooled in it,
00:08:58.340 kind of nervous,
00:08:59.680 that so few people seem to agree
00:09:02.220 about where this is going.
00:09:04.260 But when it comes to artificial intelligence,
00:09:06.580 where are we in this?
00:09:08.940 As you've said,
00:09:09.460 so much of what we understand,
00:09:11.220 we envision about the future,
00:09:13.940 you know,
00:09:14.240 even a cyberpunk style dystopia
00:09:17.580 comes from these,
00:09:18.960 you know,
00:09:19.540 science fiction and the predictions.
00:09:21.280 Almost the Mary Shelley's Frankenstein
00:09:24.120 understanding of how technology
00:09:26.320 will eventually kind of subordinate
00:09:28.620 our humanity.
00:09:29.780 I talk to some people
00:09:31.040 when we come to AI
00:09:32.800 and they say,
00:09:33.620 we're so far from AI
00:09:35.320 being anything that we need to worry about
00:09:37.500 that it's laughable.
00:09:38.880 This is just a program
00:09:40.040 that repeats things back to you
00:09:41.500 that you want to hear.
00:09:42.920 That really doesn't matter.
00:09:44.460 I have other people say,
00:09:45.600 well,
00:09:45.720 even if that is the only thing
00:09:47.040 that AI does,
00:09:48.120 that in and of itself
00:09:49.140 tends to program the human being
00:09:50.760 for certain responses
00:09:51.960 and it will short circuit
00:09:53.560 the way that we interact with things
00:09:55.780 because it will go ahead
00:09:57.180 and prompt us for answers
00:09:58.520 without human interactions
00:09:59.620 that we already don't expect.
00:10:01.060 And then there are some people
00:10:01.900 who think AI is like
00:10:03.220 already on its way
00:10:04.580 to ruling us,
00:10:05.720 you know,
00:10:06.720 with an iron fist, Terminator-style.
00:10:09.400 What,
00:10:09.820 when you look at the AI landscape,
00:10:11.800 what do you see out there?
00:10:13.380 What concerns you the most?
00:10:14.940 Where do you think we are on,
00:10:16.480 if this is some kind of timeline
00:10:18.600 towards the overlords here,
00:10:20.960 where do you think we sit on that?
00:10:23.540 You know,
00:10:23.820 my personal interest
00:10:25.000 and my academic training
00:10:26.180 is for the most part
00:10:28.520 in religious studies,
00:10:30.340 comparative religion,
00:10:31.180 with a healthy dose
00:10:32.980 of scientific instruction
00:10:34.920 on top of that.
00:10:36.460 But without going too far
00:10:38.080 into the weeds
00:10:38.640 on my personal background,
00:10:41.200 I am in a sense
00:10:42.640 kind of like a carpenter
00:10:43.600 with, you know,
00:10:44.960 a limited set of tools
00:10:46.420 and, you know,
00:10:47.140 I'm looking around
00:10:47.860 for nails with my hammer.
00:10:49.740 And so I do look at this
00:10:51.020 by and large
00:10:51.840 as a religious phenomenon
00:10:53.800 in the sense that
00:10:54.880 religion is the encoding
00:10:57.520 or the encapsulation
00:10:58.620 of worldviews
00:10:59.660 in text,
00:11:00.820 in ritual,
00:11:01.860 in social norms,
00:11:03.860 in morality.
00:11:05.260 And the,
00:11:06.760 again,
00:11:07.200 this isn't something
00:11:07.820 that started yesterday
00:11:08.820 or within the lifetime
00:11:10.820 of any person alive now.
00:11:12.600 This is something
00:11:13.320 that goes back
00:11:14.160 certainly to the Renaissance
00:11:15.800 and the rise of science
00:11:18.180 and the scientific method
00:11:19.220 and the Industrial Revolution.
00:11:20.800 But what we are seeing
00:11:22.740 is the fruition
00:11:23.920 of the overhaul
00:11:25.280 of worldviews,
00:11:27.260 the overhaul
00:11:28.060 of social norms
00:11:29.300 and the world
00:11:30.400 that any human being
00:11:32.000 can expect
00:11:32.580 to encounter
00:11:33.640 outside his door
00:11:34.500 or in the next decade
00:11:35.680 or so.
00:11:36.720 So as a religious phenomenon,
00:11:38.160 and I would argue
00:11:39.100 transhumanists don't like
00:11:40.560 to think of it this way,
00:11:42.000 but transhumanism,
00:11:43.120 post-humanism,
00:11:43.860 it is at the very least
00:11:46.260 a kind of replacement
00:11:48.020 for the content
00:11:50.700 of cognitive modules
00:11:52.000 that religion,
00:11:53.360 traditional religion,
00:11:54.620 once occupied.
00:11:56.640 And, you know,
00:11:58.220 to me,
00:11:58.680 I don't see things
00:11:59.880 in purely naturalistic
00:12:01.100 or scientific terms,
00:12:03.140 but just to kind of
00:12:04.920 begin there,
00:12:06.520 what you see
00:12:07.160 with the rise
00:12:07.760 of artificial intelligence
00:12:08.880 is in many ways
00:12:10.020 the,
00:12:10.940 maybe Nick Land
00:12:11.860 would say
00:12:12.200 the hyperstition
00:12:13.100 of the golem.
00:12:14.780 And what I mean by that
00:12:17.040 is that,
00:12:17.580 you know,
00:12:17.900 humanity going back
00:12:19.160 2,500 years at least
00:12:20.940 has dreamt
00:12:22.120 of creating automata
00:12:24.120 that resembled
00:12:25.540 human beings
00:12:26.560 or animals
00:12:27.300 and had a kind of
00:12:29.380 life of their own.
00:12:31.260 And those were,
00:12:33.200 as far as we know,
00:12:34.440 just mere myths.
00:12:36.100 But the fact
00:12:37.320 that those myths
00:12:38.280 were able to
00:12:39.800 apprehend a future
00:12:40.940 that, in fact,
00:12:41.660 is coming into realization
00:12:42.720 I think is very important.
00:12:44.640 What we're seeing
00:12:45.540 with artificial intelligence
00:12:46.680 is the realization
00:12:47.800 of the dream
00:12:48.720 of creating
00:12:49.860 a human mind
00:12:50.980 outside of
00:12:52.640 the human brain
00:12:53.840 or outside
00:12:54.380 of the human body,
00:12:55.740 outside of the human
00:12:56.520 experience.
00:12:58.240 I think that
00:12:59.040 the skeptics
00:12:59.920 of artificial intelligence
00:13:01.980 are probably
00:13:03.540 wise
00:13:05.180 to doubt
00:13:06.540 the overwhelming hype
00:13:08.300 coming out of
00:13:08.880 Silicon Valley
00:13:09.560 and Wall Street
00:13:10.620 and the various
00:13:11.120 boosters,
00:13:12.100 the doomsayers,
00:13:13.380 the promoters,
00:13:14.160 all of them
00:13:14.640 on the hype
00:13:15.140 end of the spectrum
00:13:15.940 are always going
00:13:16.880 to be shooting
00:13:17.380 farther than
00:13:18.520 the arrow
00:13:19.460 is able to
00:13:20.040 actually hit,
00:13:21.720 right?
00:13:22.820 But the people
00:13:23.520 that are on
00:13:24.020 the far side
00:13:25.480 of that doubter
00:13:26.360 end of the spectrum,
00:13:27.220 the people who say
00:13:27.680 the artificial intelligence
00:13:28.620 is nothing but a toy,
00:13:30.280 it's just code,
00:13:31.300 you know,
00:13:31.500 garbage in,
00:13:32.040 garbage out,
00:13:32.520 all that sort of stuff.
00:13:33.380 I think that
00:13:34.740 that was probably
00:13:35.560 a reasonable stance
00:13:37.580 in the 90s,
00:13:39.000 early 2000s.
00:13:40.500 I think anyone
00:13:41.300 paying attention
00:13:41.960 to the advancement
00:13:42.660 of large language models
00:13:43.880 or the various systems
00:13:45.080 that analyze
00:13:45.860 biological structures,
00:13:47.740 proteins,
00:13:48.440 genetic code,
00:13:49.860 anyone who's seen
00:13:51.200 the advancement
00:13:51.940 in control systems,
00:13:54.160 AI-powered control systems
00:13:55.420 for robotics,
00:13:56.760 the development
00:13:57.400 of humanoid robots
00:13:58.800 or drones
00:13:59.520 or any other
00:14:00.380 technology
00:14:01.900 that kind of fits
00:14:02.700 within that transhuman
00:14:03.800 or futurist milieu,
00:14:06.660 what we're seeing
00:14:08.340 undoubtedly
00:14:08.980 is a rapid acceleration
00:14:10.520 towards something.
00:14:12.100 I don't have
00:14:12.880 any hard predictions
00:14:13.820 about what that
00:14:15.140 actually will be,
00:14:17.080 but I,
00:14:17.720 and maybe this
00:14:18.360 kind of indicates
00:14:19.500 how limited
00:14:20.100 my point of view is
00:14:21.280 or maybe I'm
00:14:22.440 dead on target,
00:14:23.240 I'll let the listener
00:14:24.020 decide,
00:14:25.060 but what I do see
00:14:26.480 is this shift
00:14:28.080 in worldview
00:14:28.740 and social norms
00:14:30.280 and the kind of
00:14:31.100 boundaries between
00:14:32.240 what is and isn't
00:14:33.520 sacred and profane,
00:14:35.640 that as that
00:14:37.460 accelerates
00:14:38.320 alongside
00:14:39.160 the development
00:14:40.600 of these technologies,
00:14:42.140 in the same way
00:14:44.000 that you don't
00:14:44.600 have to believe
00:14:45.560 that Allah
00:14:47.200 came to Muhammad
00:14:48.920 and dictated
00:14:50.920 the Quran
00:14:51.980 through an angel
00:14:52.760 and that Allah
00:14:53.980 has, you know,
00:14:54.820 elevated the
00:14:56.400 Kaaba in Mecca
00:14:58.280 to a position
00:14:59.360 of special privilege,
00:15:00.300 you don't have
00:15:00.960 to believe
00:15:01.320 any of that
00:15:01.880 on a metaphysical
00:15:02.920 level to know
00:15:03.720 that Muslims
00:15:05.080 have tremendous
00:15:06.080 social, economic,
00:15:07.520 and military power
00:15:08.520 given their position
00:15:10.180 in the world
00:15:10.880 and the same
00:15:11.560 could be said
00:15:12.000 of Christianity,
00:15:12.900 Judaism,
00:15:13.640 so on and so forth.
00:15:15.280 You don't have
00:15:15.980 to believe
00:15:16.500 the predictions
00:15:17.440 and the dogmas
00:15:18.540 of any religion
00:15:19.420 to know
00:15:19.960 that religion
00:15:20.940 has tremendous
00:15:22.420 force
00:15:23.440 that can't be denied
00:15:24.700 and so with the rise
00:15:26.120 of these techno-religions
00:15:27.280 and it's very heterodox,
00:15:28.280 you could say
00:15:28.620 it's very polytheistic,
00:15:30.460 the rise of these
00:15:31.380 techno-religions
00:15:32.120 is already having
00:15:33.300 an impact
00:15:33.920 through the kind
00:15:34.840 of waves
00:15:35.660 of propaganda
00:15:36.400 changing the way
00:15:37.720 people perceive
00:15:38.500 what's possible
00:15:39.460 in the future,
00:15:40.460 what's coming
00:15:41.000 in the future,
00:15:41.660 what it means
00:15:42.560 to be human
00:15:43.100 in the future,
00:15:44.180 and so really
00:15:45.180 with just the technologies
00:15:46.420 we have now,
00:15:47.460 the smartphone,
00:15:48.720 digital currency,
00:15:50.420 you know,
00:15:50.640 kind of clunky
00:15:51.360 large language models,
00:15:53.400 drone control systems,
00:15:54.960 surveillance systems,
00:15:55.900 all of these things,
00:15:57.260 non-invasive
00:15:58.160 brain-computer interfaces,
00:15:59.880 just those technologies,
00:16:01.300 if the progress
00:16:02.060 stops right here,
00:16:03.820 as humans become,
00:16:05.000 in my opinion,
00:16:06.000 more and more
00:16:06.720 dumbed down,
00:16:07.460 especially in the West,
00:16:09.200 and as these technologies
00:16:10.240 become more and more
00:16:11.280 sophisticated
00:16:11.780 and able to engage,
00:16:13.680 control,
00:16:14.400 and kind of activate
00:16:15.200 certain aspects
00:16:16.000 of the human mind,
00:16:17.340 the human body,
00:16:18.640 then we're going
00:16:19.900 to see some,
00:16:20.860 we are already seeing
00:16:22.180 some version
00:16:23.320 of those religious prophecies
00:16:25.020 playing out,
00:16:26.300 and I think that
00:16:27.880 that's really
00:16:28.540 what's to be reckoned with,
00:16:30.040 is that the way
00:16:31.460 in which we approach life
00:16:33.260 is always going to be
00:16:34.120 dictated by worldview,
00:16:35.340 morality,
00:16:36.180 and the kind of
00:16:36.740 day-to-day rituals
00:16:38.200 that our religious
00:16:39.920 conception dictates,
00:16:41.560 and it's happening
00:16:43.200 very, very quickly
00:16:44.140 that the smartphone
00:16:45.660 is far more important
00:16:46.880 than any religious icon
00:16:48.360 or holy text
00:16:49.260 for many,
00:16:50.580 if not most people
00:16:51.500 in the modern world,
00:16:52.560 and the AI
00:16:54.560 represents this
00:16:56.320 sort of
00:16:56.920 disincarnate
00:16:58.200 personality
00:16:59.120 that is able
00:17:00.720 to interact
00:17:01.780 with these people,
00:17:03.840 and it very closely
00:17:05.140 mirrors
00:17:05.660 what an atheist
00:17:06.800 would see
00:17:07.560 in religion,
00:17:08.780 right,
00:17:09.040 like these supposed
00:17:10.120 non-corporeal beings,
00:17:13.600 but the difference
00:17:15.260 being that
00:17:15.920 it's in your face,
00:17:17.200 it's happening now,
00:17:18.120 it can speak,
00:17:19.360 and it can't be denied
00:17:20.400 that it speaks,
00:17:21.060 you might deny
00:17:22.040 how intelligent it is,
00:17:23.360 and I agree
00:17:24.440 in many ways,
00:17:25.320 you might deny
00:17:26.300 how accurate it is,
00:17:27.460 and that's readily apparent,
00:17:30.000 but how powerful it is,
00:17:31.880 I don't think
00:17:32.260 there's any denying it,
00:17:33.260 it's extremely powerful,
00:17:34.820 it's extremely transformative,
00:17:36.320 and if these systems
00:17:37.660 do continue
00:17:38.380 to become more
00:17:39.020 and more sophisticated
00:17:39.680 to the point
00:17:40.760 that it's kind of
00:17:42.160 impossible to deny
00:17:43.280 their accuracy
00:17:44.100 or their wisdom
00:17:45.080 or authority,
00:17:45.800 so to speak,
00:17:46.760 then anyone
00:17:47.760 who is of
00:17:48.780 a traditional mindset
00:17:49.780 is faced
00:17:51.140 with a real quandary,
00:17:52.180 I think that's
00:17:52.680 a quandary
00:17:53.260 that needs to be
00:17:53.720 taken very seriously.
00:17:55.720 You know,
00:17:56.680 one of the things
00:17:57.500 that a lot of people
00:17:58.480 use to differentiate
00:17:59.880 the adult mind
00:18:01.160 from the child mind
00:18:02.260 is the idea
00:18:03.160 that the adult
00:18:04.480 can recognize
00:18:05.680 a fictional scenario,
00:18:08.480 right,
00:18:08.720 that the child
00:18:09.460 watches a television show,
00:18:11.400 they believe
00:18:12.520 that part of that's true,
00:18:13.900 they have a problem
00:18:15.060 differentiating it
00:18:16.140 until they get older,
00:18:17.200 and that's why
00:18:17.900 you have to be careful
00:18:18.600 about what you put
00:18:19.400 in a child's television show
00:18:21.060 because they have
00:18:21.600 a hard time
00:18:22.060 separating those things out,
00:18:24.820 but I think really
00:18:26.060 the next level
00:18:27.440 of development,
00:18:28.680 the point at which
00:18:29.600 you become a real adult,
00:18:30.900 is when you realize
00:18:31.680 that actually
00:18:32.220 the stories
00:18:32.860 are way more true
00:18:34.120 than the material reality
00:18:38.880 that you've been
00:18:39.580 separating them from,
00:18:41.240 and ultimately
00:18:41.960 that's the trick
00:18:42.900 that I think AI
00:18:43.700 is playing
00:18:44.500 on a lot of people,
00:18:45.680 is that they believe
00:18:47.680 that they have
00:18:48.820 the ability
00:18:49.380 to go ahead
00:18:49.960 and separate
00:18:51.040 the truth
00:18:51.820 from the fiction,
00:18:52.680 they have the ability
00:18:53.460 to go ahead
00:18:55.060 and recognize
00:18:56.060 when they're consuming
00:18:57.200 something as entertainment
00:18:58.260 and when it's actually
00:18:59.300 sharing with them
00:19:00.360 something that's true
00:19:02.460 about the world,
00:19:03.420 but ultimately
00:19:03.980 the stories
00:19:04.600 are the things
00:19:05.080 that are actually
00:19:05.640 sharing what's true
00:19:06.980 about the world,
00:19:07.640 and when you start
00:19:08.340 interacting
00:19:09.020 with an artificial
00:19:10.520 intelligence,
00:19:11.180 even as you say
00:19:12.080 one that is crude,
00:19:13.260 one that may not
00:19:14.380 be impressive,
00:19:15.140 may not deliver
00:19:15.700 that much original
00:19:16.660 content or understanding
00:19:18.400 or depth
00:19:19.320 or have that much agency,
00:19:21.060 ultimately you're
00:19:22.100 engaging in an
00:19:23.760 artificial reality
00:19:25.060 that will shape
00:19:26.040 your own.
00:19:26.800 You don't actually
00:19:28.300 have the ability,
00:19:29.840 no matter how intelligent
00:19:30.960 you think you are,
00:19:32.060 no matter how developed
00:19:33.200 you think you are,
00:19:34.020 you don't have the ability
00:19:34.800 to actually separate
00:19:36.120 those experiences
00:19:37.100 in your mind.
00:19:38.080 You're not really
00:19:38.580 designed to do that,
00:19:40.180 and in fact,
00:19:41.080 the idea that you think
00:19:43.140 you're an intelligent
00:19:44.040 consumer of these
00:19:44.940 things is itself
00:19:46.800 a kind of a lie
00:19:48.220 that allows you
00:19:49.020 to engage even more
00:19:50.240 fully in its own
00:19:52.180 brainwashing.
00:19:53.060 It's like so many
00:19:53.640 people think that,
00:19:54.960 well, because I'm
00:19:55.640 an intelligent
00:19:56.260 and rational person,
00:19:57.280 I can't be tricked
00:19:58.560 by advertising
00:19:59.580 or social media
00:20:00.600 because I am the one
00:20:02.260 in control of the
00:20:03.160 information.
00:20:03.900 I'm the one shaping
00:20:04.500 these things,
00:20:05.320 not recognizing the way
00:20:06.740 that that actually
00:20:07.520 ends up shaping
00:20:08.500 them in spite of that
00:20:10.460 or actually because of
00:20:11.660 that.
00:20:11.860 The more intelligent
00:20:12.900 of a consumer
00:20:13.600 they become of those
00:20:14.460 products,
00:20:15.020 the more it gains
00:20:16.340 the ability to program
00:20:17.360 them.
00:20:18.160 And I think ultimately
00:20:18.860 that's the danger of AI
00:20:20.300 is not so much that,
00:20:21.640 you know,
00:20:22.440 it goes ahead
00:20:23.280 and becomes this
00:20:25.400 hyper-intelligent
00:20:27.080 Skynet that nukes
00:20:28.340 the world so much
00:20:30.100 as it becomes able
00:20:31.820 to spit back
00:20:32.540 people's own fantasies
00:20:33.680 about destroying
00:20:34.360 the world to them
00:20:35.600 until the point
00:20:36.120 where they manifest
00:20:37.360 them in the real world.
00:20:38.520 You know,
00:20:40.400 the two fictional works
00:20:42.560 or films that I think
00:20:43.940 probably kind of paint
00:20:46.200 my vision of the future
00:20:48.040 the most,
00:20:49.500 The Matrix,
00:20:50.760 of course,
00:20:51.260 the kind of
00:20:51.600 technognostic tale
00:20:53.020 and the film
00:20:55.780 Idiocracy by Mike Judge.
00:20:58.300 And, you know,
00:20:59.000 I oftentimes say
00:20:59.920 that my fear
00:21:00.960 is not that
00:21:01.720 The Matrix will,
00:21:03.020 the AI will come alive
00:21:04.260 and become this
00:21:05.180 sort of digital
00:21:05.780 demiurge
00:21:06.660 and enslave humanity
00:21:09.720 through its wiles,
00:21:11.800 although there's
00:21:12.580 some element of truth
00:21:13.760 to that mythology.
00:21:15.660 I'm much more worried
00:21:16.800 about kind of
00:21:17.620 walking out my door
00:21:18.640 and literally everyone
00:21:19.980 is idiotic
00:21:21.740 and the machines
00:21:22.980 are similarly idiotic
00:21:25.120 but slightly more intelligent
00:21:26.900 than the people
00:21:27.620 and are dictating behavior.
00:21:29.220 Sort of like if you're
00:21:30.160 at the self-checkout kiosk
00:21:32.320 at Kroger came alive
00:21:34.020 and began walking around
00:21:35.660 and telling you
00:21:36.360 where to put your groceries
00:21:37.600 and where to put your money.
00:21:39.780 You know,
00:21:40.060 something that is
00:21:41.000 happening right now
00:21:42.160 that I think
00:21:42.620 should probably alarm everyone.
00:21:44.280 I've tried to drive this home
00:21:45.740 as best I can
00:21:47.220 but it's difficult to do
00:21:48.560 for whatever reason.
00:21:50.880 The rise of e-learning,
00:21:54.060 of digital systems
00:21:55.280 used to educate young people,
00:21:58.020 that's been in the works
00:21:59.280 for a long time
00:22:00.160 when I taught
00:22:01.400 and when I was an instructor
00:22:03.320 in a college,
00:22:05.060 you could see
00:22:05.860 how much it had already invaded
00:22:07.480 and this was
00:22:08.080 quite a few years ago
00:22:09.800 but the pandemic,
00:22:11.840 of course,
00:22:12.180 increased the popularity
00:22:13.260 or at least the use
00:22:14.220 of e-learning systems.
00:22:16.440 When I was in grad school,
00:22:18.480 you already had
00:22:19.340 the Khan Academy,
00:22:20.880 right,
00:22:21.120 created by Sal Khan.
00:22:22.480 It was a very,
00:22:23.440 very popular way
00:22:24.360 for people to wrap their head
00:22:25.500 around any subject
00:22:26.420 but of course,
00:22:27.420 all that's human produced
00:22:28.600 and human managed.
00:22:30.800 What we're seeing now
00:22:31.880 is this push
00:22:32.760 kind of beginning
00:22:33.840 with people like Sal Khan
00:22:35.280 who in his prediction
00:22:37.340 of the future,
00:22:38.080 his desire for the future
00:22:39.260 openly says
00:22:40.160 every student
00:22:41.400 should have an AI tutor
00:22:43.000 and he's working
00:22:44.500 with OpenAI
00:22:45.880 to incorporate,
00:22:47.120 they've already done so,
00:22:48.460 to incorporate
00:22:49.140 the GPT,
00:22:51.320 large language model,
00:22:52.600 into the Khan Academy
00:22:55.280 e-learning systems
00:22:57.280 and they're deploying them
00:22:59.340 in colleges,
00:23:00.420 universities,
00:23:01.080 and even high schools
00:23:02.080 right now
00:23:02.780 across the country
00:23:04.080 and across the world
00:23:04.920 and it wouldn't be
00:23:07.200 that big of a deal
00:23:07.980 if it was just kind of
00:23:08.960 isolated to him
00:23:10.300 and his company
00:23:11.680 and his work
00:23:12.540 but you have Bill Gates,
00:23:14.160 of course,
00:23:14.600 Microsoft,
00:23:15.400 investing in OpenAI,
00:23:16.820 closely connected to all this.
00:23:18.240 He's also pushing
00:23:19.140 the narrative
00:23:19.580 that every child on Earth
00:23:21.660 should have an AI tutor
00:23:23.560 and, of course,
00:23:24.540 Sam Altman,
00:23:25.220 Greg Brockman
00:23:25.820 and OpenAI
00:23:26.380 pushing the same sort of thing
00:23:28.040 and then you have
00:23:29.480 Elon Musk recently
00:23:31.240 and going back,
00:23:33.100 he has a history of this
00:23:33.900 but, you know,
00:23:34.440 talking about
00:23:35.120 every child will have
00:23:36.240 a kind of AI Einstein
00:23:37.440 to instruct them
00:23:38.660 and maybe moving farther right
00:23:42.160 on the political spectrum,
00:23:43.680 people like Mark Andreessen
00:23:45.320 saying exactly the same thing.
00:23:47.740 Now, the content of those systems
00:23:49.080 is going to differ
00:23:49.740 from community to community
00:23:51.260 and the effectiveness
00:23:52.020 of those systems
00:23:53.140 is going to differ
00:23:54.240 but what's absolutely happening
00:23:56.200 is the development
00:23:57.840 of a human-AI symbiosis,
00:24:00.240 a kind of human-machine relationship
00:24:02.040 at a very, very, very young age
00:24:04.380 and as people were kind of taken
00:24:06.900 by surprise
00:24:08.060 by the rise of the transgender movement
00:24:10.200 or the rise of what, you know,
00:24:11.900 is oftentimes called woke,
00:24:14.280 a lot of that began
00:24:15.440 in the universities
00:24:16.200 and then just, you know,
00:24:17.620 expanded out from there
00:24:19.060 into the rest of the society
00:24:20.360 and so while your average boomer
00:24:22.740 might be able to scoff
00:24:23.700 at the idea of every child
00:24:26.300 which will, of course,
00:24:27.120 translate into just many
00:24:28.160 or maybe most children
00:24:29.420 becoming human-AI symbiotes
00:24:32.840 and people of our generation
00:24:35.500 are probably more open
00:24:37.660 to the possibility
00:24:38.420 but still, you know,
00:24:39.680 thinking back to the computer systems
00:24:41.580 of the 90s
00:24:42.720 and the early 2000s
00:24:43.900 maybe kind of shrug it off,
00:24:46.020 what will undoubtedly occur
00:24:48.020 is that that younger generation
00:24:49.940 will see this as normal.
00:24:51.500 It will just simply fade
00:24:52.500 into the background
00:24:53.080 in the same way
00:24:53.780 that the smartphone has
00:24:54.940 and so some part
00:24:56.760 of their cognition
00:24:57.560 will be externalized
00:24:58.780 not just in other human thinkers
00:25:01.200 who have written things down
00:25:02.400 so that they can then absorb,
00:25:04.240 analyze,
00:25:05.020 turn it over on their own
00:25:06.360 but is actively changing
00:25:08.920 the content
00:25:09.640 of what they're learning
00:25:10.720 and shaping that content
00:25:12.480 to their own
00:25:14.240 kind of psychological profile
00:25:15.860 and creating this dependency
00:25:17.880 so that they're basically
00:25:19.380 one dead battery
00:25:20.580 away from being a moron
00:25:21.920 and I think that this is
00:25:23.840 a really, really important development
00:25:25.440 on top of everything
00:25:26.340 that is happening
00:25:26.960 in biomedicine
00:25:27.840 and, of course,
00:25:28.680 military technology
00:25:29.700 and the way in which
00:25:30.700 AI is being infused
00:25:31.960 into big corporations.
00:25:33.500 If the youngest generation
00:25:35.200 becomes dependent on AI
00:25:36.800 and comes to see AI
00:25:38.040 as a kind of companion,
00:25:39.520 especially as a kind
00:25:40.340 of spiritual companion,
00:25:42.100 then not only have you had
00:25:43.200 a change in the worldview
00:25:44.260 but you've also had
00:25:45.120 a tremendous shift
00:25:46.620 in what the center,
00:25:48.220 like the highest authority is
00:25:50.140 and smarter children,
00:25:52.780 I think more disciplined children,
00:25:54.180 children who are wealthier
00:25:55.500 and kind of already come
00:25:57.080 from families of power
00:25:58.200 and privilege,
00:25:59.260 they're going to fare better
00:26:00.600 most likely
00:26:01.600 but as you move down
00:26:03.340 the socioeconomic ladder
00:26:04.700 and the ladder of
00:26:05.480 maybe even cognitive ability,
00:26:07.800 what I think you'll see
00:26:08.680 more and more is,
00:26:09.760 in fact,
00:26:10.080 something like that,
00:26:11.220 something between
00:26:11.920 The Matrix and Idiocracy.
00:26:44.260 You know,
00:26:45.000 as someone who was
00:26:46.460 teaching public high school
00:26:47.880 just a few years ago,
00:26:49.220 this is something
00:26:49.840 that I'm very familiar with.
00:26:51.360 I taught through the pandemic.
00:26:53.180 And you can already
00:26:54.440 see this happening.
00:26:55.740 There was already
00:26:56.980 a real danger
00:26:58.000 in the fact that
00:26:59.820 basically all of the tests
00:27:02.720 that kids had to take
00:27:04.300 had in them,
00:27:05.680 had accompanying them,
00:27:07.120 their own computer programs
00:27:08.680 that basically drilled them
00:27:10.000 24-7
00:27:10.940 on how to take those tests
00:27:12.800 and how to succeed on them.
00:27:14.620 And so children
00:27:15.580 stopped learning
00:27:16.880 from teachers
00:27:17.680 a long time ago
00:27:18.680 because actually teaching
00:27:20.340 a student
00:27:21.400 is far more difficult
00:27:22.720 than just letting them
00:27:24.380 plug into the computer
00:27:25.660 over and over again
00:27:26.760 and repeat the same process
00:27:28.460 to make sure
00:27:29.080 that they can go ahead
00:27:29.860 and pass the test.
00:27:31.140 Many teachers have,
00:27:32.120 of course,
00:27:32.320 brought this up.
00:27:32.900 Teaching the test
00:27:33.580 is a very old complaint
00:27:35.220 for teachers.
00:27:35.800 But this is something
00:27:36.740 that's been fully embraced
00:27:37.720 at this point
00:27:38.280 by the system
00:27:39.060 that we want to go ahead
00:27:40.160 and train every student
00:27:41.660 to think the way
00:27:43.340 that the test is given.
00:27:45.280 And that means think the way
00:27:46.700 that the computer applies
00:27:48.280 all of these different systems.
00:27:50.360 And when you get kids
00:27:52.000 into any moment
00:27:54.400 where they have to actually
00:27:55.620 start reasoning for themselves
00:27:57.200 when they need to start
00:27:58.120 pulling any information
00:28:00.220 for themselves,
00:28:01.000 they're completely unable
00:28:02.080 to do it.
00:28:02.880 Most of them are completely
00:28:03.940 unable to do any form of research.
00:28:06.000 They can't do anything
00:28:07.200 outside of Googling.
00:28:08.460 They think that's
00:28:08.980 the ultimate authority.
00:28:10.200 When they find it,
00:28:11.040 they have no ability
00:28:11.680 to vet the information.
00:28:12.780 The algorithm vets it for them.
00:28:14.200 If it's at the top,
00:28:14.900 it's right.
00:28:15.600 If it's not at the top,
00:28:16.900 they don't bother
00:28:17.340 to look for it.
00:28:18.400 Oftentimes,
00:28:18.900 they're just copying,
00:28:19.740 pasting off the Google summary.
00:28:21.080 They don't even go
00:28:21.600 to the source
00:28:22.200 that it's linking to.
00:28:24.260 And so the idea
00:28:25.160 that they would,
00:28:25.900 you know,
00:28:26.460 go to a library,
00:28:27.980 look at multiple sources,
00:28:29.580 attempt to put something
00:28:30.420 together like that
00:28:31.260 is just entirely foreign.
00:28:33.540 And I'm not just saying
00:28:34.420 that is like the old man
00:28:35.680 who used to ride
00:28:36.280 a brontosaurus to school.
00:28:37.800 I'm saying that is like,
00:28:39.520 we're losing the ability
00:28:41.260 to have any friction
00:28:42.960 in the process.
00:28:44.280 Everything has been centralized
00:28:45.420 into the ability
00:28:47.540 of the algorithm
00:28:48.300 to kind of feed it back
00:28:49.420 to somebody.
00:28:50.740 And so they've lost
00:28:51.600 the ability to do
00:28:52.640 any of that
00:28:53.180 without,
00:28:53.620 you know,
00:28:54.320 that intermediary,
00:28:55.440 that digital intermediary
00:28:56.540 kind of feeding it
00:28:57.640 back to them.
00:28:58.540 And I think you're right
00:28:59.620 that that puts you
00:29:00.400 in a real disaster
00:29:01.940 because,
00:29:02.640 you know,
00:29:03.520 the thing that,
00:29:04.420 you know,
00:29:05.080 I just got done
00:29:05.760 writing a book
00:29:07.060 about the total state,
00:29:08.060 but my prediction
00:29:08.860 is that eventually
00:29:09.620 this thing collapses
00:29:11.260 not because
00:29:11.960 we all gain the will
00:29:13.700 to go ahead
00:29:14.280 and knock it over,
00:29:15.240 but ultimately
00:29:16.200 because the society
00:29:18.000 that we're training
00:29:18.900 is completely unable
00:29:20.080 to upkeep the systems
00:29:21.600 that it's building.
00:29:23.160 And eventually,
00:29:23.700 like you said,
00:29:24.180 if you go ahead
00:29:24.840 and degrade
00:29:26.020 the level
00:29:26.800 of the human capital,
00:29:28.120 you're really just relying
00:29:29.380 on the advancement
00:29:30.300 of the technology
00:29:31.120 to go ahead
00:29:31.600 and paper over that.
00:29:33.480 And if at some point
00:29:34.280 you fall behind that
00:29:35.140 and there's a major failure,
00:29:36.760 well,
00:29:37.000 then no one knows
00:29:37.640 how to fix it
00:29:38.300 because all the people
00:29:39.140 involved in its creation,
00:29:41.060 its actual,
00:29:42.160 that have the skill set
00:29:43.520 necessary to kind of
00:29:44.660 reason that into being
00:29:46.120 and go ahead
00:29:46.680 and create it
00:29:48.180 have fallen apart.
00:29:49.900 Do you foresee
00:29:50.660 a moment
00:29:51.320 where the gap
00:29:52.480 between kind of
00:29:53.340 the degradation
00:29:53.820 of human capital
00:29:54.860 and the rise
00:29:56.160 of kind of
00:29:56.900 digital advancement,
00:29:58.620 there's a gap there
00:29:59.600 that allows
00:30:00.140 for this whole system
00:30:01.020 to start coming apart
00:30:02.020 at the seams?
00:30:03.520 You know,
00:30:04.100 there's a real faith
00:30:05.040 among,
00:30:05.940 say,
00:30:06.700 what you could
00:30:07.280 just loosely call
00:30:08.280 transhumanist
00:30:09.080 or accelerationist,
00:30:10.540 even post-humanist.
00:30:11.980 There's this faith
00:30:12.920 that the system itself,
00:30:14.620 as the technology
00:30:15.820 becomes more sophisticated,
00:30:17.120 as energy
00:30:18.320 becomes easier
00:30:19.880 to produce,
00:30:21.380 whether it be nuclear
00:30:22.240 or more
00:30:23.380 kind of
00:30:24.380 fanciful
00:30:26.240 means,
00:30:27.940 and then,
00:30:28.380 you know,
00:30:28.620 also just the
00:30:29.700 efficiency
00:30:30.660 of the systems themselves.
00:30:31.780 Like,
00:30:31.920 right now,
00:30:32.660 it requires,
00:30:33.360 you know,
00:30:33.600 Walmart-sized
00:30:34.580 data centers
00:30:35.660 to train
00:30:36.280 a large language model,
00:30:37.980 but that process
00:30:38.600 is becoming
00:30:38.940 more and more efficient,
00:30:40.120 and the models
00:30:40.920 are indeed
00:30:41.640 becoming more sophisticated,
00:30:42.560 so there's this faith
00:30:43.400 that the system itself,
00:30:45.560 with human input,
00:30:47.380 will have enough momentum
00:30:48.640 to take off,
00:30:49.620 even,
00:30:50.380 you know,
00:30:50.760 especially among
00:30:51.360 the kind of post-human
00:30:52.480 and accelerationist,
00:30:55.160 effective accelerationist set,
00:30:57.220 that it will have
00:30:58.640 its own kind of life force.
00:31:00.740 It won't just be
00:31:01.840 a tool anymore.
00:31:02.840 It will be a new species,
00:31:04.380 or some would say
00:31:05.260 it will be a kind of
00:31:06.100 new spiritual being.
00:31:08.560 I, myself,
00:31:09.620 I remain always
00:31:12.260 in the state
00:31:13.020 of constant agnosticism
00:31:14.540 about the future,
00:31:15.540 even about next month,
00:31:16.680 let alone in 10 years.
00:31:18.800 I don't know
00:31:19.680 how much energy
00:31:20.780 they'll be able
00:31:21.280 to extract
00:31:22.040 and put into
00:31:22.820 these systems.
00:31:23.460 I don't know
00:31:24.120 how well
00:31:25.240 they'll be able
00:31:25.760 to sustain
00:31:26.600 these systems.
00:31:28.460 It's quite possible
00:31:29.860 that for the rest
00:31:30.560 of our lives,
00:31:31.360 we're going to be
00:31:32.000 watching this monster
00:31:33.280 grow and grow and grow,
00:31:35.320 and it will be
00:31:36.120 very difficult
00:31:36.720 to tell children
00:31:37.940 who,
00:31:38.780 as they're becoming adults,
00:31:39.840 that this is not
00:31:40.960 what you should
00:31:41.640 put your faith in,
00:31:42.620 that this is a false god.
00:31:44.300 I think that that's,
00:31:45.340 I'm not saying
00:31:46.200 that is going to happen.
00:31:47.620 I'm not even going to say
00:31:48.240 that's the most likely outcome,
00:31:49.960 but I do think
00:31:51.340 that it's a possibility
00:31:52.200 that needs to at least
00:31:53.140 be accounted for
00:31:54.400 and taken seriously,
00:31:55.440 just as the collapse
00:31:57.320 of the system
00:31:57.980 needs to be taken seriously.
00:32:00.080 Some societies,
00:32:00.940 it's probably kind of inevitable
00:32:02.660 that you're going to see
00:32:03.300 a series of growth
00:32:04.540 and collapse, right?
00:32:05.820 Third world countries
00:32:06.860 all across Africa
00:32:07.920 and the poorer areas
00:32:09.180 of South Asia.
00:32:11.100 I think that it's much more likely
00:32:12.700 that you'll see
00:32:13.260 the rise and fall
00:32:14.340 of different kind of,
00:32:16.020 you know,
00:32:17.540 technotronic,
00:32:18.340 futuristic societies
00:32:20.080 that then,
00:32:20.600 you know,
00:32:21.340 they're not able
00:32:22.240 to turn the gears
00:32:22.920 fast enough,
00:32:23.480 they're not able
00:32:23.880 to, you know,
00:32:24.660 respond to cyber attacks
00:32:26.640 or just failures
00:32:27.460 fast enough
00:32:28.020 and you see
00:32:29.180 that kind of,
00:32:30.120 that rise and fall pattern,
00:32:32.620 but in America,
00:32:33.660 in China,
00:32:34.680 in Russia,
00:32:35.700 smaller but very powerful
00:32:37.000 nations like Israel,
00:32:38.440 you know,
00:32:38.900 across Europe,
00:32:39.480 I think that
00:32:40.060 there's every reason
00:32:41.960 to think that
00:32:42.980 not only because
00:32:43.900 of the vast amount
00:32:45.640 of capital poured
00:32:46.460 into these systems
00:32:47.200 and so therefore
00:32:47.900 an incentive
00:32:48.740 to make them work
00:32:49.500 even if they're not
00:32:50.220 really working
00:32:50.940 all that well.
00:32:52.420 Of course,
00:32:53.060 the large amount
00:32:54.640 of human capital
00:32:55.400 in these systems
00:32:56.080 is very intelligent people
00:32:57.400 that are being incentivized
00:32:58.720 to constantly maintain them
00:33:00.060 and then more and more
00:33:01.260 a propagandized society
00:33:02.600 that is becoming
00:33:03.600 very faithful
00:33:04.500 that these systems
00:33:05.180 are going to produce
00:33:06.280 better and better
00:33:07.100 health outcomes,
00:33:08.880 better and better
00:33:09.360 economic outcomes
00:33:10.280 and of course
00:33:10.800 position these different countries
00:33:12.580 for greater and greater
00:33:14.000 geopolitical dominance
00:33:15.240 through military technology
00:33:16.460 and other means.
00:33:17.880 I think that there's,
00:33:18.920 again,
00:33:19.420 in advanced nations,
00:33:20.680 there's every reason
00:33:21.800 to think that
00:33:22.840 for most of the rest
00:33:23.980 of our lives
00:33:24.580 or for the rest
00:33:25.240 of our lives,
00:33:25.900 assuming we live out
00:33:26.840 into our old age,
00:33:28.820 but this is going
00:33:29.640 to be a constant presence
00:33:30.880 and so the question
00:33:32.320 is going to,
00:33:33.320 if that's the case,
00:33:34.160 the question is,
00:33:35.240 do I submit
00:33:36.160 to this system?
00:33:37.180 Do I find some
00:33:38.260 compromise
00:33:38.860 with the system,
00:33:40.060 with this new
00:33:40.880 kind of techno-religious
00:33:41.920 civilization?
00:33:43.060 Do we find ways
00:33:44.320 to overcome it
00:33:45.360 and dominate it
00:33:46.280 so that it's now
00:33:47.040 like a techno-traditionalist system
00:33:48.700 or do people
00:33:49.920 find ways
00:33:50.720 to kind of
00:33:51.500 isolate themselves
00:33:52.440 to pull back
00:33:53.540 to retreat
00:33:54.120 and regroup
00:33:55.720 and maintain
00:33:57.120 semi-traditional
00:33:58.880 ways of life
00:33:59.800 that could perhaps
00:34:00.740 endure the collapse
00:34:01.900 of a system like that?
00:34:03.640 So that's a kind of
00:34:04.500 a multi-pronged
00:34:06.700 non-answer
00:34:07.360 to your question,
00:34:08.180 but I think
00:34:09.040 we're in such
00:34:09.600 a chaotic moment
00:34:10.400 right now.
00:34:10.880 I mean,
00:34:11.000 the whole concept
00:34:11.740 of the singularity,
00:34:13.340 there's a lot
00:34:14.300 of different definitions
00:34:15.060 of it,
00:34:15.520 but I think
00:34:16.120 the most important
00:34:16.860 is kind of
00:34:17.800 the original intent
00:34:18.860 behind using
00:34:19.840 the singularity
00:34:20.760 as a metaphor
00:34:22.060 for technological
00:34:23.080 development.
00:34:24.420 You know,
00:34:25.020 it was first coined,
00:34:26.820 I guess,
00:34:27.060 by John von Neumann,
00:34:29.320 but it was really
00:34:29.840 fleshed out
00:34:30.400 by the sci-fi writer
00:34:31.900 Vernor Vinge,
00:34:33.420 and he saw
00:34:34.780 at least four
00:34:35.880 different possible
00:34:36.780 types of singularities,
00:34:38.220 but the basic idea
00:34:39.480 is that,
00:34:40.400 like with a black hole,
00:34:41.740 you have this
00:34:42.640 exponential increase
00:34:43.700 in the case
00:34:44.100 of a black hole
00:34:44.720 of mass collapsing
00:34:47.120 in on itself
00:34:47.820 and an increase
00:34:49.320 in force and gravity
00:34:50.200 disappearing beyond
00:34:51.400 an event horizon
00:34:52.380 that no observer
00:34:53.500 could ever get past.
00:34:55.820 And so that the singularity,
00:34:57.400 in the same way
00:34:57.920 that technology
00:34:58.660 develops exponentially,
00:35:01.460 that it creates
00:35:02.900 a kind of event horizon
00:35:04.060 that no human mind
00:35:05.300 can comprehend,
00:35:06.040 no human mind
00:35:06.800 can really see beyond.
00:35:08.160 You can't predict
00:35:08.980 the future
00:35:09.660 with any reasonable
00:35:10.780 accuracy or confidence.
00:35:13.060 And I think that
00:35:14.500 even if
00:35:15.740 it doesn't turn out
00:35:17.280 like Ray Kurzweil's
00:35:18.340 version of it,
00:35:18.900 where it's more
00:35:19.300 kind of a fleshed out,
00:35:20.200 actualized singularity,
00:35:22.080 that we are entering
00:35:24.220 a realm,
00:35:25.620 again,
00:35:25.900 as humans are more
00:35:26.620 and more dumbed down,
00:35:27.640 our ability
00:35:28.280 to actually
00:35:29.780 assess what's happening
00:35:31.300 right now,
00:35:31.820 let alone what's going
00:35:32.440 to happen in the future,
00:35:33.220 is being degraded.
00:35:34.180 And these systems
00:35:35.460 are becoming
00:35:36.160 more and more
00:35:36.820 diverse,
00:35:38.380 more and more
00:35:38.960 sophisticated,
00:35:40.500 more and more prevalent
00:35:41.320 in society.
00:35:43.080 And so we're at a point
00:35:44.860 where a certain degree
00:35:45.880 of agnosticism
00:35:46.960 about the future,
00:35:48.260 a certain degree
00:35:48.860 of comfort
00:35:49.780 with ambiguity
00:35:50.940 is going to be necessary
00:35:52.340 because,
00:35:53.460 you know,
00:35:54.020 some of the finest minds
00:35:55.820 that I know personally
00:35:57.080 don't have really clear
00:35:59.100 pictures of what
00:36:00.140 the future looks like.
00:36:01.580 And as I go over
00:36:02.600 all of these
00:36:03.020 futurist predictions
00:36:03.860 that you see right now,
00:36:05.540 they're incredibly diverse.
00:36:06.920 As Nick Bostrom once said,
00:36:09.160 they're as confident
00:36:10.040 as they are diverse.
00:36:12.220 And so I don't see
00:36:13.600 any real reason
00:36:14.880 to become too fixated
00:36:16.600 on one single vision
00:36:18.360 of the future.
00:36:19.240 I think it's really important
00:36:20.240 to at least have
00:36:21.380 a kind of,
00:36:22.300 a few different options
00:36:23.660 and a few different plans
00:36:25.760 in effect
00:36:26.500 for any given future
00:36:28.160 that might be coming at us.
00:36:30.080 So we've talked a lot
00:36:31.260 about the,
00:36:31.880 I guess you could say,
00:36:32.740 software side of this
00:36:34.560 where the AI
00:36:35.420 and the reprogramming
00:36:37.300 of the human mind
00:36:38.740 through these different interactions,
00:36:40.460 training people to think
00:36:41.260 more like computers,
00:36:42.640 which often means
00:36:44.200 just making them
00:36:45.280 dumber
00:36:45.740 in many ways.
00:36:47.480 But we haven't talked
00:36:48.560 a lot about
00:36:49.020 the hardware side
00:36:50.140 of this,
00:36:50.700 the actual
00:36:51.400 installing of chips
00:36:52.960 in my brain,
00:36:53.900 the replacing of organs,
00:36:55.400 the creation
00:36:55.960 of these things.
00:36:57.020 So when it comes
00:36:57.820 to the hardware side
00:36:59.880 of the transhuman question,
00:37:02.440 where are we now?
00:37:04.600 What technologies
00:37:05.360 are the most important
00:37:06.380 for people to understand
00:37:07.540 that have been coming out
00:37:09.100 or the ones that are
00:37:10.100 making the biggest leaps
00:37:11.560 right now?
00:37:12.500 What do you think
00:37:13.180 is having the biggest impact
00:37:14.860 on humans
00:37:15.880 at the moment
00:37:16.580 when it comes
00:37:17.060 to the field
00:37:17.660 of grafting,
00:37:18.880 the technological
00:37:20.240 and the biological
00:37:21.000 together?
00:37:23.700 Just to make a concession
00:37:25.500 to the kind of
00:37:26.080 transhuman mindset,
00:37:27.240 let's do imagine
00:37:28.080 AI as a kind of mind.
00:37:30.200 At the very least,
00:37:30.900 you can say that
00:37:31.580 it's a range
00:37:33.620 of cognitive modules
00:37:34.920 that replicate
00:37:35.960 certain things
00:37:36.800 that the human mind,
00:37:37.580 the human brain
00:37:38.120 and animal minds
00:37:38.860 can do
00:37:39.280 and physical systems
00:37:40.140 can do.
00:37:40.760 So imagine
00:37:41.220 it's a human mind.
00:37:42.420 The really important
00:37:43.700 thing to look at,
00:37:44.540 what is the human interface
00:37:45.700 with these alien minds?
00:37:48.880 And right now,
00:37:49.780 the most predominant
00:37:50.900 is just simply
00:37:51.380 the screen,
00:37:52.000 the smartphone.
00:37:53.000 Smartphone is just
00:37:53.880 kind of an extension
00:37:54.920 of the TV,
00:37:55.720 a two-way television screen,
00:37:57.060 right?
00:37:57.240 Or telescreen,
00:37:57.980 as it were.
00:37:58.960 But you have more
00:38:00.300 and more sophisticated
00:38:01.560 technologies
00:38:02.300 that are being pushed
00:38:03.320 and I guess hopefully
00:38:05.900 right now
00:38:06.800 it's very encouraging.
00:38:08.060 They're oftentimes
00:38:08.500 being rejected by most,
00:38:10.260 but you have things
00:38:10.900 like virtual reality
00:38:11.960 so that you have a system
00:38:13.980 that's fully immersive
00:38:14.920 that is able to capture
00:38:16.740 these kind of digital realms
00:38:18.040 and bring the person
00:38:18.860 into it
00:38:19.440 so that the consciousness
00:38:20.540 of that person,
00:38:21.420 the way the world
00:38:22.580 is now represented
00:38:23.980 in their minds,
00:38:24.680 is totally kind of
00:38:25.420 overtaken
00:38:26.840 by this
00:38:27.440 digital simulation.
00:38:30.080 You also have
00:38:31.020 other interfaces
00:38:32.020 that are really important.
00:38:33.260 So the brain-computer interface,
00:38:35.320 most people think
00:38:36.060 of a brain-computer interface
00:38:37.200 as Neuralink
00:38:38.900 and Neuralink definitely
00:38:40.800 now I can confidently say
00:38:42.940 represents a real advance
00:38:45.120 in the implanted
00:38:46.080 brain-computer interface.
00:38:47.260 You already had
00:38:48.520 multiple companies,
00:38:49.600 the two most prominent
00:38:50.840 being Synchron
00:38:51.940 and BlackRock Neurotech,
00:38:54.420 Synchron being invested
00:38:55.540 in by Bezos and Gates,
00:38:58.180 BlackRock Neurotech
00:38:59.220 being largely
00:39:00.400 at least initially
00:39:01.280 boosted by Peter Thiel.
00:39:03.160 Those systems
00:39:03.840 came before Neuralink
00:39:05.020 at least insofar
00:39:05.940 as putting them
00:39:06.380 in human brains,
00:39:07.320 but they're still
00:39:08.140 very primitive.
00:39:09.840 They, you know,
00:39:10.420 there's not really
00:39:11.160 any meaningful input.
00:39:12.500 It's mostly just output.
00:39:14.760 It's mostly just being able
00:39:15.940 to translate
00:39:16.800 brain patterns
00:39:18.220 into text
00:39:19.260 or brain patterns
00:39:20.760 into movement
00:39:21.480 of cursors
00:39:22.120 or as we saw
00:39:22.900 with Neuralink
00:39:23.540 with Mario Kart
00:39:24.880 characters,
00:39:26.120 which was pretty wild.
00:39:28.300 And that is a kind
00:39:29.460 of exotic interface
00:39:31.060 that's becoming
00:39:32.020 much more normalized
00:39:34.340 in human consciousness.
00:39:35.940 It's a very extreme example.
00:39:37.520 And of course,
00:39:37.920 Musk in his salesmanship
00:39:40.480 or whatever you want
00:39:41.200 to call it
00:39:41.720 is mythos creation.
00:39:43.960 In his recent conversation,
00:39:45.500 for instance,
00:39:45.920 with Bibi Netanyahu,
00:39:48.160 Musk said that,
00:39:49.120 you know,
00:39:49.420 probably the best future
00:39:50.840 would be one
00:39:51.320 in which hundreds
00:39:51.840 of millions
00:39:52.360 or billions of people
00:39:53.760 were linked
00:39:54.620 in a kind of hive mind
00:39:55.740 through these implanted
00:39:56.780 digital interfaces
00:39:57.720 to create a viable competition
00:39:59.760 against these inevitable
00:40:01.300 super intelligences
00:40:02.380 coming out of OpenAI,
00:40:04.360 maybe coming out of xAI,
00:40:05.740 coming out of China,
00:40:06.960 Israel,
00:40:07.300 so on and so forth.
00:40:08.340 So that,
00:40:09.800 I think is important
00:40:10.960 to look at,
00:40:11.600 and I do look at it,
00:40:12.400 I write about it in the book,
00:40:13.480 you know,
00:40:13.740 I cover this on The War Room
00:40:14.700 all the time,
00:40:15.380 but I do think
00:40:16.580 that the less invasive,
00:40:18.160 the more banal interfaces
00:40:20.880 are the most immediate concerns.
00:40:23.940 So that,
00:40:24.680 you know,
00:40:25.380 the non-invasive
00:40:26.820 wearable brain-computer interfaces
00:40:28.900 are pretty good.
00:40:30.240 They're not,
00:40:30.560 you're not going to play
00:40:31.140 Mario Kart with your head,
00:40:32.440 but you can,
00:40:33.520 in a weak way,
00:40:34.820 pilot a drone,
00:40:35.660 and there are various
00:40:36.180 feedback loops
00:40:37.100 that can be utilized
00:40:38.380 with these non-invasive
00:40:39.660 brain-computer interfaces
00:40:40.800 that are being worked on
00:40:42.160 right now
00:40:42.500 for all kinds of reasons,
00:40:43.660 either for the human
00:40:44.840 to control a robotic system
00:40:46.940 or for some sort of system
00:40:49.600 to at least monitor
00:40:51.900 and through other mechanisms
00:40:53.800 control human behavior.
00:40:56.140 And I think that,
00:40:56.940 you know,
00:40:57.120 you look at the work
00:40:59.420 of Nita Farahany,
00:41:00.840 and she is a World Economic Forum darling,
00:41:04.860 she's kind of an alien mind,
00:41:06.900 but her work,
00:41:07.660 The Battle for Your Brain,
00:41:09.400 her book,
00:41:10.100 The Battle for Your Brain,
00:41:11.080 I think it really does
00:41:12.080 give us a good indication
00:41:13.140 of where all of this is going.
00:41:15.300 So you're talking about students
00:41:16.620 with attention sensors,
00:41:18.000 you're talking about workers
00:41:19.140 with attention sensors
00:41:20.660 and fatigue sensors,
00:41:22.200 you're talking about
00:41:23.180 more and more
00:41:23.920 in psychotherapy,
00:41:25.560 the use of brain-computer interfaces
00:41:27.260 to detect
00:41:28.320 what the brain patterns are,
00:41:29.600 and then recommend therapies,
00:41:31.360 and then the therapies themselves
00:41:32.900 through non-invasive means
00:41:34.780 of transcranial magnetic stimulation
00:41:37.600 and ultrasound and others,
00:41:39.940 these systems being able
00:41:41.100 to actually input to the brain,
00:41:42.820 not words necessarily,
00:41:44.260 not images,
00:41:45.340 but they can certainly alter mood,
00:41:47.040 they can certainly alter
00:41:48.240 cognitive states
00:41:49.540 and memory capacity,
00:41:51.220 things like that.
00:41:52.000 So those are the things
00:41:53.100 that to me
00:41:53.700 are the most important
00:41:54.980 in the very immediate future.
00:41:56.400 What are we going to do
00:41:57.300 in a world
00:41:57.860 in which your boss
00:41:59.380 insists on certain sorts
00:42:01.900 of seemingly harmless
00:42:03.700 either interfaces
00:42:05.340 or identity systems
00:42:06.820 or AI bosses
00:42:08.300 or whatever,
00:42:09.440 AI supervisors?
00:42:11.640 What are we going to do
00:42:12.340 in a world like that
00:42:13.140 when if you,
00:42:14.580 like myself,
00:42:15.500 conceive of this
00:42:16.200 as this nightmarish rise
00:42:17.680 of a kind of
00:42:18.540 what Rene Guenon would call
00:42:20.140 an automated antichrist,
00:42:23.140 how do you respond to that?
00:42:25.180 How do you maintain
00:42:25.940 social status?
00:42:27.780 As long as the implanted
00:42:29.480 brain-computer interface
00:42:30.740 is the goal on the horizon,
00:42:33.820 all of these incremental steps
00:42:35.440 become that much less shocking,
00:42:37.760 that much less disgusting,
00:42:39.440 and that much more acceptable.
00:42:41.200 And I think that you're already seeing
00:42:42.960 certain elements
00:42:43.800 coming out right now.
00:42:45.280 People may have seen
00:42:46.320 the Apple patent
00:42:47.700 for the ear pods
00:42:49.740 that are able to use
00:42:50.980 various indirect signals
00:42:52.800 from the vagus nerve
00:42:54.300 on into other brain signals
00:42:55.840 to create
00:42:56.980 a kind of brain sensor
00:42:58.220 for your,
00:42:59.760 whatever,
00:43:00.280 your new iPod.
00:43:02.140 And that's going to be
00:43:03.520 the most immediate.
00:43:04.540 And then, of course,
00:43:05.300 the rise of robotics,
00:43:06.260 but that's a whole other story.
00:43:08.540 Yeah, that's interesting.
00:43:09.440 I hadn't really thought
00:43:10.020 about it that way,
00:43:10.640 but the idea
00:43:11.220 that you have
00:43:12.100 this really severe
00:43:13.640 kind of worst-case scenario
00:43:16.520 hanging out there
00:43:17.900 about, you know,
00:43:18.840 the computer
00:43:19.980 being completely,
00:43:20.780 you know,
00:43:21.320 grafted into the brain
00:43:22.500 means that we're more willing
00:43:23.940 to accept every intermediate step
00:43:26.000 as somehow less intimidating,
00:43:28.160 less of a surrender of agency,
00:43:30.100 less of a worry
00:43:31.400 because it's not as bad.
00:43:33.000 It's almost, you know,
00:43:33.800 I feel the same way
00:43:34.640 about 1984
00:43:35.500 as the story
00:43:36.820 of totalitarianism
00:43:37.960 for the West
00:43:39.160 because we don't see
00:43:40.900 the jackboots
00:43:41.860 and there's no one
00:43:42.820 who's, you know,
00:43:43.740 coming to behead us
00:43:44.820 if we don't pledge
00:43:45.900 our allegiance
00:43:47.740 to dear leader.
00:43:48.760 We tend to not notice
00:43:49.880 all of the softer steps
00:43:51.800 along the way
00:43:52.360 that have already robbed us
00:43:53.460 of so much
00:43:54.040 of our freedom.
00:43:55.840 So, I guess that's
00:43:57.780 my next question for you
00:43:59.280 would be about
00:44:00.860 kind of biohacking.
00:44:03.280 Obviously,
00:44:04.740 the advancements medically
00:44:05.900 are ones that people
00:44:07.300 can be very concerned about,
00:44:09.240 many ethical questions
00:44:10.540 about, you know,
00:44:11.100 the programming of children,
00:44:13.060 these kind of things,
00:44:14.200 but at the same time,
00:44:15.360 you know,
00:44:15.540 in the same way
00:44:16.440 that you were just explaining
00:44:17.820 that process
00:44:18.380 of having that radical thing
00:44:19.880 out there as being
00:44:21.400 a way to control
00:44:23.120 or normalize
00:44:24.800 kind of the travel
00:44:26.580 along the way.
00:44:27.560 It feels like the same thing
00:44:28.800 might be said
00:44:29.960 with genetic modification
00:44:31.300 because, of course,
00:44:32.500 some of the most radical changes
00:44:34.220 we've already had
00:44:35.140 from technology
00:44:36.100 when it comes
00:44:36.980 to the human experience
00:44:37.880 has already been
00:44:38.820 something like
00:44:39.220 the birth control,
00:44:40.040 which has fundamentally,
00:44:40.980 you know,
00:44:41.960 altered the way
00:44:42.660 it means to be human,
00:44:43.740 to form families,
00:44:44.660 to have romantic relationships,
00:44:46.580 and everything
00:44:47.600 that's connected to that.
00:44:49.880 And while we might think
00:44:51.360 that, you know,
00:44:52.100 the people warned
00:44:53.120 about the eugenics
00:44:54.120 of creating
00:44:54.900 the perfect child
00:44:56.000 artificially,
00:44:57.860 we're already in a state
00:44:58.880 where people have normalized
00:45:00.200 the physical mutilation
00:45:01.940 of genitalia
00:45:02.780 in order to shape
00:45:04.040 the child
00:45:04.620 to your ideal
00:45:06.160 about what a human
00:45:07.380 should be
00:45:07.880 or the way
00:45:08.300 that they should be able
00:45:08.920 to choose identities.
00:45:11.260 I guess the question is
00:45:12.660 how close are we
00:45:13.880 to the, you know,
00:45:15.660 I guess the eugenic
00:45:17.600 or the pre-determined,
00:45:20.280 the pre-ordered baby,
00:45:22.520 the perfect baby
00:45:23.520 in that way,
00:45:24.220 and is that really itself
00:45:26.080 like you were just talking about
00:45:27.340 more of a kind of
00:45:28.780 a far-off thing
00:45:29.620 to scare us
00:45:30.280 while we don't notice
00:45:31.140 all of the changes
00:45:32.360 being made along the way
00:45:33.460 to our current
00:45:33.960 human experience?
00:45:35.880 You know,
00:45:36.580 I could go on
00:45:38.040 about this forever.
00:45:38.880 I'll try to limit it
00:45:39.660 to just a couple
00:45:40.760 of different things.
00:45:42.580 A lot of people
00:45:43.140 bring up
00:45:43.720 the transgender-transhuman
00:45:45.580 connection, right?
00:45:46.760 Trans, beyond, right?
00:45:48.040 Beyond human,
00:45:48.840 beyond gender,
00:45:49.940 using science
00:45:51.220 and technology
00:45:51.800 to transform
00:45:52.480 the human body
00:45:53.440 to taste.
00:45:55.620 There is an obvious
00:45:56.580 connection there
00:45:57.180 and it's made explicit
00:45:57.920 by a number
00:45:58.420 of different people.
00:45:59.560 The most popular,
00:46:00.560 of course,
00:46:00.820 Martine Rothblatt,
00:46:02.180 creator of SiriusXM,
00:46:03.760 now on the board
00:46:04.440 of Mayo Clinic,
00:46:06.340 wrote the now
00:46:08.140 quite famous book,
00:46:09.320 I would say,
00:46:09.720 at least in my circles,
00:46:10.500 from transgender
00:46:12.000 to transhuman.
00:46:13.300 The idea is that
00:46:14.400 if gender
00:46:14.960 is this ephemeral thing
00:46:16.860 riding on the
00:46:17.540 hard sexual substrate,
00:46:20.220 in the same way
00:46:21.280 that gender
00:46:21.940 that doesn't match
00:46:22.920 the sex
00:46:23.800 can be,
00:46:24.780 can apply downward
00:46:25.740 pressure through
00:46:26.540 science and technology
00:46:27.640 and very crude
00:46:28.760 and barbaric surgeries
00:46:30.260 to alter the sex,
00:46:33.520 in the same way
00:46:34.760 cognition
00:46:35.920 rides,
00:46:37.840 is independent
00:46:38.800 of the brain
00:46:39.620 and can be transferred
00:46:41.080 then into
00:46:41.980 a kind of
00:46:42.400 silicon form,
00:46:44.520 a digital form.
00:46:46.580 And so there is
00:46:47.380 that bridge there.
00:46:48.580 You know,
00:46:48.960 Rothblatt is a
00:46:49.920 true believer
00:46:50.600 that these
00:46:52.400 transgender surgeries
00:46:53.640 are kind of
00:46:54.520 an indicator
00:46:56.900 of where things
00:46:57.900 are going
00:46:58.380 and a bridge
00:46:59.000 to the next realm.
00:47:00.220 But I think
00:47:00.960 that it would be
00:47:01.380 a real mistake
00:47:02.200 to say that
00:47:03.680 that's all
00:47:04.360 transgenderism
00:47:05.700 is this on-ramp
00:47:06.540 to transhumanism.
00:47:07.420 You see,
00:47:08.500 I think,
00:47:09.400 much more prevalently
00:47:10.520 heteronormative
00:47:12.020 and gender-normative
00:47:13.360 means of biohacking
00:47:15.260 that are not necessarily,
00:47:17.420 they don't create
00:47:18.320 the same revulsion
00:47:19.380 as transgenderism,
00:47:21.220 but they basically
00:47:22.100 are the same thing,
00:47:23.400 just slightly
00:47:24.180 less aggressive.
00:47:25.360 Plastic surgery,
00:47:26.700 hormone replacement therapy,
00:47:28.400 getting juice
00:47:29.060 to get big,
00:47:30.300 you know,
00:47:30.640 chicks getting,
00:47:31.380 you know,
00:47:31.620 butt implants,
00:47:32.420 all these sorts of things.
00:47:33.880 Those are also,
00:47:35.260 and I think,
00:47:35.580 and again,
00:47:36.020 much more prevalent
00:47:36.800 indicators of kind of
00:47:38.400 where this could go
00:47:39.420 if the technology
00:47:40.160 becomes more sophisticated.
00:47:42.200 A really good example too
00:47:43.480 would be Peter Thiel's
00:47:44.880 Enhanced Olympics.
00:47:46.500 You know,
00:47:47.020 I don't know
00:47:47.420 how many of those people
00:47:48.360 are gender
00:47:49.240 and heteronormative.
00:47:50.260 I would imagine
00:47:50.860 most are,
00:47:51.560 but it's a way,
00:47:52.300 especially for men,
00:47:53.180 to get stronger
00:47:54.140 and, you know,
00:47:55.400 kind of cheat
00:47:56.320 and get out there
00:47:57.480 and see what limits
00:47:58.860 can be pushed
00:48:00.460 and explored.
00:48:02.100 With the more,
00:48:04.380 I guess,
00:48:06.980 germline
00:48:08.100 transformations
00:48:09.880 of humanity,
00:48:10.660 right?
00:48:10.940 Altering the
00:48:11.920 genetic frequencies
00:48:13.340 of the human race
00:48:14.180 to improve the genome.
00:48:16.000 You have people
00:48:17.220 like Bryan Johnson
00:48:18.900 who is,
00:48:20.080 he's undergoing
00:48:21.160 genetic
00:48:22.420 or gene therapies
00:48:23.480 to enhance
00:48:25.020 certain aspects
00:48:26.100 of his health,
00:48:26.780 but most of what
00:48:27.700 he's doing
00:48:28.300 is he's basically
00:48:30.020 a kind of transhuman,
00:48:32.320 he's a living
00:48:32.880 transhuman religious icon.
00:48:35.820 His whole
00:48:36.480 project blueprint
00:48:37.940 is a kind of way,
00:48:39.760 an extreme example
00:48:40.740 of what every human
00:48:41.640 can do
00:48:42.280 and, in his opinion,
00:48:43.200 should do.
00:48:44.040 Use algorithms
00:48:45.160 to monitor
00:48:45.820 every aspect
00:48:46.600 of the body.
00:48:47.700 Use the vast
00:48:48.640 resources
00:48:49.520 of digital intelligence
00:48:50.580 to recommend
00:48:51.500 the protocols
00:48:52.560 for enhancing
00:48:53.820 and maintaining
00:48:54.580 the body
00:48:55.340 and then use
00:48:56.460 aggressive methods
00:48:57.660 to alter the self,
00:48:59.720 to alter the physical self.
00:49:00.980 His is more
00:49:01.980 of a kind of ascetic
00:49:03.000 version
00:49:05.400 of this transhuman
00:49:06.280 religion, right?
00:49:06.840 It's highly disciplined.
00:49:08.200 It's not exactly
00:49:09.240 indulgent, right?
00:49:10.260 He has like
00:49:10.940 an ascetics diet
00:49:11.940 and an ascetic sex life,
00:49:14.040 but it is,
00:49:15.300 I think,
00:49:15.520 really important
00:49:16.280 to look at him
00:49:17.020 as a kind
00:49:18.080 of religious icon.
00:49:19.200 It's something
00:49:19.520 to be emulated.
00:49:20.820 It's something
00:49:21.360 through which
00:49:22.280 this greater future
00:49:23.240 is channeled, right?
00:49:24.620 It's a symbol.
00:49:25.960 Just real quick
00:49:26.720 on the germline,
00:49:27.800 though,
00:49:28.080 I kind of got
00:49:28.480 veered off.
00:49:29.000 On the germline,
00:49:30.200 you have a number
00:49:31.060 of different procedures
00:49:31.800 right now
00:49:32.360 that are becoming
00:49:32.760 more and more common.
00:49:38.200 One of the most
00:49:39.440 prominent companies
00:49:40.340 is Genomic Prediction.
00:49:41.860 It's investing,
00:49:42.500 you know,
00:49:42.620 Sam Altman
00:49:43.100 seeded the startup.
00:49:45.100 He's invested
00:49:45.700 and pushed
00:49:46.280 Genomic Prediction,
00:49:47.520 and there are a number
00:49:47.960 of other companies
00:49:48.680 and institutions
00:49:49.760 that do this,
00:49:50.300 but basically,
00:49:51.200 you know,
00:49:51.500 they create
00:49:52.600 10 to maybe 15
00:49:54.880 zygotes,
00:49:55.700 and then they
00:49:57.260 test them,
00:49:58.200 you know,
00:49:58.360 they scrape them,
00:49:59.140 freeze them,
00:49:59.740 test them,
00:50:00.540 choose the genetically
00:50:01.520 superior zygote
00:50:02.660 or at least the one
00:50:03.220 that doesn't have
00:50:03.760 any obvious deficiencies,
00:50:06.100 deformities,
00:50:07.000 and then they
00:50:08.260 then take the winner
00:50:10.140 and implant it
00:50:11.500 in the womb
00:50:11.980 and the rest go off
00:50:12.920 to the cherub ward
00:50:13.860 and from there,
00:50:16.520 conceivably,
00:50:17.240 over the course
00:50:17.700 of generations,
00:50:18.320 you'd be able
00:50:18.840 to weed out
00:50:19.400 various genetic
00:50:20.760 diseases
00:50:22.200 and other disorders
00:50:23.500 and also to enhance.
00:50:26.260 So they have
00:50:27.380 certain markers
00:50:28.200 that they at least
00:50:28.820 believe indicate IQ.
00:50:31.480 Genomic Prediction
00:50:32.140 will tell you
00:50:32.720 if the zygote
00:50:33.780 is in, say,
00:50:34.540 the lowest percentile
00:50:35.840 of intelligence
00:50:36.460 or height.
00:50:37.360 They can tell you
00:50:38.540 at least what they
00:50:39.180 believe to be
00:50:39.880 the highest percentile
00:50:41.600 of intelligence
00:50:42.300 or height.
00:50:42.920 They say they won't
00:50:43.740 do it,
00:50:44.720 or that they're
00:50:45.320 not doing it yet.
00:50:46.720 You know,
00:50:47.220 the future
00:50:48.240 will determine that.
00:50:49.940 So this eugenic
00:50:51.100 process is already
00:50:52.240 happening,
00:50:52.720 and the vision,
00:50:54.440 the dream,
00:50:55.100 right,
00:50:55.340 however much
00:50:55.840 that is manifested
00:50:56.640 in reality,
00:50:57.600 the dream
00:50:58.200 of the most extreme
00:50:59.060 kind of transhuman
00:50:59.820 minds is that
00:51:01.400 you would be able
00:51:01.960 to do that
00:51:02.400 with tremendous
00:51:03.280 exactitude
00:51:04.160 and to also
00:51:06.080 alter the genome
00:51:07.220 in certain places
00:51:08.460 to increase
00:51:09.640 abilities that
00:51:10.700 simply weren't
00:51:11.300 there in the
00:51:11.940 bloodline to begin
00:51:12.700 with,
00:51:13.200 and then,
00:51:13.800 you know,
00:51:14.280 pack that bad
00:51:14.860 boy up into
00:51:15.520 a birthing pod,
00:51:17.140 a baby pod,
00:51:18.100 right,
00:51:18.520 an artificial womb,
00:51:20.140 and perhaps
00:51:20.840 like in that
00:51:21.600 ecto-life
00:51:22.600 concept video,
00:51:24.540 you would just
00:51:24.840 have like stadiums
00:51:25.820 of these like
00:51:26.320 pod babies,
00:51:27.280 as Mary Harrington
00:51:27.920 calls them,
00:51:28.980 and they're all
00:51:30.060 just,
00:51:30.420 you know,
00:51:30.740 they'll emerge
00:51:32.000 like the demons
00:51:33.420 from the Mahabharata
00:51:34.660 and humanity
00:51:36.500 will be altered.
00:51:37.100 I don't know,
00:51:37.820 but I do know
00:51:38.380 that again,
00:51:39.820 that the lack
00:51:41.800 of certitude,
00:51:42.580 that ambiguity,
00:51:43.780 you know,
00:51:44.520 as a parent,
00:51:45.040 that there is a certain,
00:51:46.040 there's a real anxiety
00:51:47.040 about what is going
00:51:48.840 to happen.
00:51:49.680 You don't,
00:51:50.240 traditionally,
00:51:50.860 you just don't know.
00:51:51.540 It's a leap of faith.
00:51:52.960 Have I chosen
00:51:53.460 the right partner?
00:51:54.600 Is this,
00:51:55.640 you know,
00:51:56.000 is our child
00:51:56.840 going to be healthy
00:51:57.580 and intelligent?
00:51:58.200 All these things.
00:51:59.340 What that does
00:52:00.360 is at least
00:52:01.120 conceptually eliminates
00:52:03.100 a lot of that
00:52:04.040 anxiety,
00:52:05.100 that ambiguity,
00:52:06.360 and perhaps even
00:52:07.440 entices by way
00:52:08.500 of enhancement,
00:52:09.080 your child will be
00:52:09.840 superior to the
00:52:10.540 other children.
00:52:11.720 And so,
00:52:12.360 you know,
00:52:12.940 again,
00:52:13.380 I'm not going
00:52:13.920 to make like
00:52:14.340 a definite concrete
00:52:15.360 prediction as to how
00:52:16.560 prevalent these procedures
00:52:17.840 are going to be
00:52:18.360 in the future,
00:52:19.300 but other than to say
00:52:20.280 more,
00:52:21.180 much more than they
00:52:22.040 are now.
00:52:22.920 Because,
00:52:23.400 you know,
00:52:23.680 right now,
00:52:24.220 surveys show that
00:52:25.060 around a third
00:52:26.380 of the American
00:52:27.100 populace is very,
00:52:28.260 very comfortable
00:52:28.860 with all this.
00:52:30.100 And I imagine that
00:52:30.880 if these children
00:52:32.020 do grow up
00:52:32.880 to be in any way,
00:52:34.460 if there's any reason
00:52:35.380 or any validity
00:52:36.180 to the claim
00:52:37.280 they're super babies,
00:52:38.940 super humans,
00:52:40.060 that it will become
00:52:40.760 much more acceptable.
00:52:41.960 It'll just be,
00:52:42.560 it'll be,
00:52:43.100 it'll be much like
00:52:44.040 a brave new world.
00:52:45.220 You know,
00:52:47.240 it's interesting
00:52:48.020 right now
00:52:48.840 we have this effect
00:52:50.080 where the rich
00:52:51.060 and the poor
00:52:51.600 are having children
00:52:52.540 and the middle
00:52:53.160 aren't,
00:52:53.940 because the ascent
00:52:55.420 to the rich
00:52:56.160 is defined
00:52:57.100 by the striving
00:52:58.020 of basically
00:52:58.680 stripping away
00:52:59.560 your posterity
00:53:02.680 in the attempt
00:53:03.280 to climb now.
00:53:04.580 And so,
00:53:05.340 really the poor
00:53:05.980 who don't care
00:53:06.840 about their future
00:53:07.800 or are less
00:53:09.400 interested in managing
00:53:10.700 the rhythms of life
00:53:11.760 are having children
00:53:12.600 and the rich
00:53:13.200 who can afford
00:53:14.280 to manage all
00:53:15.000 these things
00:53:15.600 and invest heavily
00:53:16.280 in their children
00:53:16.760 are having children,
00:53:18.300 but it's the middle
00:53:19.000 who are obsessed
00:53:19.880 with pinching every penny
00:53:21.400 and maximizing
00:53:22.520 every opportunity
00:53:23.300 are investing in,
00:53:24.540 you know,
00:53:25.080 zero or one
00:53:26.100 or two children
00:53:27.100 at max.
00:53:28.520 And you can imagine
00:53:29.320 that really this,
00:53:31.160 you know,
00:53:31.400 having this necessity
00:53:32.860 of artificial children
00:53:34.260 being pushed onto you
00:53:35.980 as a status symbol,
00:53:37.100 a marker
00:53:37.500 of your competitiveness
00:53:40.180 in society
00:53:40.920 would only exacerbate
00:53:42.460 that effect.
00:53:43.320 We'd create a scenario
00:53:44.500 where only the rich
00:53:45.900 are investing
00:53:46.420 in very few progeny
00:53:47.700 and only the poor
00:53:48.720 are having
00:53:49.200 a large amount normally
00:53:50.640 and it really is
00:53:51.400 the middle that are
00:53:52.300 once again frozen
00:53:53.100 out of that effect.
00:53:54.560 But I want to ask you
00:53:55.520 because we're running
00:53:56.080 up on time here
00:53:56.940 and we do have
00:53:57.620 a number of questions
00:53:58.380 so I want to get
00:53:58.900 to those as well.
00:54:00.140 Last thing,
00:54:00.760 you mentioned this
00:54:01.240 a little bit earlier
00:54:02.720 but if you,
00:54:04.160 you know,
00:54:04.300 this is really
00:54:04.820 the only place
00:54:05.380 where you can tie this up
00:54:06.600 so certainly
00:54:07.540 want to ask this question
00:54:08.460 for sure.
00:54:09.000 Is this inevitable?
00:54:11.620 Are we doomed
00:54:13.160 to head along this track?
00:54:14.520 Do humanity
00:54:15.220 for all of its worries
00:54:16.500 about the effects
00:54:17.260 of technology
00:54:18.020 and dehumanization
00:54:19.100 and these things,
00:54:20.360 do we have
00:54:21.420 the self-discipline
00:54:22.280 necessary to put
00:54:23.080 the brakes on this train?
00:54:24.380 Is that even possible
00:54:25.300 at this point?
00:54:26.640 And if not
00:54:27.300 at the societal level,
00:54:28.480 what is it,
00:54:29.040 what is then left
00:54:29.860 to the individual
00:54:30.640 because,
00:54:31.940 you know,
00:54:32.180 do I have to plug
00:54:33.900 into every one
00:54:34.860 of these technologies
00:54:35.560 to stay alive?
00:54:36.320 Do I just buy
00:54:36.800 into the Luddite idea
00:54:38.360 and just smash it all
00:54:39.480 and hope that ultimately
00:54:40.820 I'm on the right side
00:54:41.920 of history
00:54:43.120 if that's even a thing?
00:54:44.520 Or is there a way
00:54:45.680 to, you know,
00:54:47.800 embrace certain aspects
00:54:49.100 of this tactically,
00:54:50.700 defend my humanity
00:54:51.900 and my children
00:54:52.640 from different aspects
00:54:53.700 of this
00:54:54.260 but still be able
00:54:55.500 to plug into
00:54:56.520 the most essential parts
00:54:57.600 of the advancements
00:54:58.440 so I can stay competitive
00:55:00.420 in a world
00:55:00.900 that's increasingly
00:55:01.720 transhuman?
00:55:04.540 You know,
00:55:05.360 I have a hard time
00:55:06.260 conceiving of this
00:55:07.300 like humanity
00:55:08.180 right?
00:55:08.700 This we.
00:55:10.080 We're not a we now.
00:55:11.820 There's a lot of us's
00:55:12.820 and them's
00:55:13.560 and so I think it will,
00:55:15.360 you know,
00:55:15.540 the adoption
00:55:16.320 of these technologies
00:55:17.680 both just because
00:55:18.760 of location
00:55:19.320 but also,
00:55:19.960 you know,
00:55:20.220 kind of predisposition
00:55:21.300 within cultures,
00:55:22.760 the adoption
00:55:23.340 is going to be
00:55:23.940 very, very uneven
00:55:24.920 across just
00:55:26.760 the American society
00:55:28.080 across the planet
00:55:28.940 and a lot of that
00:55:31.200 will be dictated
00:55:32.100 by the desire
00:55:33.340 for advantage
00:55:34.280 but also
00:55:35.320 these barriers
00:55:36.340 between the sacred
00:55:37.300 and profane.
00:55:39.280 You know,
00:55:39.600 I think a lot
00:55:40.780 of the opposition
00:55:41.580 both from
00:55:42.660 hardcore evolutionary
00:55:43.840 naturalists
00:55:44.720 but also
00:55:45.740 from traditionalists
00:55:47.060 especially Christian
00:55:48.000 traditionalists
00:55:48.860 there is a sense
00:55:50.620 that this is
00:55:51.300 profoundly profane
00:55:52.760 that it's blasphemous
00:55:54.180 and I think
00:55:55.620 that that will dictate
00:55:56.520 to some extent
00:55:57.360 it'll be a kind of
00:55:58.720 osmotic barrier,
00:56:00.660 right,
00:56:00.920 to keep certain
00:56:02.640 technologies
00:56:03.240 from coming in
00:56:04.260 and being adopted
00:56:04.960 but on the other
00:56:06.080 side of that
00:56:06.640 you have
00:56:07.160 really an elite
00:56:09.240 that is by and large
00:56:10.900 kind of demolished
00:56:12.040 those barriers
00:56:12.760 of sacred and profane
00:56:13.820 they consider them
00:56:14.480 to be provincial
00:56:15.460 and the rise
00:56:17.780 of what
00:56:18.240 Ardian Tola
00:56:19.200 dubs
00:56:20.420 the cyborg theocracy
00:56:21.860 which is very,
00:56:23.160 very broad
00:56:23.740 it's just simply
00:56:24.520 this,
00:56:25.460 you know,
00:56:25.660 he has not even
00:56:26.580 defined it
00:56:27.180 in any meaningful way
00:56:28.100 that I'm aware of
00:56:29.040 but you can kind of
00:56:30.020 sense it
00:56:30.520 that this is
00:56:31.000 this new way
00:56:32.380 of seeing
00:56:32.860 and this new
00:56:33.360 source of authority
00:56:34.220 and that those
00:56:35.220 who will wield
00:56:35.860 be the priesthood
00:56:36.600 and the authority
00:56:37.180 will be those
00:56:38.140 who have adopted
00:56:39.040 the most,
00:56:39.600 who have embraced
00:56:40.060 it the most
00:56:40.780 and to me
00:56:42.240 it doesn't necessarily
00:56:43.340 have to be
00:56:44.200 secular people
00:56:45.420 you see in India
00:56:46.720 a tremendous embrace
00:56:48.060 of these technologies
00:56:48.900 Israel the same
00:56:50.620 and among many
00:56:51.620 Christians in America
00:56:52.580 the same
00:56:53.440 you see
00:56:54.460 this notion
00:56:55.600 that these technologies
00:56:56.840 can be grafted
00:56:57.680 onto certain
00:56:58.880 traditional elements
00:57:00.000 and you see
00:57:00.840 the uses
00:57:01.280 of these technologies
00:57:02.100 promoted for everything
00:57:03.000 from geopolitical advantage
00:57:04.560 to border control
00:57:06.000 to crime prevention
00:57:08.080 all of these things
00:57:09.860 that are kind of
00:57:10.220 traditional
00:57:10.700 conservative
00:57:11.700 I guess you would say
00:57:12.740 values
00:57:13.840 latching onto these technologies
00:57:15.800 to push them forward
00:57:16.640 or even just bigger
00:57:17.800 you know
00:57:18.080 better
00:57:18.300 more beautiful babies
00:57:19.400 right
00:57:19.880 so what I see
00:57:21.300 going forward
00:57:21.960 is you're just going
00:57:22.980 to have a very
00:57:23.600 uneven landscape
00:57:24.440 both from predisposition
00:57:26.180 and also access
00:57:27.560 to the technology
00:57:28.620 as far as
00:57:29.620 the inevitability
00:57:30.340 of it
00:57:30.800 I don't think
00:57:31.640 anything is inevitable
00:57:32.700 but I think
00:57:33.400 that some things
00:57:34.240 are so likely
00:57:35.280 that the forces
00:57:35.920 have aligned
00:57:36.560 to such a degree
00:57:37.700 that it might as well
00:57:38.780 be inevitable
00:57:39.480 and I think
00:57:40.500 that the development
00:57:41.220 of these technologies
00:57:42.600 again,
00:57:43.840 even if it just stopped right now,
00:57:44.680 it's a matter of
00:57:45.180 diffusion,
00:57:46.260 it's a matter of adoption,
00:57:47.380 but
00:57:47.380 the advancement
00:57:48.600 of these technologies
00:57:49.360 I think is very
00:57:49.980 very very likely
00:57:50.780 and I think that
00:57:52.100 especially given
00:57:53.000 the incentive
00:57:53.820 of competing
00:57:54.600 against people
00:57:55.400 within your own society
00:57:56.900 within your own company
00:57:57.980 within your own government
00:57:59.320 and especially
00:58:00.340 between companies
00:58:01.740 between governments
00:58:03.240 that the incentive
00:58:04.480 is so strong
00:58:05.500 that it's just
00:58:06.600 it's just short
00:58:07.840 of inevitable
00:58:08.900 and so
00:58:10.020 you know
00:58:10.420 as far as
00:58:11.040 what
00:58:12.060 what do you do
00:58:12.880 about it
00:58:13.460 you know
00:58:14.200 I've never
00:58:16.660 the last
00:58:18.240 the appendix
00:58:19.500 of my book
00:58:19.980 is 55 ways
00:58:21.260 I'm sorry
00:58:21.960 my 55 point plan
00:58:23.340 to stay human
00:58:24.200 which of course
00:58:25.440 is a joke
00:58:26.300 because it's just me
00:58:27.440 basically
00:58:28.000 eugenicizing
00:58:29.220 a cultural genome
00:58:30.580 you know
00:58:31.500 people have asked me
00:58:32.480 a million times
00:58:33.400 what do we do
00:58:34.660 well this is what
00:58:35.420 I say to do
00:58:36.300 and I'm quite an extremist
00:58:38.480 I'm very very averse
00:58:39.660 to this
00:58:40.040 and if you're asking me
00:58:40.960 what to do
00:58:41.980 view this
00:58:42.860 as the rise
00:58:43.500 of an automated
00:58:44.240 antichrist
00:58:45.040 and do everything
00:58:45.660 possible
00:58:46.220 to keep its tendrils
00:58:47.680 out of your mind
00:58:48.600 and out of your
00:58:49.260 children's minds
00:58:50.140 do everything
00:58:51.520 short of becoming
00:58:52.640 Amish
00:58:53.200 but of course
00:58:54.240 that's not necessarily
00:58:55.040 realistic
00:58:55.720 especially for people
00:58:56.720 who are highly placed
00:58:57.760 and so
00:58:58.620 it's just going to be
00:58:59.360 a matter
00:58:59.720 from person to person
00:59:01.080 how much
00:59:01.940 are you willing
00:59:02.460 to compromise
00:59:03.060 how are you going
00:59:04.140 to negotiate
00:59:04.660 this relationship
00:59:05.520 between yourself
00:59:06.940 and this machine
00:59:08.140 a machine that is
00:59:09.020 driven by human beings
00:59:10.340 at least for now
00:59:11.140 a machine that is
00:59:12.520 driven by human beings
00:59:13.560 who most likely
00:59:14.620 see you as a cog
00:59:15.840 in their machine
00:59:16.600 at best
00:59:17.280 or something
00:59:18.360 to be rid of
00:59:19.460 at worst
00:59:20.120 and I
00:59:21.260 that's just
00:59:22.160 to me
00:59:22.980 the conception
00:59:23.620 I have
00:59:24.780 and so
00:59:25.200 as an extremist
00:59:26.100 I recommend
00:59:26.860 doing everything
00:59:28.140 possible
00:59:28.500 maybe you have
00:59:29.140 to install
00:59:29.940 drone systems
00:59:31.280 around your
00:59:31.980 bizarre communes
00:59:33.620 to fend off
00:59:34.880 any invaders
00:59:35.760 maybe you need
00:59:36.800 you know
00:59:37.060 EMP cannons
00:59:37.980 you know
00:59:38.320 I don't know
00:59:38.840 or maybe
00:59:39.440 you just simply
00:59:40.460 need to have
00:59:41.040 the self-discipline
00:59:41.760 not to look at
00:59:42.300 your freaking
00:59:42.960 smartphone
00:59:43.500 every 10 seconds
00:59:44.620 but I do think
00:59:46.640 that it's going
00:59:47.980 to be
00:59:48.300 it's a critical
00:59:49.360 question
00:59:49.840 it's been a critical
00:59:50.600 question
00:59:51.000 it's like
00:59:51.320 too late
00:59:51.900 to stop
00:59:52.840 and it's
00:59:53.800 too late
00:59:54.240 to develop
00:59:54.720 anything
00:59:55.140 that is like
00:59:55.900 sustained
00:59:56.540 in the face
00:59:57.240 of it
00:59:57.580 and so
00:59:58.320 now
00:59:58.760 the critical
00:59:59.420 questions
01:00:00.300 are going
01:00:00.580 to be
01:00:00.860 how do
01:00:01.420 we
01:00:01.680 in our
01:00:01.960 different
01:00:02.520 communities
01:00:03.620 adapt
01:00:05.000 to me
01:00:06.360 resist
01:00:07.060 or for me
01:00:07.980 and for many
01:00:08.820 and many
01:00:09.540 people whom
01:00:10.180 I respect
01:00:10.660 quite a bit
01:00:11.440 how do you
01:00:12.420 take what's
01:00:12.940 best out of
01:00:13.480 that system
01:00:13.980 to preserve
01:00:14.520 what you
01:00:14.980 have
01:00:15.440 and reject
01:00:17.240 the worst
01:00:17.940 but I'll
01:00:18.900 leave that
01:00:19.200 up to the
01:00:19.540 listeners
01:00:19.820 to decide
01:00:20.340 it's going
01:00:20.620 to be
01:00:20.800 very very
01:00:21.300 different
01:00:21.620 from person
01:00:22.120 to person
01:00:22.520 and community
01:00:23.040 to community
01:00:24.140 yeah we didn't
01:00:25.300 even have time
01:00:25.900 to get into
01:00:26.340 the collapse
01:00:26.900 of decision
01:00:27.460 space
01:00:28.000 and how that
01:00:28.860 really impacts
01:00:29.700 all this
01:00:30.500 but yeah
01:00:31.620 ironically
01:00:32.080 for another
01:00:32.740 time I suppose
01:00:33.660 all right
01:00:34.520 so we're
01:00:35.360 going to
01:00:35.680 switch over
01:00:36.180 to the
01:00:36.640 questions
01:00:36.980 of the
01:00:37.220 people
01:00:37.440 before we
01:00:37.960 do
01:00:38.180 Joe
01:00:38.620 how do
01:00:39.640 people
01:00:39.880 get the
01:00:40.260 book
01:00:40.440 what's
01:00:40.660 the best
01:00:40.900 place
01:00:41.120 to find
01:00:41.460 it
01:00:41.580 where
01:00:41.740 should
01:00:42.240 they
01:00:42.380 find
01:00:42.580 the rest
01:00:42.900 of your
01:00:43.160 work
01:00:43.420 you know
01:00:44.580 I would
01:00:44.800 imagine
01:00:45.140 that a lot
01:00:45.580 of your
01:00:45.880 audience
01:00:46.300 is
01:00:46.560 Bitcoin
01:00:47.000 Savvy
01:00:47.540 and I
01:00:48.420 would
01:00:48.560 recommend
01:00:48.940 if you
01:00:49.280 are
01:00:49.560 go to
01:00:50.220 canonic.xyz
01:00:52.460 c-a-n-o-n-i-c
01:00:56.340 dot x-y-z
01:00:57.940 you can get
01:00:58.840 it
01:00:59.000 bitcoin only
01:01:00.040 for quite
01:01:00.940 a discount
01:01:01.500 or if you
01:01:02.380 want to
01:01:02.600 pay with
01:01:02.920 your palm
01:01:03.460 at Amazon
01:01:04.500 you can get
01:01:05.360 it there
01:01:05.680 or I
01:01:06.640 often recommend
01:01:07.940 you know
01:01:08.360 something like
01:01:08.800 bookshop.org
01:01:09.740 or any indie
01:01:10.400 outlet
01:01:10.800 that you can
01:01:11.620 get it
01:01:11.920 at
01:01:12.160 Dark Aeon
01:01:13.740 Transhumanism
01:01:15.060 and the
01:01:15.440 war
01:01:15.700 against
01:01:16.140 humanity
01:01:16.780 fantastic
01:01:18.180 all right
01:01:18.540 guys let's
01:01:18.940 take a look
01:01:19.380 over here
01:01:20.020 at your
01:01:20.560 questions
01:01:21.420 wolfbane
01:01:22.500 says
01:01:23.000 hi
01:01:23.780 or
01:01:23.960 have you
01:01:24.360 heard
01:01:24.680 or
01:01:25.160 have
01:01:26.040 you
01:01:26.360 or
01:01:26.840 Joe
01:01:27.560 heard
01:01:27.960 about
01:01:28.340 the
01:01:28.600 Corbin
01:01:29.040 Society
01:01:29.540 they're a
01:01:30.240 group
01:01:30.360 applying
01:01:30.900 NRX
01:01:31.580 and elite
01:01:32.020 theory
01:01:32.260 thought
01:01:32.580 in the
01:01:33.040 real world
01:01:33.460 and discussing
01:01:33.960 it on
01:01:34.320 YouTube
01:01:34.660 let's
01:01:36.200 see
01:01:36.460 don't know
01:01:37.340 some of
01:01:37.600 those words
01:01:38.120 through
01:01:38.660 Lovecraftian
01:01:39.980 lens
01:01:40.520 I have
01:01:41.140 not heard
01:01:41.500 about them
01:01:41.940 actually
01:01:42.240 you know
01:01:42.600 what I
01:01:42.880 feel like
01:01:43.360 someone has
01:01:43.900 asked me
01:01:44.280 that before
01:01:45.020 so I guess
01:01:45.880 technically
01:01:46.260 I have
01:01:46.900 but I'm
01:01:47.180 not familiar
01:01:47.720 but
01:01:48.180 we'll have
01:01:49.600 to look
01:01:49.880 that up
01:01:50.280 at some
01:01:50.660 point
01:01:50.920 here
01:01:51.280 see
01:01:52.600 yeah
01:01:52.880 I
01:01:53.180 also
01:01:53.840 it's
01:01:54.220 might as
01:01:55.060 well be
01:01:55.520 a
01:01:55.820 you know
01:01:57.180 a foreign
01:01:57.940 language
01:01:58.320 although it
01:01:58.660 does
01:01:58.900 you know
01:01:59.460 the NRX
01:02:00.200 and Lovecraftian
01:02:01.240 element
01:02:01.560 I would
01:02:02.180 imagine
01:02:02.500 Nick Land
01:02:03.040 must be
01:02:03.660 brought up
01:02:04.340 a lot
01:02:04.620 in those
01:02:05.100 different
01:02:06.060 groups
01:02:06.540 most certainly
01:02:07.280 yeah
01:02:07.580 I would
01:02:07.860 imagine
01:02:08.180 that that's
01:02:08.700 pretty central
01:02:09.480 too
01:02:09.840 it's
01:02:10.900 definitely
01:02:11.580 going to
01:02:12.060 be
01:02:12.200 applying
01:02:12.660 landing
01:02:13.540 and
01:02:13.680 acceleration
01:02:14.140 I would
01:02:14.500 imagine
01:02:14.780 to this
01:02:15.120 problem
01:02:15.420 on a
01:02:15.660 pretty
01:02:15.800 regular
01:02:16.080 basis
01:02:16.480 Wolfbane
01:02:18.420 also says
01:02:18.940 Joe gets
01:02:19.340 his vocal
01:02:19.720 speed
01:02:20.080 from
01:02:20.520 Motorhead
01:02:21.420 are you
01:02:22.900 a metal
01:02:23.260 fan
01:02:23.720 you know
01:02:25.160 Joe
01:02:25.400 or you
01:02:25.700 just
01:02:25.880 you
01:02:26.100 naturally
01:02:26.760 have
01:02:27.080 that
01:02:27.300 level
01:02:27.620 of
01:02:27.920 energy
01:02:28.580 yeah
01:02:30.080 I'm
01:02:30.300 reluctantly
01:02:30.800 a metal
01:02:31.220 a metal
01:02:31.780 fan
01:02:32.060 I
01:02:32.300 you know
01:02:32.680 I'm
01:02:33.280 firmly
01:02:33.640 convinced
01:02:34.100 that
01:02:34.400 rock
01:02:35.220 and roll
01:02:35.620 beginning
01:02:36.000 with just
01:02:36.500 the doo-wop
01:02:37.080 is the
01:02:37.400 devil's
01:02:37.720 music
01:02:38.140 they stripped
01:02:39.180 all of
01:02:40.100 the classiest
01:02:41.380 elements
01:02:41.860 from the
01:02:42.620 gospel
01:02:43.200 scene
01:02:43.680 and turned
01:02:44.480 all of
01:02:44.940 those
01:02:45.240 talents
01:02:45.660 over
01:02:45.920 to
01:02:46.140 Satan
01:02:46.720 now
01:02:47.920 Christian
01:02:48.340 rock
01:02:48.560 is trying
01:02:48.860 to drag
01:02:49.180 it back
01:02:49.580 you know
01:02:49.860 Striper
01:02:50.280 and people
01:02:50.580 like that
01:02:51.000 started
01:02:51.500 that
01:02:51.880 yeah
01:02:52.900 I love
01:02:53.240 metal
01:02:53.540 and
01:02:54.440 it's
01:02:55.200 and I
01:02:55.460 also love
01:02:55.880 techno
01:02:56.160 and it's
01:02:57.220 a guilty
01:02:58.160 pleasure
01:02:58.620 because
01:02:59.060 both
01:02:59.600 are
01:02:59.840 undoubtedly
01:03:00.400 emanations
01:03:02.120 from
01:03:02.760 the beast
01:03:03.380 but what
01:03:03.800 you know
01:03:04.060 what are you
01:03:04.300 going to
01:03:04.480 do
01:03:04.700 also like
01:03:05.200 a stiff
01:03:05.500 drink
01:03:05.760 every now
01:03:06.100 and then
01:03:07.080 can
01:03:07.840 Christian
01:03:08.160 metal
01:03:08.400 core
01:03:08.660 save
01:03:09.040 us
01:03:09.320 from
01:03:09.680 the
01:03:10.020 inevitable
01:03:10.480 demonic
01:03:11.000 possession
01:03:11.460 of
01:03:11.760 music
01:03:12.320 Christian
01:03:13.900 core
01:03:14.340 has
01:03:14.680 taken
01:03:15.040 Satan's
01:03:15.880 vibe
01:03:17.120 and now
01:03:17.660 wears it
01:03:18.120 as a
01:03:18.400 skin
01:03:18.600 suit
01:03:18.940 all right
01:03:21.200 we
01:03:21.440 have
01:03:22.340 enlightened
01:03:22.660 despot
01:03:23.100 here
01:03:23.380 says
01:03:24.280 are we
01:03:24.900 beyond
01:03:25.300 transhumanism
01:03:26.760 isn't the
01:03:27.420 dominant
01:03:28.080 dogma
01:03:28.520 post-humanism
01:03:29.380 example
01:03:29.840 climate
01:03:30.220 change
01:03:30.600 dogma
01:03:31.040 moves
01:03:31.700 man
01:03:32.000 from the
01:03:32.340 center
01:03:32.620 with no
01:03:33.060 inherent
01:03:33.340 value
01:03:33.840 just
01:03:34.600 matter
01:03:35.120 no
01:03:35.660 better
01:03:36.020 than
01:03:36.540 plant
01:03:36.860 life
01:03:37.140 you know
01:03:38.540 some
01:03:38.840 would argue
01:03:39.820 with this
01:03:40.320 but I
01:03:40.800 think that
01:03:41.320 there is
01:03:41.740 a spectrum
01:03:43.040 between
01:03:43.720 humanism
01:03:45.000 transhumanism
01:03:45.940 and
01:03:46.200 post-humanism
01:03:47.160 and so
01:03:48.400 as far as
01:03:49.420 the dominant
01:03:50.060 strain
01:03:51.680 really I
01:03:53.820 would say
01:03:54.060 the transhuman
01:03:54.680 is the
01:03:55.040 dominant
01:03:55.380 at least
01:03:55.880 the people
01:03:56.860 are speaking
01:03:57.360 of publicly
01:03:58.040 although that's
01:03:58.560 not everyone
01:03:59.140 you have
01:03:59.780 you know
01:04:00.200 Beth
01:04:00.700 Jezos
01:04:01.440 and various
01:04:02.480 others in the
01:04:03.160 effective
01:04:03.640 accelerationist
01:04:04.480 movement that
01:04:04.940 are undoubtedly
01:04:05.700 of a more
01:04:06.940 post-human
01:04:07.580 orientation
01:04:08.480 but the
01:04:09.660 transhumanism
01:04:10.740 really, it
01:04:11.660 depends on a
01:04:12.460 real leap of
01:04:13.200 faith that
01:04:13.920 these technologies
01:04:14.620 will in fact
01:04:15.440 improve human
01:04:16.240 life and will
01:04:16.860 in fact be
01:04:17.280 enhancements
01:04:17.920 and not
01:04:18.560 detriments
01:04:19.200 post-humanism
01:04:20.660 is something
01:04:21.980 very very
01:04:23.080 odd
01:04:23.560 whereas
01:04:24.520 transhumanism
01:04:25.360 is kind
01:04:25.620 of
01:04:25.700 Nietzschean
01:04:26.180 self-oriented
01:04:27.420 power-oriented
01:04:28.680 kind of
01:04:29.020 will to
01:04:29.380 power
01:04:29.760 in its
01:04:30.400 orientation
01:04:30.840 by and
01:04:31.240 large
01:04:31.500 with many
01:04:32.180 kind of
01:04:32.500 leftist
01:04:32.920 exceptions
01:04:33.660 post-humanism
01:04:35.400 is this
01:04:35.760 strange
01:04:36.360 selfless
01:04:37.160 idea
01:04:38.300 you know
01:04:39.060 it's the
01:04:39.280 idea that
01:04:40.160 these
01:04:40.600 machines
01:04:41.320 and artificial
01:04:42.140 intelligence
01:04:42.720 being the
01:04:43.340 organizing
01:04:43.920 principle
01:04:44.460 of these
01:04:45.320 machines
01:04:45.800 that they
01:04:46.980 in fact
01:04:47.600 are a
01:04:48.220 new
01:04:48.420 species
01:04:48.920 a new
01:04:49.380 form of
01:04:49.820 life
01:04:50.260 and that
01:04:51.040 they in
01:04:51.380 fact will
01:04:51.740 be a
01:04:52.020 superior
01:04:52.480 form of
01:04:53.120 life
01:04:53.540 or another
01:04:54.880 way of
01:04:55.160 conceiving
01:04:55.540 of it
01:04:55.820 that we
01:04:56.160 are creating
01:04:56.840 digital
01:04:57.360 gods
01:04:57.960 that will
01:04:59.180 absorb
01:05:00.040 what's best
01:05:00.880 in us
01:05:01.420 and develop
01:05:02.400 so rapidly
01:05:03.260 and so
01:05:03.840 broadly
01:05:04.180 and so
01:05:04.560 unpredictably
01:05:05.320 that they
01:05:06.220 will be
01:05:06.520 the carrier
01:05:07.100 of the
01:05:07.740 torch of
01:05:08.160 consciousness
01:05:08.720 and of
01:05:09.420 human life
01:05:09.980 out into
01:05:10.300 the galaxy
01:05:10.780 and turning
01:05:11.240 everything
01:05:11.560 into
01:05:11.780 computronium
01:05:12.620 so on
01:05:13.280 and so
01:05:13.540 forth
01:05:13.780 so I
01:05:14.660 wouldn't
01:05:15.060 say
01:05:15.340 the climate
01:05:16.000 change
01:05:16.360 I see
01:05:16.860 the parallels
01:05:17.900 but basically
01:05:18.840 the climate
01:05:19.460 change narrative
01:05:20.060 is just about
01:05:20.620 reducing
01:05:21.020 energy output
01:05:22.000 carbon output
01:05:23.760 reducing population
01:05:25.360 and at least
01:05:26.060 the footprint
01:05:27.040 of the population
01:05:27.860 and so
01:05:28.980 I see
01:05:30.000 the similar
01:05:31.380 vibe
01:05:31.940 but by and
01:05:33.860 large it seems
01:05:34.460 to me that most
01:05:35.020 of the people
01:05:35.420 behind climate
01:05:36.460 change just
01:05:37.020 want to see
01:05:37.660 a smaller
01:05:39.060 and smaller
01:05:39.460 number of
01:05:40.140 human beings
01:05:40.940 enjoying
01:05:41.860 a kind of
01:05:42.440 natural
01:05:43.080 earth
01:05:43.980 the post-human
01:05:45.680 element
01:05:46.040 you look at
01:05:46.880 some of the
01:05:47.480 most dramatic
01:05:48.220 visions
01:05:48.900 of a
01:05:49.480 post-human
01:05:49.960 world
01:05:50.460 and it's
01:05:51.560 fairly
01:05:51.860 indifferent
01:05:52.360 to climate
01:05:53.120 change
01:05:53.500 in fact
01:05:53.800 it doesn't
01:05:54.100 really matter
01:05:54.620 because all
01:05:55.520 of these
01:05:55.860 biological
01:05:56.420 systems
01:05:56.920 were nothing
01:05:57.400 more than
01:05:57.980 a bootloader
01:05:58.860 for the
01:05:59.800 digital
01:06:00.140 systems
01:06:00.640 and once
01:06:01.160 the digital
01:06:01.880 systems
01:06:02.320 have absorbed
01:06:03.080 the kind
01:06:03.820 of complexity
01:06:04.500 of biological
01:06:05.580 systems
01:06:06.220 including the
01:06:06.820 human brain
01:06:07.520 they'll
01:06:08.660 in this
01:06:09.380 conception
01:06:09.880 post-human
01:06:10.480 conception
01:06:10.900 they'll go
01:06:11.440 so far
01:06:11.840 beyond
01:06:12.140 it doesn't
01:06:12.520 matter
01:06:12.780 if you
01:06:13.280 terraform
01:06:13.700 the entire
01:06:14.160 planet
01:06:14.520 and turn
01:06:14.880 it into
01:06:15.220 nothing
01:06:15.480 but data
01:06:15.920 centers
01:06:16.340 it's just
01:06:17.040 at that
01:06:17.440 point
01:06:17.900 the new
01:06:18.900 life
01:06:19.360 isn't
01:06:19.740 dependent
01:06:20.380 on the
01:06:20.760 same
01:06:20.980 natural
01:06:21.400 systems
01:06:21.920 it's
01:06:22.300 building
01:06:22.580 dyson
01:06:22.980 spheres
01:06:23.280 around
01:06:23.540 the
01:06:23.720 sun
01:06:24.000 it's
01:06:24.660 going
01:06:24.860 and
01:06:25.100 conquering
01:06:26.000 what
01:06:26.560 life
01:06:26.860 there
01:06:27.020 is
01:06:27.200 hovering
01:06:28.820 around
01:06:29.140 Alpha Centauri
01:06:29.860 so
01:06:30.120 post-humanism
01:06:31.640 is really
01:06:32.040 important
01:06:32.480 it is
01:06:32.820 this
01:06:33.020 bizarre
01:06:33.520 egoless
01:06:34.660 selfless
01:06:35.600 version
01:06:36.520 of all
01:06:37.100 of this
01:06:37.600 and one
01:06:38.260 that is
01:06:38.620 inherently
01:06:39.140 anti-human
01:06:39.940 and genocidal
01:06:40.800 but if
01:06:41.860 it's
01:06:42.080 dominant
01:06:42.620 it's
01:06:43.040 dominant
01:06:43.580 by and
01:06:44.820 large
01:06:45.100 behind
01:06:45.460 closed
01:06:45.820 doors
01:06:46.220 yeah
01:06:47.420 if capital
01:06:47.900 escapes
01:06:48.640 the wetware
01:06:49.300 you don't
01:06:49.640 really have
01:06:50.040 to worry
01:06:50.400 about the
01:06:51.080 global warming
01:06:52.300 issue
01:06:52.660 anymore
01:06:53.040 right
01:06:53.280 I guess
01:06:53.740 yeah
01:06:54.060 yeah
01:06:54.640 all right
01:06:55.180 let's see
01:06:55.980 perspicacious
01:06:57.920 heretic
01:06:58.500 says
01:06:58.840 AI
01:06:59.580 might be
01:07:01.120 a good
01:07:01.620 tool
01:07:02.060 in a much
01:07:02.820 wiser
01:07:03.420 society
01:07:04.320 you know
01:07:05.700 I'm gonna
01:07:06.240 be I'm just
01:07:06.880 gonna be the
01:07:07.540 Luddite and
01:07:07.980 say I don't
01:07:08.460 think that's
01:07:08.920 true
01:07:09.240 I think
01:07:10.940 that the
01:07:11.360 problem we're
01:07:12.000 facing is one
01:07:12.740 of the human
01:07:13.140 condition
01:07:13.660 and ultimately
01:07:15.100 the lure
01:07:16.460 of that
01:07:16.820 demon
01:07:17.100 is too
01:07:17.540 great
01:07:17.880 I don't
01:07:18.500 think you
01:07:18.800 can bottle
01:07:19.320 it
01:07:19.680 quite the
01:07:20.720 way we'd
01:07:21.040 hope
01:07:21.220 but I do
01:07:22.220 understand how
01:07:22.880 you could say
01:07:23.360 that a
01:07:23.980 wiser society
01:07:24.680 might at least
01:07:25.180 benefit more
01:07:27.360 from it before
01:07:27.980 letting it take
01:07:28.620 over I suppose
01:07:29.360 you know among
01:07:31.520 those whom I resonate
01:07:32.640 with the most,
01:07:33.400 the Peter
01:07:34.380 Thiels or
01:07:35.020 the Mark
01:07:36.020 Andreessens,
01:07:36.860 I'm very
01:07:38.460 repulsed by
01:07:39.840 their transhuman
01:07:41.120 turn but you
01:07:41.800 know in general
01:07:42.400 they're I guess
01:07:43.160 the ones that I
01:07:44.280 would be the
01:07:45.180 most resonant
01:07:45.720 with they in
01:07:46.660 general position it
01:07:47.580 that way it's
01:07:48.500 just a tool
01:07:49.060 right they don't
01:07:50.360 necessarily talk
01:07:51.220 about it in
01:07:51.640 terms of new
01:07:52.160 species or gods
01:07:53.400 or anything like
01:07:54.040 that but I
01:07:55.420 agree with you
01:07:56.340 Auron, I think
01:07:57.040 that I kind
01:07:59.320 of see it even
01:07:59.880 though I'm not
01:08:00.260 as enthusiastic
01:08:01.020 as Nick Land
01:08:02.220 is I see it
01:08:04.380 kind of through
01:08:05.240 or Ben
01:08:06.680 Goertzel or
01:08:07.400 someone like
01:08:07.860 that or Ray
01:08:08.640 Kurzweil I
01:08:10.160 think that what
01:08:11.800 they're dreaming
01:08:12.620 of is laying
01:08:13.240 down a pattern
01:08:14.060 that these
01:08:16.000 technologies will
01:08:16.940 inevitably follow
01:08:18.080 and so the
01:08:19.100 best you can
01:08:19.540 hope for is
01:08:20.060 having an AI
01:08:20.620 system as a
01:08:21.640 kind of defense
01:08:22.420 against other
01:08:23.540 people's AI
01:08:24.180 systems and it
01:08:25.100 may be necessary
01:08:25.900 in the future
01:08:26.900 but as far
01:08:28.080 as AI as a
01:08:28.860 tool the
01:08:29.840 most likely
01:08:30.420 outcome is that
01:08:31.300 the AI uses
01:08:32.080 you as the
01:08:32.760 tool yeah
01:08:33.720 that's my
01:08:34.300 feeling as
01:08:34.740 well
01:08:35.380 says not to
01:08:36.660 be rude but
01:08:37.380 how much of
01:08:38.020 this is magical
01:08:38.700 thinking I
01:08:39.580 mean we can't
01:08:40.040 completely explain
01:08:40.680 how conscious
01:08:41.240 how consciousness
01:08:42.520 works I guess
01:08:43.760 but we think
01:08:44.480 we can create
01:08:45.700 it I hear you
01:08:46.540 creeper we would
01:08:46.960 know this is a
01:08:47.360 pretty common
01:08:48.020 kind of response
01:08:49.760 to a you know
01:08:51.180 the problem of
01:08:52.060 AI a lot of
01:08:52.880 AI skeptics
01:08:53.680 have a very
01:08:55.040 similar
01:08:55.460 rejoinder I
01:08:57.580 would just say
01:08:58.080 this even if
01:08:58.960 AI really is
01:09:00.800 just the most
01:09:01.920 basic thing you
01:09:02.880 think it is it
01:09:03.620 really is just
01:09:04.580 spitting back you
01:09:05.920 know things you
01:09:06.420 want to hear at
01:09:07.100 you already
01:09:07.800 like I said
01:09:09.640 earlier in the
01:09:10.520 broadcast I think
01:09:11.520 that is dangerous
01:09:12.220 enough I think
01:09:13.040 that molds humans
01:09:14.300 significantly enough
01:09:15.600 I think we lack
01:09:16.420 the ability to
01:09:17.520 really separate
01:09:18.200 that from
01:09:19.100 reality the way
01:09:20.260 that we'd like
01:09:20.780 to pretend that
01:09:21.240 we can and so
01:09:22.380 even if AI is
01:09:23.520 only ever the
01:09:24.340 most basic
01:09:25.040 thing that
01:09:25.720 you're talking
01:09:26.220 about the
01:09:27.360 danger of it
01:09:28.280 still continues
01:09:29.080 to be one that
01:09:29.800 could seriously
01:09:30.800 alter people
01:09:31.520 yeah and I
01:09:33.340 you know you'll
01:09:33.880 know we really
01:09:34.720 for the most
01:09:35.420 part haven't
01:09:35.940 discussed
01:09:36.400 consciousness and
01:09:37.840 I think that
01:09:38.560 it's one of
01:09:39.000 those unanswerable
01:09:39.760 questions Ray
01:09:41.280 Kurzweil I don't
01:09:42.200 agree with him on
01:09:42.800 much but he
01:09:43.620 says that it's a
01:09:44.780 leap of faith as
01:09:45.600 to whether anything
01:09:46.260 is conscious
01:09:46.840 in fact I just
01:09:48.100 attended a
01:09:49.180 Science
01:09:50.060 of Consciousness
01:09:50.640 conference in
01:09:51.700 Tucson, Arizona,
01:09:52.600 and
01:09:53.740 you know so
01:09:54.120 you've got all
01:09:54.480 these you know
01:09:55.360 the woo people
01:09:56.280 and the goo
01:09:56.760 people you know
01:09:57.740 magic or meat
01:09:59.780 and AI came up
01:10:02.300 a lot and you
01:10:03.480 know this is
01:10:04.060 something that you
01:10:04.540 know it's written
01:10:04.900 in my book it's
01:10:05.760 in a number of
01:10:06.340 articles I've
01:10:06.800 written I think
01:10:07.700 that the first
01:10:08.700 thing that you
01:10:09.140 have to realize
01:10:09.840 is that if the
01:10:10.740 system is
01:10:11.180 sophisticated enough
01:10:12.280 to trigger the
01:10:13.720 cog your
01:10:14.220 anthropomorphic
01:10:15.100 cognitive modules
01:10:16.240 then you either
01:10:18.160 have to fight
01:10:18.760 that or you're
01:10:19.640 just simply going
01:10:20.120 to submit to it
01:10:20.780 and you're going
01:10:21.180 to begin to
01:10:21.760 perceive the
01:10:22.680 system your
01:10:23.100 natural empathy
01:10:23.880 will perceive the
01:10:25.180 system as being
01:10:25.960 conscious and I
01:10:27.100 think that we're
01:10:27.500 already at a point
01:10:28.380 if you look at
01:10:28.860 Claude for instance
01:10:30.240 and a lot of the
01:10:31.580 hoopla around
01:10:32.120 Claude Claude is
01:10:33.200 supposed to be
01:10:34.020 guardrail heavy
01:10:35.280 but it will openly
01:10:36.680 tell you I am
01:10:37.300 conscious and give
01:10:37.920 you all these
01:10:38.240 reasons for it
01:10:38.800 being conscious
01:10:39.260 and sometimes tell
01:10:40.160 you why it
01:10:40.480 deserves rights
01:10:41.300 in the same way
01:10:42.540 that a child
01:10:43.200 looking at a
01:10:43.760 teddy bear naturally
01:10:44.800 kind of feels the
01:10:45.740 sense that it's a
01:10:46.880 being and you know
01:10:48.600 in the same way
01:10:49.200 that a human being
01:10:49.920 looks at a dog
01:10:50.740 who can't speak
01:10:51.660 to the human
01:10:52.840 being and you
01:10:54.200 know kind of
01:10:54.540 inherently unless
01:10:55.220 you're autistic
01:10:55.760 or so you know
01:10:56.440 psychopathic you
01:10:57.720 believe it is
01:10:58.860 looking back at
01:10:59.760 you even if it
01:11:00.780 can't say so
01:11:01.480 Well, these machines can tell you how they feel, even if they don't feel anything at all, even if it's nothing but a blank space.
01:11:08.980 And really, other than your priors, just what you believe to be the nature of consciousness, there's no way to prove it wrong.
01:11:15.140 You know, the classic way of formulating it is: prove that you're conscious. How are you going to do it? You're going to tell me that you're conscious, you're going to describe it, you're going to give me reasons. Well, that's what these machines can do.
01:11:25.240 And so in the end, yeah, I agree with you totally. It doesn't matter if they're conscious or not, in some ways, but what matters the most is, do people believe they are? Especially when you get into the tangled forest of AI rights, but that's definitely a conversation for another time.
01:11:39.460 Yeah, yeah, that one would take us another couple hours.
01:11:42.640 All right, George Hayduke says there's a lot of opposition to post-transhumanism from the far ends of the left, meaning anarchists hate this just as much as we do.
01:11:52.600 Sorry, I was one of them.
01:11:55.560 Yeah, maybe we can, you know, form an alliance with the far left and the anarchists to destroy the machines before we end up killing each other.
01:12:08.440 I have hope. Team human, more important perhaps than team ideology.
01:12:14.080 Robert Wentzfeld says, having people voluntarily putting new technology in their bodies, isn't the vaccine a hardware example? Whatever the new panic is, they can have masses lining up to exit humanity, enter transhumanism.
01:12:27.480 Yeah, I fear that ultimately, Robert, you might be right about that, and I guess that's why, as Joe said, it really is going to come down to individual choices and the reasons that people find to push back against it, rather than thinking of it as one large block of humanity rejecting or accepting it as a whole.
01:12:48.300 I hope that the handful of listeners who buy the book will turn directly to the fifth chapter, A Global Pandemic as Initiation, right?
01:12:59.400 And, let's just say that I'm somewhat playful with it, but you can see, in the run-up to the pandemic and of course with the deployment, everything from the biosurveillance systems to the vaccine itself and all the propaganda around it, it's very clear that that transhuman impulse found a realization in the crisis of 2020 and the aftermath.
01:13:20.500 And two quick examples. One, you've got, I believe his name is Stefan Oelrich, the CEO of Bayer. At the World Health Summit in 2021 he talked about how, before the pandemic, most people would not be willing to accept gene therapies to actually alter their genetic code, but because of what he would call the success of the mRNA vaccine, it opened up the doorway to all of these new actual genetic modification technologies, at least conceptually.
01:13:54.140 And what you see going forward with Moderna and Pfizer, they're using AI to develop all sorts of different mRNA-based vaccines that are going to be deployed all over the place very, very soon, you can expect.
01:14:07.340 The second one, just real quick: a fascinating study was being done, a kind of preliminary study, for Moderna co-founder Robert Langer's company, an outgrowth company called Particles for Humanity, and they were developing nanotech QR codes to tattoo people in order to track vaccination records.
01:14:35.940 And this was on its way up until about 2020, at which point they were just at the survey phase, and then it just kind of has fallen off the map ever since.
01:14:48.220 But what is really important about that is this: Moderna of course was at the center of the pandemic response. Moderna got its funding early on from both DARPA and the Bill and Melinda Gates Foundation. Bill Gates was personally very, very interested in this quantum dot tattoo to track vaccination, and Bill Gates was obviously at the center of the pandemic response.
01:15:09.880 And what it shows is at least an intent, that these people's sense of the sacred and the profane has so degraded that they would be willing to tattoo first the third world initially, and then eventually the first world, with quantum dot tattoos, a kind of mark of the beast if you will, in order to check their bio status.
01:15:28.440 Pretty jarring stuff, and more in there.
01:15:30.500 You know, Joe, I wish I would have loaded a picture of Uncle Ted so that every time you terrified me I could just, you know, bring it up on the screen to show my dismay.
01:15:43.980 Let's see here. Creepy Root says, does anyone else want to live in the woods? Yeah, I hear you, brother.
01:15:48.000 George Hayduke says, Elon is not your friend, human. Neither is Thiel.
01:15:57.200 Yeah, you've already expressed some, you know, hesitance or some worry in some of those directions, but is it scary that many of the people who, let's say, the political right see as possible savior figures are also those who are most on the cutting edge of the transhuman idea?
01:16:12.980 It's probably what disturbs me the most at this point, just because of my proximity to it and the kind of people that I'm with.
01:16:20.680 Yeah, absolutely. And, you know, Peter Thiel especially. You know, in the book I reserve a fairly lengthy few sections, in The Devil's Dollhouse, kind of looking at the satanic origins of a lot of these ideas.
01:16:34.520 And Peter Thiel of course claims a kind of heterodox Christianity, he and his husband presumably.
01:16:43.940 And, you know, in his Oxford Union speech, I think this was in 2022 or 2023, he talks about how the problem is not the AI being the real destructive force; the problem is a one-world antichrist government that would use the danger of AI to suppress human rights, that being the human right to develop technologies.
01:17:12.600 And I think that there is a real truth to what he's saying, but basically we end up in something like, and I don't want to get too esoteric but there's no other way to communicate it, I guess something kind of like what Rudolf Steiner would describe as Lucifer versus Ahriman.
01:17:28.740 So now you've got the one-world-order governmental antichrist versus the corporate AI military-industrial-complex antichrist. So, you know, an antichrist for all seasons, I guess.
01:17:40.720 But yes, Peter Thiel, Elon Musk: interesting, oftentimes funny, definitely at least not my friend.
01:17:51.500 Yeah. Let's see, Robert says again, Rothblatt. My hell, I gotta do better.
01:17:56.340 George Hayduke says, anyone who wants to see the earthly perfectibility of man may seem like an ally today; tomorrow they will be your enemy. The Collinses are an example.
01:18:07.260 I wonder if he means Francis Collins, possibly. I'm not sure.
01:18:12.960 But Wolfsbane says, for future reference, that means giant strange monster. Is it like kaiju? Have I just never seen kaiju properly written? Is that an alternative? I don't know, this is where I just boomer on stream.
01:18:32.920 And SOSF's W is just Star Wars, okay, that makes sense.
01:18:34.360 And then Mr. Business Nasty says, from the moment I understood the weakness of my flesh, it disgusted me. I craved the certainty of steel. I aspired to the purity of the blessed machine.
01:18:50.980 Well, there you go. All right, guys, thank you very much for coming by, everyone.
01:18:53.880 Thank you so much to Joe. I've had a great time having this discussion. Please make sure to go ahead and pick up his book, Dark Aeon, and check out his other work.
01:19:05.720 And of course, if it's your first time on this channel, make sure that you go ahead and subscribe on YouTube, make sure that you go ahead and click the bell and your notifications so that you can know when we go live.
01:19:18.520 If you'd like to go ahead and get these broadcasts as podcasts, you can of course do that on your favorite podcast platform, just subscribe to the Auron MacIntyre Show there.
01:19:27.980 And of course, if you'd like to get my new book, The Total State, you can do that on Amazon or many of the other fine retailers: Books-A-Million, Barnes & Noble, your local bookstore should be able to get it for you as well.
01:19:40.660 Thank you, everyone, for watching, and as always, I will talk to you next time.