The Auron MacIntyre Show - July 28, 2025


The Transhumanist Temptation | Guest: Grayson Quay | 7/27/25


Episode Stats

Length

52 minutes

Words per Minute

177.7

Word Count

9,358

Sentence Count

455

Misogynist Sentences

13

Hate Speech Sentences

33


Summary

In this episode, I chat with author and consultant Grayson Quay about his new book, "The Transhumanist Temptation," and how transhumanism has infiltrated every area of our lives, from the spiritual to our business to our personal lives.


Transcript

00:00:00.260 Hey everybody, how's it going? Thanks for joining me this afternoon. I've got a great stream with a great guest that I think you're really going to enjoy.
00:00:08.720 Transhumanism is a huge issue and it's one that I don't think we talk about near enough. It's something that is approaching us at every level of our lives from the spiritual to our business to our personal lives.
00:00:20.720 Everything we do seems to some way interact with technology. We're being asked to alter our bodies, our minds in so many ways. I think it's critical that we think about this now before it all overtakes us.
00:00:33.500 My guest today is a consultant and an author. He's just written a book called The Transhumanist Temptation. Grayson Quay, thank you so much for coming on, man.
00:00:42.060 Yeah, thank you for having me.
00:00:42.920 Absolutely. So like I said, most people, when they think of transhumanism, you know, they think, okay, this is a problem, but I see it in like a sci-fi movie somewhere.
00:00:52.080 Someone's getting an artificial, you know, cyborg body, something like this. It's very radical. It's in the future.
00:00:59.480 Maybe they start to see a little bit of that in their day-to-day life, but they don't think of themselves as being in a transhuman world.
00:01:06.080 And I think a lot of this comes from a limited definition of what transhumanism would mean, which is something you definitely address in your book.
00:01:14.500 So could you talk to me a little bit about your understanding of transhumanism and how it applies to someone who's just going through today's world?
00:01:23.120 Yes, one of the big goals I had in my book was to try to get people to see this larger definition of transhumanism,
00:01:30.120 to kind of see all the ways that it's already infiltrated the world around us.
00:01:34.100 So, yeah, like you said, it's something that's very easy to sort of dismiss as science fiction or as kind of kooky.
00:01:40.200 You see, you know, Bryan Johnson doing his weird, you know, biohacking thing.
00:01:45.040 I don't know if you saw it the other day. He just tweeted, there's microplastics in my ejaculate.
00:01:50.240 Yeah, he's constantly like measuring the number of erections his son gets at night.
00:01:54.960 It's a very disturbing obsession for sure.
00:01:57.960 Yeah, or Ray Kurzweil, who famously, you know, is big on the uploading my brain to the cloud and living forever thing.
00:02:06.100 See, all this very bizarre stuff, you know, they put out these books that are kind of poorly formatted and badly written.
00:02:11.560 They have these weird websites.
00:02:13.780 You know, they make these grandiose predictions that like, oh, if you can live to 2020, you'll live to be a thousand years old.
00:02:19.300 Uh, and none of it seems to pan out, but the problem is what they're doing is they're just taking the, uh, premises or the, the ideology that so many people have already accepted just tacitly.
00:02:34.360 It's just kind of in the air we breathe and sort of taking it to its logical conclusion.
00:02:38.240 Um, they're just a little further down the slippery slope than the rest of us are.
00:02:42.360 Uh, the kind of root of transhumanism is really a denial of what you'd call, uh, the natural law tradition that, you know, you can find running through the, uh, you know, the scriptures and through Plato and Aristotle and Cicero and Aquinas and all the way through the Western tradition.
00:02:59.960 This is kind of one of the central ideas of the Western tradition is that human beings have a nature and because we have a nature, there's a type of, of excellence, uh, that's fitting to what we are, um, that there's a type of flourishing that's proper to us.
00:03:15.780 The eudaimonia is the Aristotelian term and that, you know, our telos or our goal is aligned with achieving that.
00:03:22.860 Um, transhumanism is basically the rejection of all of that.
00:03:26.520 It's this, you know, Nietzschean or Sartrean idea that, you know, humanity has no nature.
00:03:32.260 Humanity has no essence except for what we define for ourselves.
00:03:36.180 It's up to you to decide what a good human is.
00:03:40.180 It's up to you to decide what a human is in general.
00:03:42.920 Uh, you know, the meaning of your life is to give your life a meaning, right?
00:03:46.540 And once you've accepted that, like once you've
00:03:50.440 sort of enshrined your will in the central location where some fixed idea of human nature used to be,
00:03:56.520 then suddenly all of the things that the transhumanists propose are on the table.
00:04:01.040 There's no reason not to do any of that.
00:04:04.140 Yeah.
00:04:04.620 And this is a real problem.
00:04:05.840 I think for many people who think of themselves as political conservatives, because obviously what they are usually trying to do is conserve the last version of their current society.
00:04:16.440 They, they don't always see around these corners.
00:04:19.160 And one of the problems we run into repeatedly is as technology develops, we usually are introduced to the technology before we've really evaluated the moral implications of it.
00:04:31.680 And so when you think about abortion or trans stuff, in a lot of ways, Christians were lagging behind on these issues because the technology appeared before the theological, you know, bulwark could be built up against these possible problems.
00:04:47.640 And so one of the things that happens when we lose sight of that telos, when we lose sight of our excellence, what we are supposed to be a part of, what tradition we are supposed to achieve that excellence inside, when we don't have that as a guiding light from the beginning, it's very, very easy for technology to go well past us.
00:05:05.600 And we've spent all of our time playing catch up because we simply don't recognize the dangers before they've already arrived.
00:05:12.380 Yeah. And we've lost the philosophical resources that we would need to really draw a line in the sand for how a lot of these technologies are used.
00:05:19.240 So if you take one piece of technology, like artificial wombs, for example, this is, you know, a big transhumanist technology.
00:05:27.080 Feminists have been talking about it for decades, going back to, I think, Shulamith Firestone in the 1970s.
00:05:31.280 You know, she says that the other feminists were wrong.
00:05:34.320 It's not society that's oppressive, it's biology.
00:05:37.460 And the only way to conquer that is through artificial wombs.
00:05:41.560 And there hasn't been one yet that really works for humans.
00:05:45.100 It hasn't been tried, but they've tried it on animals and it's worked okay.
00:05:49.640 But there's actually a really valid application for those, which is if, you know, a woman goes into labor before the baby is viable,
00:05:58.180 you could stick it in an artificial womb and let it kind of come to the point of viability.
00:06:04.320 When the technology was first being discussed, there was actually some feminist outcry against it because they thought it would be used to restrict abortion for that reason.
00:06:10.960 It kind of undermines the whole bodily autonomy thing because now suddenly you can expel the intruder without killing it.
00:06:18.080 But that's not the point.
00:06:19.400 The point is to avoid parenthood altogether.
00:06:21.960 But anyway, we don't have the philosophical resources we would need to actually draw that line there and say, no, like this would be a good use of this technology.
00:06:31.380 But making it the new default for human reproduction and just growing all the babies in giant warehouses full of pods would be a step too far.
00:06:42.300 And we're going to restrict that or ban that.
00:06:45.340 That's not going to happen.
00:06:46.720 You know, it's going to be sold to people.
00:06:49.520 This is what always happens is they'll sell you a new technology with kind of the most unobjectionable version of it possible or the most unobjectionable use of it possible.
00:06:58.480 So Elon Musk's Neuralink, right?
00:07:00.960 They're smart.
00:07:01.840 The first person they give it to is a quadriplegic who wants to use it to move his mouse around so he can play Civilization VI and do his online Bible study.
00:07:11.760 But if you think that that's where that technology is going to stop, you're crazy, right?
00:07:15.500 It's going to become a general purpose consumer technology and suddenly we'll have an entire society of glassy-eyed people who are staring into space, just kind of operating this digital UI with their brains, which will only be a slight deterioration from the current society of glassy-eyed people we have staring at phone screens, but a deterioration nonetheless.
00:07:38.040 So this actually creates, I think, a problem even for people with our perhaps concerns when it comes to transhumanism, because, you know, I think about something like surrogacy, right?
00:07:51.240 Again, something that Christians just had not caught up on.
00:07:55.060 They, you know, it was sold to them as, well, this is just helping people who want to have families have families, right?
00:08:00.940 And that's a pretty easy thing to sell Christian conservatives on.
00:08:05.340 They want people to have the ability to have children.
00:08:07.860 We couldn't have children because of our biological situation or some other one.
00:08:12.260 And so therefore we're just getting assistance in this. Right now, down the road, we start to see, oh, no, there are, like, gay couples special-ordering children.
00:08:21.040 If they don't get the right one, they force the mother to have abortions.
00:08:24.220 And in several documented cases, special-ordering children specifically to abuse them.
00:08:32.220 Right.
00:08:32.960 And someone was arrested for messaging in a group chat saying, like, how they planned to abuse their surrogate child once it was born.
00:08:39.920 Right. And so we see that the horrific uses of the technology show up after, you know, the horse is kind of out of the barn.
00:08:49.320 And so I think about this when it comes to an artificial womb.
00:08:52.120 So you say, well, a legitimate use of this would be to keep a child alive who otherwise wouldn't be viable outside the mother's womb.
00:09:00.320 Right. But as you point out, once that technology has been sold for that use, there's no reason it won't expand to all the other uses that you're terrified of.
00:09:11.140 The existence of the technology itself is, you know, it's a self-fulfilling prophecy.
00:09:17.120 We will get to the end of that use.
00:09:20.120 The most horrifying use will be the one that it actually gets used for if it exists at all.
00:09:25.720 And so I think this does and this is probably a much wider question, but we already got here.
00:09:30.660 So I guess we'll deal with it now.
00:09:32.600 This is a much larger problem for, I think, those who are worried about the transhuman issue.
00:09:38.520 Is some level of neo-Ludditism required to eventually avoid some of this or is ultimately it always about simply trying to find the most ethical use for technologies when they come out, even if they will inevitably get, you know, corrupted by some other application?
00:09:57.880 Yeah, I think I think there will need to be some kind of point at which people who want to keep their humanity just draw a line in the sand.
00:10:08.280 I think, you know, a good way to do that is just not getting implants, maybe. I don't mean like breast implants, though.
00:10:15.340 You probably shouldn't get those either.
00:10:16.440 But I mean, you know, like a neural implant.
00:10:18.740 I think that, you know, kind of letting them inside your head is probably a bridge too far.
00:10:23.600 I really admire people, though, who've tried to make the step back to dumb phones.
00:10:27.000 I haven't taken that step yet myself, but it's something I've thought about a lot.
00:10:31.200 Generally speaking, though, I think that most technologies, there's certain reproductive technologies that I think probably don't have any good uses, really.
00:10:42.120 You know, I would probably ban surrogacy altogether, for example.
00:10:44.740 But with most things, I think it's a question of whether we use it as a tool or whether we use it as something that's going to make us into a different type of being.
00:10:54.660 All right. So one of the examples I use in my book is, you know, I'm not against technological innovation.
00:11:00.520 I would love it if a construction worker could go to work and plug a, you know, a set of four robot arms into his brain and operate them with his mind as though they were part of his body and use them to pick up giant steel girders and carry them around the construction site all day.
00:11:19.240 That would be very cool.
00:11:20.140 It would increase productivity. It would have all these benefits.
00:11:24.280 But when he clocks out, I want him to unplug those and leave them at the work site.
00:11:28.060 I don't want him to go home and move through life as a six armed transhuman cyborg.
00:11:33.360 But unfortunately, that's what's going to happen.
00:11:37.040 Or, you know, I think that something like augmented reality technology where, you know, glasses now and then potentially, you know, implants in the future give you some kind of digital overlay over the visual field that you're perceiving.
00:11:52.800 I think there's a lot of uses for that in the workplace and in job training and in education and things like that.
00:11:59.100 My concern is that you will keep those on all the time.
00:12:04.140 And now suddenly everything you see is mediated by and filtered through the preferences of whatever big tech company controls your neural implants and the augmented reality interface.
00:12:16.340 So get ready for a lot of pop up ads in your peripheral vision.
00:12:20.160 And, you know, perhaps if you're seeing anti-social behavior by, you know, members of groups that the powers that be would not want you to form a negative impression of, it'll just blank it out for you or something.
00:12:36.020 Who knows?
00:12:37.660 Yeah. And this is what really I think it creates the huge problem for me is, as you say, of course,
00:12:43.040 we could all see the benefit of some construction worker not having to break his own back to lift all these girders and, you know, having the interface that allows that.
00:12:52.920 But I just think of the cell phone, right?
00:12:54.920 The minute I have this phone, it, yeah, in theory, it allows me to do more work.
00:13:00.620 But it also is a Pandora's box into 9000 other things simultaneously.
00:13:06.120 And so the creation of that interface itself, you know, it's the McLuhan.
00:13:10.960 The medium is the message, right?
00:13:13.440 Ultimately, the way that these things are designed, the way that they're delivered, the way the technology develops, we can't control it.
00:13:20.400 We can't contain it.
00:13:21.320 We can't limit its use in the areas that we want specifically.
00:13:24.840 I think the only I don't know if you've ever seen the show.
00:13:27.680 It was on Netflix.
00:13:28.600 It was called Altered Carbon.
00:13:31.760 But it's a very interesting show in that the conceit of it is basically people keep all their memories in, like, this, you know, implant that goes in the back of their spine.
00:13:40.940 And if you want to travel into some other body or you die, you can basically be moved in these things.
00:13:48.140 But it's only the rich that can really afford to continually swap the bodies, and, you know, most average people get stuck in some kind of digital slavery with their memory because they can't afford the next upgrade, the next body, that kind of thing.
00:14:03.620 The only people that maintain a real human existence are the Catholics, because part of their religious commitment is literally to make their stack unmovable.
00:14:16.800 So they still have to keep the memory thing.
00:14:18.820 They still have to adapt the technology, but they have a hard block on it as part of their religious practice that makes it impossible for anyone to hack in or move them around or, you know, extend their life by moving them to a different body.
00:14:31.600 They are stuck in what they are.
00:14:33.540 And so, you know, that is the only sci-fi that I've seen, you know, addressing this issue of transhumanism that explicitly showed how a religious minority might stop themselves from being pulled into this.
00:14:49.440 They're still integrating the technology, but they're putting the heavy limitations on it that make them particularly unique and often seen as backwards in that scenario.
00:14:58.000 But that might be what it would take.
00:15:00.260 Yeah, I grapple with this a lot in the conclusion of my book where I look at potential kind of ways forward and ways to sort of arrest the progress of transhumanism.
00:15:06.780 I wish I'd seen that show.
00:15:07.620 It sounds really interesting.
00:15:10.760 One that I talk about a lot is this sci-fi novel by Neal Stephenson from the 90s called The Diamond Age, where there's sort of a kind of minarchist, you know, global justice system of sorts that you can opt into.
00:15:26.400 And then everything else is these kind of large, decentralized, intentional communities.
00:15:32.120 They're called phyles, you know, from the Greek, P-H-Y-L-E-S.
00:15:35.100 And they can be built on ethnicity, on ideology, on religion, on some kind of shared belief system, anything.
00:15:44.660 So one is just the Han Chinese.
00:15:47.460 Some of them go, like, full transhumanist.
00:15:51.300 One's called the Drummers, and they just have this neural uplink that turns them into this hive mind.
00:15:56.620 And they just are underground having this constant like hive mind rave orgy.
00:16:01.640 It's real weird.
00:16:02.680 But the one that you kind of spend the most time with in the book is called the Neo-Victorians.
00:16:08.340 And their whole thing is being very intentional about the ways they use this advanced nanotechnology and all the other stuff in this world in order to kind of maintain competitiveness with the other phyles in this world,
00:16:20.720 while also maintaining the kind of morals and mores of their own society.
00:16:25.380 And I think that the Diamond Age gives you a really good vision of that, because when it's just you or it's just your family,
00:16:32.940 it is really hard to swim against the current with technology.
00:16:37.040 It is really hard to be the only kid on the block without a smartphone.
00:16:39.760 But if you have a large intentional community that's making specific choices to reject these technologies,
00:16:46.960 you can reinforce each other in that and you can even potentially extract concessions from the outside world.
00:16:52.800 So, you know, one example, this wasn't like a transhumanist technology thing,
00:16:56.620 but a lot of apartment buildings in areas with high Jewish populations will have these Sabbath elevators that stop automatically on every floor
00:17:06.760 because you can't press an electronic button on the Sabbath.
00:17:10.020 You know, that's a way that a particular community's restrictions on the use of technology have caused the outside world to accommodate them in some way.
00:17:17.500 Now, how long our, you know, enlightened, immortal, you know, 300 IQ transhumanist overlords will accept and accommodate us,
00:17:29.060 you know, un-augmented peons in their beautiful utopia is another question.
00:17:35.260 And the Catholic Church is an interesting question.
00:17:37.600 You know, I'd love to believe that the Church would, and practicing Catholics, would reject certain technological innovations in that way.
00:17:46.540 I mean, it is the case that a large number of Catholics use birth control, for example.
00:17:51.480 I think it's like 90% of self-identified Catholics, which doesn't mean a lot.
00:17:55.000 But, you know, if you just look at the ones in the pews, it's a lot lower, of course.
00:17:59.300 But, you know, I think that the Church will have to probably be a little more aggressive with excommunicating and disfellowshipping people
00:18:09.920 who disregard its bioethical teachings in this way if they want to sort of try to hold the line against this onslaught of transhumanism.
00:18:21.140 Yeah, I would bet much more on the intentional communities that you're pointing to than I would a wider, you know, transnational body, you know, ultimately applying those things.
00:18:34.320 It's interesting, actually, the scenario that you're describing there from that sci-fi novel sounds an awful lot like neo-reactionary patchwork.
00:18:42.840 It sounds a lot like Curtis Yarvin and Nick Land suggesting the way that communities could interact in the future.
00:18:49.100 But we said, okay, transhumanism isn't all this sci-fi stuff.
00:18:54.040 And then it's my fault.
00:18:54.840 We immediately went to all the sci-fi explanations and the end of the story first.
00:18:59.860 So maybe we should walk back a little bit and hit some of the things that are happening right now.
00:19:03.940 So one of the things that you point to here is just the trans movement itself and body modification.
00:19:13.320 And again, a lot of people would probably not think of this as being transhuman, but it's, of course, the very, very basic step, right?
00:19:20.520 You're rearranging the body.
00:19:22.680 You're removing organs.
00:19:24.420 You are altering, you know, biological processes.
00:19:28.640 And you're not even really getting the real thing, obviously.
00:19:31.240 In most of these cases, you're getting a very, very ghastly attempt at a facsimile.
00:19:37.380 Whatever you do, don't look into what trans surgery actually does to people.
00:19:41.080 You, or at least don't eat after.
00:19:43.500 And so, you know, these things are almost horror movie level alterations to a human body to produce a facsimile that is not even close.
00:19:55.960 But most, again, most people would not think of this as being transhuman.
00:20:00.180 And can you talk a little bit about what opening up the body modification and ability to alter your gender, these kind of things?
00:20:07.400 What does that do to the human mind?
00:20:09.280 And what doors does it open up to transhumanism?
00:20:13.240 Yeah.
00:20:13.660 So I think that part of it is just that, you know, it all comes back to this rejection of teleology and of an idea of human nature and of the substitution of the sovereign human will where we used to believe in human nature.
00:20:28.040 So, yeah, the transgender thing is an excellent example.
00:20:32.620 You know, ideally, there would be kind of a teleological understanding of medicine where, okay, you are a human being.
00:20:39.780 We know what a healthy, flourishing human being looks like.
00:20:42.800 It is my job as a physician to restore you to that state of flourishing.
00:20:46.460 You come to me with your arm broken, I will set your arm so that it heals in the way it used to be.
00:20:52.180 Or even, you know, you come to me with your arm cut off, I'll supply you with a prosthetic that approximates the function of the limb you lost as best as possible in order to do what I can to restore you toward kind of normative human flourishing.
00:21:06.100 You know, transgenderism, throw that out the window.
00:21:09.900 That's, you know, I go to the doctor and say, my left arm makes me upset.
00:21:14.680 I want you to cut it off.
00:21:17.160 And any doctor that would do that, you know, most doctors probably still wouldn't do that.
00:21:20.560 But, you know, I've had trans people argue that they should if it's genuinely causing me distress.
00:21:24.540 So to really understand, I think, what the trans movement really is and where it's headed, there's two people you should read.
00:23:32.700 One's Andrea Long Chu, who, you know, had these two essays.
00:23:37.240 He's a male-to-female transgender who famously wrote this long book about how watching sissy porn turned him trans and that that's fine and okay.
00:21:47.980 But he wrote these two essays.
00:21:51.600 One was called something like, if puberty blockers are too extreme for children, then puberty is too.
00:22:01.000 Which right off the bat is just this bizarre false equivalency.
00:22:03.760 Like, one is your body doing what it's naturally supposed to do and the other one is inhibiting your body in its attempt to carry out this normal physical biological process.
00:22:19.740 The only difference is that, is whether you want it or not, right?
00:22:23.480 It's, again, it comes back to this idea of the sovereign will and how even your own biology doesn't have the right to restrain that sovereign will.
00:22:31.580 It's all about your desires, not about your nature or about your flourishing.
00:22:37.000 Chu has this other essay called, My New Vagina Won't Make Me Happy, And That's Okay, where he basically argues that, he actually admits that he didn't become, I shouldn't use the word, he didn't have the desire to take his own life.
00:22:52.040 Didn't use the S word, there we go.
00:22:53.620 He didn't have the desire to take his own life until he got further along in his transition.
00:22:58.500 Did not seek his Canadian health care.
00:23:00.280 Yeah, exactly.
00:23:01.220 He didn't seek, he didn't have the desire to, yeah, until he got further along in his transition.
00:23:06.160 But he says, that doesn't make my transition any less valid because I want it.
00:23:10.020 You know, it's what I desire.
00:23:11.860 He actually says, like, happiness and desire are totally separate things, which is like a complete rejection of the entire Western tradition, you know, both Christian and classical, right?
00:23:23.260 There's always this idea that your rightly ordered desires will point you toward eudaimonia, toward proper human flourishing and in the Christian sense, ultimately toward God.
00:23:34.600 And actually, in the Christian understanding, the place where your desire and your happiness becomes fully divorced is hell.
00:23:40.880 Because you get what you desire at the cost of the only thing that can ultimately make you happy, which is God.
00:23:47.240 And so this is really, you know, there's really a Luciferian impulse here.
00:23:51.240 My original title for the book was actually The Serpent's Promise, because part of my argument is like transhumanism goes back to the Garden of Eden in a sense.
00:23:59.500 So that was Andrea Long Chu.
00:24:02.240 The other person you should read is Martine Rothblatt, who is another male-to-female transgender, was one of the inventors of Sirius Satellite Radio, interestingly.
00:24:15.160 I think he may have gone to school with Chris Rufo, maybe, to business school, something like that.
00:24:22.940 But yeah, Martine Rothblatt wrote this book called From Transgender to Transhuman.
00:24:29.500 And on the cover of this book is actually his avatar in the video game Second Life, who is a black woman.
00:24:36.500 So it's just kind of, you can see sort of the progression.
00:24:39.700 It's like physically I can go from being, you know, a Jewish, a white-looking Jewish man to a white-looking Jewish woman who still looks like a man.
00:24:48.840 But in cyberspace, you know, all this meat isn't holding me back.
00:24:52.760 I can be a black woman if I want.
00:24:54.260 And this is really, you know, he says that the first, transgenderism is just kind of a first baby step toward what he calls freedom of form, right?
00:25:04.700 That I am this kind of free-floating will and consciousness, and that should be able to take any shape or no shape according to my whims and my desires.
00:25:15.660 So it sort of comes back to what you were talking about with our altered carbon.
00:25:18.260 There's this kind of stack that contains your personality, and that can be swapped between, you know, vat-grown bodies or robotic bodies of any type or shape.
00:25:28.180 It can even just kind of bounce around in the cloud and not be physically embodied anywhere.
00:25:34.480 And then at that point, you know, his argument is once you have people who are just living in virtual spaces entirely,
00:25:41.580 then who's to say that you can't have people who are created in those virtual spaces?
00:25:47.780 You know, these AIs that can then later be downloaded into, you know, vat-grown bodies and just walk around and interact with everyone else in society.
00:25:58.460 He actually says, like, if you have an AI assistant that can impersonate you, you know, to make appointments or carry out mundane tasks or whatever,
00:26:07.200 and you die and that AI says, you know, I'm Grayson or I'm Auron now.
00:26:11.400 So Rothblatt says we should have a psychiatrist talk to the AI, and if the psychiatrist thinks the AI is, you know, close enough, then it can continue your legal existence.
00:26:26.620 You know, it can take possession of all your assets and your birth certificate and everything and be you, which really just –
00:26:33.400 And then, you know, one of his other crazy things is that you can actually, you know, if you fall in love with an AI,
00:26:40.880 you can have children with the AI by just kind of melding your mind files together.
00:26:46.100 And then now you have sort of purely post-human digital beings that have been born there in cyberspace,
00:26:54.180 you know, which I think we're already experiencing with people just getting one-shotted by these chatbots
00:26:59.880 and, you know, falling in love with them and, you know, losing their minds and jumping off buildings
00:27:04.540 and trying to fight the police and all this other stuff.
00:27:06.740 Like, there's really – so far, it's kind of, I think, at the fringes of people who are already very lonely
00:27:13.420 and already maybe a little mentally unstable.
00:27:15.400 But I think as the technology gets better, it's going to swallow up more and more people.
00:27:20.520 Well, we also have more and more people who are lonely and mentally unstable,
00:27:23.400 so the market is going to keep growing as well.
00:27:25.260 Yeah, yeah. And the, you know, there's going to be such a – and also there's going to be AIs all over social media
00:27:32.740 and all over any, you know, online platform you use.
00:27:36.260 So you're going to have to perform the Turing test a million times a day,
00:27:39.020 and you'll get it wrong some of the time.
00:27:40.740 So we're already seeing an erosion of the idea that there's something special about humanity
00:27:45.760 or human consciousness, right?
00:27:47.560 That's already, you know, the very presence of AIs, the growing presence,
00:27:52.060 is already sort of subtly insinuating into our minds the idea that, you know,
00:27:56.220 hey, it's all just information processing.
00:27:57.960 Some of it runs on carbon, some of it runs on silicon, but it's all sort of the same thing, right?
00:28:03.500 Yeah, so this creates an interesting issue as well.
00:28:08.460 You talked a little bit about how part of this was the discarding of metaphysics, right?
00:28:13.360 The classic understanding, philosophical understanding of Telos and, you know, these things.
00:28:19.660 And that's true, of course.
00:28:21.380 I do think, however, it is the embracing of another metaphysics.
00:28:25.060 So, you know, Oswald Spengler, I think very appropriately, called Western man Faustian,
00:28:31.480 which given, you know, your allusion to the satanic nature, the Luciferian nature of this temptation,
00:28:36.720 I think is pretty apt.
00:28:39.320 And obviously, there's some level of human nature that is always going to be tempted by this.
00:28:48.020 But I think particularly, actually, post-Enlightenment rationalist Western Europeans
00:28:53.940 have a particular cultural metaphysic that drives them towards this end.
00:29:00.020 And in a lot of ways, the escaping of all things that have otherwise been dictated or limited by God
00:29:09.060 is something that is deeply built in to, you know, kind of our psyche and our understanding of the world
00:29:16.960 and the way we should move forward.
00:29:18.760 And you kind of hit this in a couple of different areas.
00:29:21.160 I don't know if this was something that was top of mind for you,
00:29:23.520 but whether it's AI creating this infinite space of friends we can have or lovers we can have
00:29:31.440 or children we can have, all artificially, or it's, you know, capital, you know,
00:29:37.020 disassembling different aspects of human nature and its limitations and the borders and boundaries
00:29:42.280 and different traditions that make us who we are.
00:29:46.460 All of these things are things we seem to be driving forward inevitably,
00:29:51.000 whether we want to stop them or not.
00:29:54.200 It seems like we are, you know, from the very beginning, you know, Mary Shelley
00:29:57.820 and the beginning of the sci-fi genre, it was, oh, we created this thing and it's going to destroy us, right?
00:30:04.000 Like that's been the theme since we more or less started doing science in any real and serious way.
00:30:10.660 And yet, despite this constant worry, you know, the AI safetyism and everything that we hear
00:30:15.500 from some of our biggest tech gurus today,
00:30:18.380 ultimately we seem to do little or nothing to prevent this and we seem to be driven towards it no matter what.
00:30:25.960 Is there a tragic flaw that will always draw us across this line no matter how we battle it?
00:30:32.360 Or is this ultimately something that can be recaptured by a proper application of discipline and philosophy
00:30:39.880 and understanding of tradition?
00:30:42.440 Well, I think we are really kind of in the third instantiation of a pattern that we see established in Scripture,
00:30:51.120 where, you know, you have the period leading up to the flood in Genesis,
00:30:56.480 where this isn't like crystal clear in the actual text of Genesis.
00:30:59.520 But if you look into the Enochic literature that kind of expands upon it, you know,
00:31:03.600 it talks about how these, you know, the fallen angels are interbreeding with Cain's descendants
00:31:08.420 and creating these Nephilim kind of hybrid super beings, half angel, half human creatures.
00:31:14.420 And then these fallen angels are kind of serving as, you know, spirit guides to these clans,
00:31:22.400 these Nephilim clans and giving them technology, right?
00:31:25.740 So you see that in the text of Genesis itself, when it says Tubal Cain is the first to smelt bronze
00:31:30.460 and other members of this line are the first to create musical instruments.
00:31:34.660 So these are all technologies, many of which increase the civilization's prowess in war
00:31:41.200 and in kind of divination and ritual and other ways of, you know, other forms of techne
00:31:46.880 that they can use to increase their power over nature and over their fellow man.
00:31:52.340 And the sort of devastation that they cause, the tyranny that they inflict both on the natural world
00:32:00.420 and on other human beings is really insane.
00:32:04.220 It's described in very graphic detail in the Book of Enoch.
00:32:08.320 And the flood kind of knocks that back.
00:32:12.360 And then you see, once again, kind of another Faustian application of human techne
00:32:16.900 after the flood in the Tower of Babel.
00:32:18.880 And God sort of sets the clock back again.
00:32:22.820 And I think that if you look at that pattern, we're sort of now approaching another such moment
00:32:28.200 where we have sufficient technology not just to harm creation, to harm our fellow man.
00:32:35.940 It's not just a question of having better weapons or, you know, fossil fuels or whatever that is.
00:32:41.020 It's a question like we can literally unmake humanity.
00:32:46.020 So I think there has always been this sense, yeah, that our technology,
00:32:52.740 our techne is dangerous and could lead us to this rebellion against God if we're not held back from doing that.
00:33:00.360 And we currently don't seem to be being held back from doing that.
00:33:03.980 Perhaps some kind of cataclysm is on the horizon that'll put a little more time back on the clock.
00:33:10.480 But I think we're headed toward some kind of point of no return here.
00:33:15.760 I want to be more optimistic about technology.
00:33:19.280 So C.S. Lewis talks about, you know, he goes back to sort of the period
00:33:26.220 where you saw science emerging kind of alongside alchemy,
00:33:28.760 that these were two ways of attempting to master nature through, you know, human technique.
00:33:34.560 And, you know, science worked and alchemy didn't basically.
00:33:37.220 And that's why science carried forward.
00:33:40.860 But Lewis takes the position that man's conquest of nature is always going to turn around
00:33:48.500 on man and try to become a conquest of human nature,
00:33:52.020 which has the effect of reducing humanity to raw material in the hands of humanity,
00:33:58.920 at which point humanity is no longer a meaningful category, right?
00:34:04.020 We've discarded any idea of human nature.
00:34:08.020 The fact-value distinction, the is-ought problem, is totally enshrined in our thinking.
00:34:15.020 So now humans are just whatever humans want to make them, or whatever, you know,
00:34:21.220 comes after humans wants to make them after that. You know, a recipe for transhumanist dystopia.
00:34:26.300 I'd like to believe that there could at least hypothetically be a way to channel our kind of innovative
00:34:34.720 and Faustian impulse in ways that don't attack and degrade human nature itself.
00:34:42.280 So, I mean, Elon Musk is a perfect example.
00:34:45.560 You know, he came out with this sort of pornographic AI anime girl that can talk to you on Grok.
00:34:53.940 And I would like him very much to stop doing that and just to build spaceships.
00:34:59.620 I would really like that. I think, you know,
00:35:04.460 we could either abolish humanity or we could conquer the stars.
00:35:07.160 And I would much rather conquer the stars.
00:35:40.760 You know, I'd way rather have a colony on Mars than have an infinite virtual metaverse
00:35:45.560 in which we can all lose ourselves.
00:35:48.420 I do, but, yeah, this really is ultimately what I wonder: is modernity the
00:35:53.280 great filter?
00:35:53.960 Is that the reason that we never actually, you know, get out there and find alien life in
00:35:57.940 the stars?
00:35:58.720 Because there is a complexity of civilization that reliably returns us back to this.
00:36:05.320 And, you know, as you point out, this is a cycle in the Bible.
00:36:08.160 It's a cycle throughout history.
00:36:09.500 The Bronze Age collapse, the collapse of the Roman Empire, you know, tons of technology and
00:36:15.120 sophistication are lost in any of these periods.
00:36:17.840 And so we see, you know, the human species continue to get up to a certain threshold and then,
00:36:25.920 you know, have a collapse that knocks it back down.
00:36:28.500 But every iteration of this seems to bring us closer and closer to many of the things
00:36:33.640 that you're fearing.
00:36:34.680 And so I do wonder if ultimately this is something that we can get beyond.
00:36:39.380 Now, I am fascinated with the idea of accelerationism, not the stupid political, you know, application,
00:36:48.160 but the actual technological accelerationism, the cybernetic feedback loops, and how their
00:36:52.920 unmitigated positive feedback leads us to predictable and dangerous places.
00:37:00.020 And so, you know, two thinkers that have looked at a lot of this are Nick Land and Alexander
00:37:05.000 Dugin.
00:37:06.360 People can feel about them however they like, I don't care.
00:37:09.200 The point is they're both addressing something no one else is addressing.
00:37:12.000 And as long as they are, I'm going to keep thinking about it.
00:37:14.720 And so both of them recognize that this like runaway cybernetic feedback loop, this feedback
00:37:21.060 between intelligence and production is going to continue to bring us closer and closer to
00:37:26.280 kind of the singularity, closer and closer to the loss of humanity, the abolition of man,
00:37:32.060 as you're alluding to with C.S.
00:37:33.300 Lewis there, where we completely rewrite ourselves into something that is no longer the original
00:37:38.240 program, is no longer originally human in a real sense.
00:37:41.760 And they have two different divergent understandings of how this ends.
00:37:46.020 For Nick Land, it's capital's complete emancipation from us.
00:37:50.740 Capital just completely becomes its own thing.
00:37:53.520 We are nothing but a launch vehicle for this hyper-realization of the
00:38:01.280 Faustian Anglo spirit that launches itself into the stars and leaves its meat puppets
00:38:06.320 behind. Or, in the more Dugin-esque understanding, approaching this moment, kind of going through
00:38:14.440 this moment creates a level of absurdity where we recognize that all of the things that we
00:38:19.900 lost, the death of God, all of these things fall away, and we are able to emerge from the other
00:38:26.760 side, re-engaging with the spiritual, re-engaging with the divine.
00:38:31.760 You talk about re-enchantment in your book as well.
00:38:34.540 I know you may not be familiar with those outlooks necessarily in their particulars, but
00:38:39.680 I was wondering, do you lean one way or another?
00:38:42.560 Do you think ultimately we can move along the line of returning to a more spiritual understanding?
00:38:49.060 We can re-embrace those parts of our humanity and become more whole in that way?
00:38:53.080 Or will we continue to move down this track until we have basically jettisoned that which
00:38:59.360 makes us human?
00:39:01.180 Yeah, well, it's hard to say.
00:39:02.940 I'll say I don't see any meaningful way to restrain transhumanism through just kind of the
00:39:12.720 normal political process or the cultural resources currently available to us.
00:39:17.920 You know, we talked about intentional communities.
00:39:20.200 I'm sure the Amish will be fine, but I think that sort of late liberal society
00:39:24.760 is such that it's prohibitively difficult to form
00:39:31.080 intentional communities on the necessary scale to really push back in this way.
00:39:37.360 And I think that, you know, our politics is, like you said, just
00:39:42.420 progressivism driving the speed limit.
00:39:43.900 In this case, it's transhumanism driving the speed limit.
00:39:47.460 There's really, you know, even on the right, most people have lost this kind of classical
00:39:51.700 idea of humanity and human nature and human flourishing.
00:39:54.760 And you saw that just last year with the Alabama Supreme Court case over IVF, which
00:40:02.440 was an incredibly modest court ruling, like somebody in the fertility clinic knocked over
00:40:07.740 a, you know, Petri dish and broke it and the embryos were destroyed.
00:40:11.740 And Alabama has a wrongful death statute where, you know, if you're in a car accident and you
00:40:16.220 miscarry, you can sue the person who hit you and recover damages for that lost baby.
00:40:21.980 All the court said was, yeah, you can recover damages for an embryo that's destroyed in an
00:40:26.860 IVF lab the same way you can recover damages for an embryo that's destroyed in your womb.
00:40:31.240 And of course, you know, from the left, Handmaid's Tale hysteria, dystopia, you know, the
00:40:35.280 whole "high-status men are going to make me wear a robe and hold me down" meme.
00:40:41.120 But definitely not a fantasy.
00:40:42.160 Yeah, no, not fetish material at all.
00:40:44.840 But the more disappointing thing was from the right, there was an immediate, you know, reaction
00:40:49.180 of no, no, no, we didn't mean it.
00:40:50.160 We didn't mean it.
00:40:50.720 You know, Alabama's supermajority Republican legislature immediately passes a bill exempting
00:40:56.120 the IVF industry from the wrongful death statute.
00:40:59.420 The Republican governor, you know, stalwart pro-life Republican governor, signs it.
00:41:04.900 Multiple Republican lawmakers start talking about the miracle of IVF.
00:41:09.980 You know, this is a transhumanist technology.
00:41:11.780 Like it explicitly severs reproduction from the marital act and inserts, you know, the medical
00:41:20.420 and technological establishment into that with, you know, results that we're already
00:41:25.060 seeing are horrifying.
00:41:26.720 Like embryo screening startups are already giving you a little app where it ranks your
00:41:31.860 embryos, you know, divides them up by gender and tells you which ones are going to be smartest
00:41:36.660 and which ones have potential to develop disorders and just really turns the relationship from
00:41:43.380 parent-child into consumer product.
00:41:46.440 Yeah.
00:41:46.600 It's a weird thing.
00:41:49.380 I've sat at a bar with guys who are doing this illegally, you know.
00:41:55.160 And that's the thing: you can restrict this in the United States.
00:41:57.840 You could restrict this, you know, somewhere, but somewhere, some guy in Belize, you know,
00:42:03.000 or El Salvador or somewhere in Asia is going to do this.
00:42:08.780 Right.
00:42:09.080 It's much like the gain-of-function research.
00:42:13.720 We know it's stupid.
00:42:14.940 We know it's wrong.
00:42:15.840 You could ban it everywhere, but somewhere, some idiot's going to do this.
00:42:19.500 Yeah.
00:42:20.080 In the background of the Star Trek universe, someone should make this if they want to do
00:42:24.660 like a really based, uh, Star Trek prequel.
00:42:27.560 Um, but in the Star Trek timeline in the 1990s, uh, it's only alluded to very briefly.
00:42:33.480 It's never like filled in really, but there were a series of conflicts called the Eugenics
00:42:38.380 Wars between China and the West.
00:42:40.400 Yeah.
00:42:41.080 Where, you know, apparently the Chinese, uh, communists were attempting to kind of
00:42:46.340 genetically engineer themselves into supermen.
00:42:48.260 And there was this, you know, great crusade on behalf of humanity itself to defeat them
00:42:53.120 and destroy these technologies.
00:42:54.740 Um, so I can out-nerd you one better here.
00:42:57.280 So in Deep Space Nine, Dr. Julian Bashir is actually a genetically modified child.
00:43:03.560 And it's like this huge secret that he has to keep to be in Starfleet the whole time,
00:43:08.020 because they fought this whole war over this.
00:43:10.280 And it's so highly illegal inside of Starfleet that even minor applications of the technology
00:43:15.640 are considered completely disqualifying for basically, uh, interacting with society.
00:43:20.380 So they do go back to it in a way.
00:43:22.280 Like it becomes this underground thing, that there was a small contingent
00:43:27.200 of, uh, families who broke the rules and did this.
00:43:29.940 And he's one of these people, and he's desperately trying to hide it to keep from, uh,
00:43:34.280 getting thrown out of the fleet.
00:43:36.020 So, yeah, it's a similar concept to the Butlerian Jihad in Dune, which, you
00:43:40.200 know, is usually applied to AI, you know, you shall not make a machine in the image
00:43:44.920 of a man's mind, but the taboo, you know, if you read some of the later novels, the taboo
00:43:48.700 appears to extend to cover artificial reproduction in some senses. You know, in Dune Messiah,
00:43:53.860 I think, uh, Paul at one point suggests, you know, having a child with, uh,
00:43:59.040 Princess Irulan through IVF, and the Bene Gesserit are like, no, that's against our
00:44:03.760 religion.
00:44:04.080 Like you should be executed for that.
00:44:06.180 And he says, you can't execute me.
00:44:07.400 I'm the emperor.
00:44:08.020 And it's a whole, a whole thing.
00:44:09.360 Uh, but yeah, I think absent some kind of, uh, strongly enforced religious taboo
00:44:15.980 like that, uh, it would be very difficult for us to push back politically or culturally
00:44:22.120 on transhumanism.
00:44:23.640 I think to stop it or to roll it back, we'd really need some kind of cataclysm.
00:44:28.740 And from there, it just becomes a question of what that looks like.
00:44:32.180 I look at a few scenarios in my book.
00:44:34.660 Um, you know, one would be, we kind of go back to just full primitivism, right?
00:44:41.220 And then we're just stuck there forever because there's no more easy-to-extract
00:44:45.800 minerals, uh, near the surface of the earth that we could use to bootstrap ourselves into
00:44:49.840 a new industrial age.
00:44:51.440 So, all right, it's just subsistence agriculture forever and infant mortality goes back up
00:44:56.840 to 70%.
00:44:57.580 And you have to worry about, uh, everyone dying if a harvest fails, or your village getting
00:45:01.780 raided and killed by steppe raiders, and all this other stuff.
00:45:06.060 The kind of best-case scenario would be, I guess, some kind of, um, you know,
00:45:13.300 solarpunk-type future where you have a kind of sustainable level of prosperity
00:45:18.860 that's maybe not quite where we are now, but not sort of full-on primitivism.
00:45:26.040 And, uh, you know, who knows, it's all really a toss of the dice.
00:45:31.280 Who knows what the cataclysm would even look like or whether there will even be one.
00:45:36.000 Um, but I think that if the kind of Nick Land side of it wins out, then we're really
00:45:41.960 staring down the barrel of, uh, the end times, whether you want to conceive of that religiously
00:45:47.720 or as a metaphor. I think that's what it is.
00:45:50.220 Well, you definitely have eschatological
00:45:57.100 implications, of course, in both of these scenarios.
00:46:02.440 And, um, we could go pretty much everywhere for a couple hours with this,
00:46:07.120 because this is a topic that I'm actually rather fascinated with.
00:46:10.320 Um, people should be reading your book so they can get the outline of what you're
00:46:15.080 addressing here.
00:46:15.880 But one thing I want to ask before we wrap this up: Peter Thiel recently gave an interview
00:46:22.920 with, uh, Ross Douthat of the New York Times.
00:46:25.600 And, uh, it really hits on a lot of what you're talking about here, because, very interesting,
00:46:29.120 you know, I was just at ARC in London a little bit ago, and Peter Thiel did this
00:46:34.560 interview with Jordan Peterson.
00:46:35.500 And I was very interested because Thiel directly seemed to address, uh, the problem of the Enlightenment
00:46:42.860 and that the Enlightenment was a huge issue.
00:46:45.240 Uh, he actually went right after Cartesian dualism: we can't know what a real life
00:46:52.140 is by just measuring intelligence.
00:46:54.240 We actually have to embody this, that, you know, real value has to be embodied.
00:46:59.340 It has to be divinely given in some way. Thiel has always had a strange
00:47:05.760 version of Christianity
00:47:07.120 he's kind of been knocking around for a while here that seemed to be speaking to that.
00:47:11.300 But while recognizing that problem and the, you know, what he said there, I thought was
00:47:15.360 relatively insightful.
00:47:16.600 He then turns around to Ross Douthat and says more or less, you know, he's like, I'm
00:47:20.780 worried about the antichrist.
00:47:21.840 I'm worried that, like, uh, green energy, you know, the Green
00:47:26.080 New Deal, will produce some kind of antichrist.
00:47:28.220 And Douthat looks at him, and he's like, aren't you, like, producing this, like, global
00:47:32.840 intelligence network and, like, all these, uh, you know, possible transhuman things?
00:47:38.100 Isn't that more likely to yield, or at least as likely to yield, some kind of antichrist?
00:47:43.380 And, you know, Thiel, like, almost glitches.
00:47:46.340 Like, you think someone needs to run out and restart him there for a second.
00:47:49.900 When that happens, what do you read, uh, you know, about the future when you look
00:47:54.640 at a man like this, who is probably one of the most powerful guys in this space, probably
00:47:59.540 one of the most thoughtful ones as well.
00:48:01.840 Whatever you think about his philosophical conclusions, he's obviously thinking pretty
00:48:06.780 deeply about these issues and their implications.
00:48:09.120 And yet still, when he's asked point blank, like, but what if you're doing that?
00:48:13.300 It's just like, eh, probably not.
00:48:14.960 It's fine.
00:48:15.740 Yeah.
00:48:16.240 There was a great meme about that interview where it was, you know, Peter Thiel looking
00:48:19.280 like a deer in the headlights.
00:48:20.480 And the caption was, when the interviewer hits you with tough gotcha questions like,
00:48:25.180 are you the antichrist?
00:48:26.260 And do you want humanity to survive?
00:48:28.380 Right.
00:48:28.820 Yeah.
00:48:29.460 What do you say here?
00:48:30.820 Yeah.
00:48:31.240 Yeah.
00:48:31.440 He has this whole thing where, you know, there's the antichrist and there's the katechon,
00:48:34.820 that which restrains the antichrist.
00:48:36.500 And, you know, you can mistake one for the other, right?
00:48:39.460 The regime that attempts to restrain technology could actually
00:48:45.280 end up being the global tyranny that becomes the antichrist, rather than,
00:48:49.800 you know, the sort of AI god mainframe or whatever that it's meant to hold back.
00:48:58.000 I actually got to talk to him recently, which was really interesting.
00:49:01.720 I gave him a signed copy of my book, and he said this elsewhere, so I don't
00:49:06.980 feel bad repeating it, but he basically says, like, I don't think transhumanism is ambitious
00:49:12.500 enough.
00:49:12.940 Like he says, oh, I think, you know, transgenderism is just about changing your body.
00:49:17.420 I think that we need to be more ambitious than that.
00:49:19.460 You know, it can't change your heart or your mind or your faith or anything.
00:49:24.400 And I said, well, hold on.
00:49:25.580 Do you want a piece of technology that can do that?
00:49:29.960 'Cause that's scary.
00:49:32.420 Um, you know, you want the Ayatollah to have a little, uh, you know, airport scanner
00:49:36.720 that I step through and come out a Muslim or something? Well, uh, I'm not, you
00:49:41.200 know, entirely sure what he means by that.
00:49:45.440 Um, but I would definitely lump Thiel in with people like Marc Andreessen and Elon Musk as kind
00:49:51.360 of right-wing progressives, um, as people whose presence within
00:49:58.980 the right or Republican coalition is mostly based on this kind of opposition to DEI and
00:50:05.540 other ideologies that they think are restraining meritocracy in some way that are preventing
00:50:12.040 us from, you know, building the best companies with the best people and, uh, using those to
00:50:17.600 create the best products to kind of launch ourselves and bootstrap ourselves into the
00:50:22.000 future.
00:50:23.040 Um, and I think there's a real, uh, contempt among those people for those of us who still
00:50:28.540 believe in human nature and in some kind of intrinsic human value that's rooted in who
00:50:35.020 we are and not just in what we achieve.
00:50:38.340 Um, and I think they're really eager to kind of leave us on the ash heap of history.
00:50:44.040 Uh, so yeah, these are people that I'm very wary of and I'm very, uh, concerned about their
00:50:50.840 influence within the right.
00:50:53.540 Yeah, certainly not lacking, uh, though, as we just saw with Elon and Trump,
00:50:58.660 perhaps the tech right doesn't have quite the hold that they ultimately would
00:51:02.820 like to have, at least not yet.
00:51:04.820 But, uh, that said, uh, Grayson, like I said, we could do this for a long time, but, uh, we
00:51:10.360 got to limit it a little bit here.
00:51:11.580 Uh, tell people about the name of the book again, where they can find it.
00:51:16.120 How can they pick it up?
00:51:17.220 All that stuff.
00:51:18.240 Sure.
00:51:18.420 Yeah.
00:51:18.620 This is The Transhumanist Temptation: How Technology and Ideology Are Reshaping
00:51:25.080 Humanity, and How to Resist.
00:51:26.500 It's available from Sophia Institute press.
00:51:29.320 So if you go to SophiaInstitute.com/transhuman, that'll take you there.
00:51:33.460 It's always best to buy it straight from the publisher because I get more money.
00:51:36.640 But if you want to get it on Amazon, you can. It's also available on Kindle and as an audiobook.
00:51:40.580 I don't read it,
00:51:41.600 uh, so if my voice is annoying to you, you can still get the audiobook.
00:51:45.120 Someone else read it.
00:51:46.120 Uh, and you can follow me on Twitter at Hemingquay, H-E-M-I-N-G-Q-U-A-Y.
00:51:52.780 My DMs are open.
00:51:53.740 So feel free to, uh, tell me that I'm wrong and you hate my book.
00:51:57.120 All right, man.
00:51:59.060 Well, thank you so much for coming on guys.
00:52:00.940 Make sure to check out that book.
00:52:02.220 And of course, if it's your first time on this channel, make sure to click subscribe on
00:52:07.200 YouTube.
00:52:07.880 Make sure you click the bell and the notifications.
00:52:09.920 Obviously this is prerecorded today.
00:52:11.540 So unfortunately we won't be doing questions, but normally we do the live streams and you
00:52:16.720 can chime in when we are there.
00:52:17.540 And of course, if you'd like to get these broadcasts as podcasts, you
00:52:21.460 need to subscribe to the Auron MacIntyre Show on your favorite podcast network.
00:52:25.100 And if you want to support the show, don't forget, we've got merch over at shop.blazemedia.com.
00:52:29.760 You can pick up the shirt I'm wearing right now.
00:52:32.280 Uh, so if you would like to support the show, you always have that option.
00:52:35.280 Thank you everybody for watching.
00:52:36.540 And as always, I will talk to you next time.