Mysterium Fasces


Mysterium Fasces Episode 25 — Transhumanism


Episode Stats

Misogynist Sentences

20

Hate Speech Sentences

130


Summary

In this episode, I discuss the current political situation in the United States, including the election of Donald Trump, and the implications for transhumanism and the future of the world as a whole. Also, we have a special guest on the show to discuss the Kabbalah and the role of God in our salvation.


Transcript

00:00:00.000 Hristos an esti e knecron tannatot tannatot pa tisas k e tis e tis vnimas i zauin a risa me nus.
00:00:28.020 Hristos an esti e knecron tannatot tannatot pa tisas k e tis e tis vnimas i zauin a risa me nus.
00:00:54.580 Hristos an esti e knecron tannatot tannatot pa tisas k e tis e tis vnimas i zauin a risa me nus.
00:01:20.980 Hristos an esti e knecron tannatot tannatot pa tisas k e tis e tis vnimas i zauin a risa me nus.
00:01:35.400 Hristos an esti e knecron tannatot tannatot pa tisas k e tis e tis vnimas i zauin a risa me nus.
00:01:49.400 Hristos an esti e knecron tannatot tannatot pa tisas k e tis e tis vnimas i zauin a risa me nus.
00:02:05.400 Hristos an esti e tis vnimas i zauin a risa me nus.
00:02:19.600 Hristos an esti e tis vnimas i zauin a risa me nus.
00:02:49.780 Thank you.
00:02:50.300 Alleluia, alleluia.
00:02:51.360 Christ is risen indeed.
00:02:53.120 Indeed he is risen.
00:02:54.720 And we've got Ziger, regular contributor.
00:02:56.720 Thank you very much, Ziger.
00:02:58.380 Hey, it's very nice to be here.
00:02:59.940 It's been a long time.
00:03:01.060 It's true.
00:03:01.620 Always a pleasure to have you on.
00:03:03.000 I remember very distinctly the last episode we did together on the Kabbalah.
00:03:07.780 So we might get into some of the similar subject matter this evening.
00:03:12.400 May be.
00:03:13.340 This evening.
00:03:14.480 So Christ is risen.
00:03:16.460 Happy Pascha.
00:03:17.140 Happy Easter to all of my Christian brothers and sisters.
00:03:21.740 I want to apologize for not releasing an episode for the last two weeks.
00:03:24.740 It's, you know, Pascha and final exams and all this.
00:03:28.340 But we should be back to our regular schedule.
00:03:30.860 And you should be able to reliably expect reasonable content from us going forwards.
00:03:36.120 So before we get into the main subject today, transhumanism, I wanted to address kind of very briefly the current political situation with Donald Trump.
00:03:46.800 There's been significant development since I last made a podcast.
00:03:50.380 And so for those of you who are not in contact with me, there are other means you might not know my opinion.
00:03:56.800 Basically, just to refresh what I've been saying this whole time, liberal democracy was never going to save us.
00:04:04.480 The system will not save us.
00:04:09.020 This is just the this is the long and short of it.
00:04:13.840 The solution, our salvation will come ultimately, well, from God, but from his action in our lives.
00:04:20.900 And so we've discussed before on another podcast what the plan ought to be in order to gain victory over our enemies.
00:04:27.560 But it doesn't fundamentally involve voting that can be useful.
00:04:31.760 And of course, there's certain utility in having far right candidates elected.
00:04:36.380 But long term, our victory will be it will not be parliamentary because this is a it's just I'm sorry if if you still, you know,
00:04:50.580 thought that that Donald Trump could could change things.
00:04:52.960 But, you know, you can see quite clearly now why the system will not will not be changed from within.
00:05:00.740 Anybody else would like to make a comment?
00:05:02.340 I mean, it's not it's kind of been talked about quite to death.
00:05:04.640 I know, as I agree, certainly Daily Stormer has been on top of this.
00:05:07.740 Yeah, maybe shortly.
00:05:08.700 I mean, we on the Mysterium Fascist.
00:05:10.820 I mean, we've talked about this before, before even Trump was elected.
00:05:13.600 And we've never really been on the Trump train, so to speak.
00:05:16.560 I mean, I remember us talking about different scenarios and what was the best scenario and worst scenario for us.
00:05:23.640 And really, this scenario that we have was kind of outside of all of the speculation.
00:05:29.620 But thinking about it, I think it may be the absolute best case scenario from our perspective,
00:05:34.840 in the sense that Trump was elected and everyone was like excited.
00:05:40.680 And, you know, from a certain perspective, that was bad because it made people kind of complacent.
00:05:50.320 But then this dramatic, unforeseen flipping around of Trump and him, like, going back on all of his agendas instantly.
00:05:58.760 It kind of it's kind of the worst, the most aggressive possible undermining of the concept of voting and democracy imaginable.
00:06:06.860 I mean, I've been known to say that the best case scenario is if Trump gets elected and then gets assassinated.
00:06:14.720 Because I was like, well, then people will say, oh, that democracy is really a scam and we can't get out of it by voting.
00:06:21.060 But even in that case, people might have still said, like, well, maybe if we elect some other guy, maybe he won't get assassinated this time.
00:06:27.320 But the fact that he's elected and now, for some reason, he's at some specific moment, he's gone back on everything,
00:06:36.700 shows that there's probably actors in the background or pressures that are, like, visible that make it so that all of this voting,
00:06:44.260 all of this politics is completely useless and just an illusion, a show of illusion that, you know, we can't really affect.
00:06:53.680 So that message, of course, most people, the normies, probably won't understand that, but they don't want to say anything.
00:07:00.640 But people who are capable of understanding this, they're understanding it right now.
00:07:05.440 And that's good for us, basically.
00:07:08.380 Still, I mean, let's say Trump is real, right?
00:07:12.280 Let's say he was real and let's say he's super real and does everything which he was supposed to do and, you know, didn't bomb Syria.
00:07:18.420 But then, you know, will it help against the rampant immorality we have?
00:07:23.420 Will it help against homosexuality?
00:07:25.120 Will it help against anything like that?
00:07:26.700 No, that would still remain.
00:07:28.520 Well, you're kind of getting us down into the weeds here.
00:07:30.420 But I think that you bring up a good point, Greva, which is that, yeah, ultimately, you know,
00:07:34.460 the issues which assail us are, as we've kind of always maintained here, are spiritual.
00:07:39.240 It's because there's been a degeneration in the moral fiber of our people.
00:07:43.040 Our virtue has declined, and that's what has led us down this road to the situation that we find ourselves in.
00:07:50.640 And as you know, Zyger, this sort of scenario where people have such a betrayal, a failure of democracy to affect change,
00:07:58.440 looming in their face, you know, especially because so many people saw Trump really as this kind of like last chance to avert disaster,
00:08:05.300 is a brilliant opportunity to promote our propaganda and to, you know, demonstrate to people and just awaken them to the fact that,
00:08:14.820 yes, democracy is broken.
00:08:17.120 It's always broken.
00:08:18.200 It's a flawed system.
00:08:19.660 Democracy is always a front for oligarchy because the money powers always have the most influence in the civil realm.
00:08:26.220 Now, that's just trueism as old as political action itself.
00:08:32.560 Sorry, I think Greva Hans interrupted you if you wanted to make a comment.
00:08:36.180 Well, my basic thing is that they've raised a cacophony in the light of this happening,
00:08:42.420 where most of us and a lot of Trump's foremost supporters have said this was a wrong step.
00:08:48.120 It was a misstep, a faux pas.
00:08:49.200 And they have raised a hell of voices, which are, it's made of many, spit from many tongues,
00:08:56.860 people who despise Trump, people who ran as Democrats, people who ran as never-Trumpers.
00:09:02.200 And they've cacophonied this massive noise to say, you weren't his real base.
00:09:09.000 This is what he should be doing.
00:09:11.720 And it's been, it seems like it has been lost, but it hasn't.
00:09:15.140 And your words have not been lost on people who are thinking and people who are rational.
00:09:20.500 And you should keep speaking the truth because the Trump campaign,
00:09:25.580 even if it seems like it has lost all to Kushner, that's not quite the case.
00:09:31.760 And there are powerful people in the Trump regime who will be able to make use of
00:09:36.140 everything that you're doing right now.
00:09:38.780 Well, yes, there are certainly, there are, um, I think that even, even a Donald Trump
00:09:46.000 presidency gives us more room to organize than a Hillary Clinton one would have, because
00:09:49.560 she would have, uh, attacked the, um, she would have attacked the First and Second Amendment
00:09:54.060 almost, almost surely.
00:09:55.560 Right, and she'd brought us more time.
00:09:56.440 Yes.
00:09:56.740 And she'd like a little bit more time.
00:09:58.100 Yeah, I really, I wanted to address this just because I'm seeing so many people who are,
00:10:02.000 um, despondent and blackpilled and, um, feeling defeated and nihilistic.
00:10:06.500 And, uh, by no means should you feel so, uh, indeed, we're in a better position than we
00:10:12.960 ever have been.
00:10:14.120 You know, if, uh, I think Andrew England said he would have done everything over again,
00:10:20.580 because yes, we've, uh, our message has been inflated massively from this time, two or three
00:10:26.280 years ago.
00:10:27.780 Right.
00:10:28.380 If I, if I might interrupt with the guns or anything, um, the ATF has recently reversed
00:10:33.520 an opinion that, um, a lot of gun nuts are finding, um, puzzling, but I don't, and it's
00:10:38.800 a, an opinion based on whether, if you bought a SIG brace for your, um, gun or a tech blade
00:10:45.180 or whatever, um, other thing, a brace stabilizer, they said that if you shoulder it while you're
00:10:50.720 firing, you're not redesigning your firearm to be a short barreled rifle anymore.
00:10:53.860 To our non-American listeners or non-gun enthusiasts, this is jargon.
00:10:58.380 And it means nothing and, um, that's okay.
00:11:01.040 But to those of you who know what I'm talking about, this was not a coincidence that this
00:11:05.160 happened.
00:11:05.880 Basically they've legalized short barrel rifles.
00:11:09.600 If you non-intentionally, um, unintentionally shoulder them.
00:11:13.860 So essentially we're, we're rolling back some really, really awful provisions and we're doing
00:11:20.380 it because I believe that it is purely a product of the fact that we have the current governmental
00:11:26.540 establishment install.
00:11:28.560 So when you think that nothing good's being done, you need to temper that with the understanding
00:11:33.940 that not everything is going to be sang and set in front of you to praise.
00:11:41.120 Yeah.
00:11:41.700 Indeed.
00:11:42.000 Certainly there will be some, um, likely mild bureaucratic executive changes.
00:11:48.740 Now, I think that we kind of addressed that really, um, just take heart comrades.
00:11:53.040 Don't, uh, please, you know, just if I know most of the people who are listening to this
00:11:58.380 podcast probably are not, um, weeping into buckets of ice cream, but I just, I know that
00:12:04.620 some people have been really, um, feeling lowly and there is just no reason.
00:12:08.260 There's no reason we are more successful than we have ever been.
00:12:12.020 And so we need to keep our hands on a gospel plow, so to speak.
00:12:15.720 Here, here, here's, here's a question actually, weeping into ice cream for daddy Trump not
00:12:21.920 being the one they thought.
00:12:25.000 Really?
00:12:26.380 Well, why is there to actually do that?
00:12:28.540 That's fucking pathetic.
00:12:30.100 I was, uh.
00:12:31.060 This is what happens when you put your, all your faith, all your hope in a political figure
00:12:36.400 and not some higher ideal.
00:12:39.440 Trust he not in princes.
00:12:40.540 You start to be eating into ice cream.
00:12:44.740 That's the full manifestation of your ideology.
00:12:47.880 All right.
00:12:48.600 Now, get that out of the way.
00:12:51.240 I think that we can move into, um, um, our discussion proper for this week.
00:12:55.980 So, I'm sorry, I just moved some of the notes around here.
00:12:59.540 Um, so, uh, transhumanism.
00:13:03.240 Why talk about transhumanism?
00:13:05.040 Well, uh, it seems like a little bit of an esoteric subject, although it should not surprise
00:13:09.380 any of you who, uh, like to listen to this, uh, program that we discuss this kind of thing.
00:13:14.140 Um, but it doesn't seem directly related to Christianity or fascism.
00:13:17.860 Well, if you are, are a listener of this podcast, you'll know that I've talked about the subject
00:13:22.640 several times before as it's come up in Kali Yuga News.
00:13:25.580 As the technology, uh, uh, biotechnology has developed, um, we're entering an era where
00:13:32.880 it is, in fact, possible and realistic, um, to engage in, uh, transhuman activity.
00:13:40.500 And, um, by this we mean, uh, I'm going to give you a definition that's probably, uh,
00:13:47.700 grounds you a little bit better.
00:13:48.780 So, the vulgar definition is a philosophy that explores human transcendence above or beyond
00:13:53.600 organic corporeal limitations through technological and philosophical evolution.
00:13:59.100 Um, we're going to unpack that a little bit later, but basically what I mean when I say
00:14:03.480 transhumanism is any application of, um, technology to the human person that fundamentally changes
00:14:11.100 the, the way in which human beings, um, core person operates.
00:14:17.380 So, for instance, the introduction of an artificial womb, uh, into the reproductive cycle,
00:14:22.460 uh, fundamentally changes the way in which human beings, uh, their personality operates.
00:14:28.840 It detaches them from, uh, the organic norm, right?
00:14:33.280 As an example, we're going to get into this pretty deep, but the reason why we need to address
00:14:37.660 this is because I think the transhumanism is of all of the technological, um, threats that
00:14:43.880 are approaching us in the near future is the most serious and existential of them all because
00:14:49.720 it has the capacity to change human beings' core nature through technology.
00:14:55.380 Um, now before we dive into it a little bit further, I would open it up if anybody has
00:14:59.440 any comments.
00:15:01.420 Well, I'd just like to say that, um, you know, it's, it's easy to view, like, this transhumanist
00:15:08.940 stuff as something in the future.
00:15:11.060 And like, uh, when you're, you're thinking of like artificial limbs or like cybernetic
00:15:16.420 implants or, um, like things of that nature that don't exist yet.
00:15:21.860 But I really view this, this stuff as a continuum and we're already on that continuum.
00:15:29.860 Like I view the smartphone and the computer and all of these things as, um, you know, devices
00:15:37.320 that makes a human existence different and that augments human capacities and their, their
00:15:42.800 tools.
00:15:43.220 And they have all of the same properties as like these artificial limbs or implants would
00:15:48.320 have just to a lesser extent.
00:15:49.980 And, you know, they work differently.
00:15:51.740 And if you go back in time, even like the hammer or more primitive tools also changes the
00:15:57.620 way that, uh, humans live.
00:15:59.860 And I'm not saying that to say that, uh, you know, this, you know, the, these future
00:16:05.360 technologies are no big deal.
00:16:07.640 Rather, I'm saying that all of these technologies, uh, have potentially harmful consequences and
00:16:14.900 we know that they've had harmful consequences.
00:16:17.120 So, um, but viewing it in like this broad perspective kind of allows us to laser in on what makes it
00:16:25.020 really dangerous rather than superficially.
00:16:28.080 Cause, um, it's easy to say, Oh, it's, it's horrible to have like artificial arms or things
00:16:34.380 like that because of X, Y, Z reason, but it's not inherently, you know, the specific technology
00:16:40.400 of having artificial limbs.
00:16:41.980 Cause that's bad.
00:16:42.480 Of course, indeed.
00:16:43.820 I got, it's, I got, it's something beyond that.
00:16:47.000 So would you, would you consider an artificial womb, uh, as potentially being, uh, a good
00:16:54.500 thing?
00:16:55.320 We're going to get into that later on.
00:16:56.740 I think that that's, yeah, it's a more complicated question than that.
00:16:59.160 Yeah.
00:16:59.300 We're going to, we're going to, we can't ask this sort of in the weeds questions right
00:17:02.220 away.
00:17:02.800 Rude, did you have a comment you wanted to make?
00:17:04.140 Um, sure.
00:17:05.180 I, I've been a student of anorex before I was a student of fascism, but, um, the, the
00:17:09.740 anorex group had a discussion about this where a peg leg is essentially a transhumanist idea.
00:17:15.620 You, it, it does augment your human existence.
00:17:18.480 It can't be infected.
00:17:19.780 It takes a little bit more to destroy than a human limb.
00:17:23.440 Um, and we've dealt this for a while and we've been okay.
00:17:27.600 The, the question is, are you willing to cut off your nose to enhance your face, not to
00:17:32.320 spite it?
00:17:33.100 And that's what we get into this idea of skeuomorphism too, which is the idea of should
00:17:38.320 technology shape us or should we shape technology?
00:17:43.500 Yeah.
00:17:44.060 And that's a very interesting question.
00:17:45.620 I probably would be inclined to say that both happen no matter what we want.
00:17:49.720 Um, but yeah.
00:17:50.820 And so these, these are all, I think, important points.
00:17:54.420 We're going to get into kind of, uh, the, the nature of the technology in a minute.
00:17:58.820 And so, yeah, I think that, that, but Ziger, you bring up a good point that this, um, you
00:18:05.320 know, transhumanism, as you say on a continuum, it's already here.
00:18:08.460 We're already dealing with this issue.
00:18:10.860 It's not some sort of, uh, deus ex video game future.
00:18:15.040 That's going to take another 30 or 40 years to arrive.
00:18:17.260 It's already here.
00:18:18.620 Um, you know, the technology, uh, exists already, but it more than not,
00:18:23.440 it's not just the technology.
00:18:24.680 There is an ideological philosophical component to this, uh, that has already been firmly instantiated
00:18:30.800 or is currently being instantiated in the public mind.
00:18:34.500 And this is, uh, I'm going to skip ahead, but it's very, very, very, this is what the whole
00:18:39.100 transgender movement, I think is fundamentally about is it's about disassociating the, uh,
00:18:45.900 inborn characteristics of human biology from personal identity.
00:18:51.700 Uh, and we're going to, I think we're going to dive deep into this later on, but this is
00:18:55.880 this kind of Gnostic bifurcation of the soul and the body.
00:19:00.700 Human beings are not recognized as being, uh, an integrated, um, uh, dual, uh, elemental
00:19:08.940 unit where we're spiritual.
00:19:10.240 We have a soul, which is where the rational mind and the will are located, but we also
00:19:15.080 have a body, which is how we interact with this material world.
00:19:18.000 And both of these things are natural and good and proper to human existence.
00:19:21.820 We're not spirits imprisoned within a body against our will.
00:19:25.700 Um, we were made to have a body.
00:19:30.020 Our spirit is made for it.
00:19:32.060 And that's not, you know, and this is what the, the core philosophical idea of transhumanism
00:19:38.260 proposes is that no, actually, this is an accident.
00:19:42.040 And I mean that in a philosophical sense, the human body is just the accident of millions
00:19:47.280 of years of evolutionary flux, uh, and the product of particular environmental pressures.
00:19:53.640 And so there's nothing inherent or sacred about the human form.
00:19:58.920 Uh, it's, it's not seen to be a sort of manifestation of the cosmic order and the, um, natural and
00:20:05.420 divine laws, which undergird the universe and draw it into being, it's just sort of seen
00:20:09.920 as being this kind of nominalist, uh, fleshy robot.
00:20:13.860 And so there's not a fundamental issue.
00:20:16.040 I mean, you know, through your penis doesn't really matter if it's not actually an existential
00:20:19.060 part of yourself and you can chop it off and try to change your gender at any time.
00:20:22.200 Well, the rest of your body is ultimately accidental.
00:20:26.580 It has no, um, grounding in your identity.
00:20:30.980 That's irrevocable.
00:20:32.300 It's not essential.
00:20:33.740 You know, Florian, um, maybe before we move to this spiritual argument, I've actually thought
00:20:39.280 for a very long time about the purely material aspect.
00:20:43.060 I think I can disprove this, this notion of transhumanism purely from like, you know, using
00:20:52.380 their own materialistic, um, point of view.
00:20:55.580 Uh, I think one, one of the things, the basis of my reflection is I was thinking about like,
00:21:01.720 uh, extraterrestrials.
00:21:03.080 I was thinking like, what kind of creatures could these extraterrestrials be and how different
00:21:08.340 from us could they be?
00:21:11.000 And, uh, you know, in, in the same line of thought, I was like, how, how can humans evolve?
00:21:15.720 How can we improve, you know, using whatever evolutionary methods we can?
00:21:21.160 And, you know, that, that's really the question of transhumanism, you know, improving humanity.
00:21:26.020 And the more I thought about it, the more I came to the conclusion that there's very little
00:21:32.680 that we could do to actually improve humans, uh, in terms of both their intellects and their
00:21:38.660 bodies and things like that, because, um, pretty much any change that we can think of is going
00:21:45.800 to have severe negative consequences, even things as basic as say, oh, what if we had
00:21:52.420 technology to, um, double our lifespan, for example, that's, that's one of the core things
00:21:57.460 that transhumanists are interested in, you know, uh, increasing the lifespan.
00:22:01.780 But when you think about it, um, if humans live twice as long, it's quite possible that,
00:22:07.620 uh, all human civilization would have, would have collapsed a long time ago and we would
00:22:12.920 never have attained any of our achievements because, um, basically humans are kind of,
00:22:20.880 uh, well, let's go to the extreme, let's say ants or other insects that live for a very
00:22:26.340 short time.
00:22:27.080 They're different from us in that they, they tend to adapt extremely fast to circumstances
00:22:31.760 like, uh, bacteria.
00:22:33.540 They evolve entirely new sets of, um, immunities each generation, um, because they die so quickly
00:22:42.180 that they can adapt very quickly because generally even humans, when we're young, that's the,
00:22:47.380 we have a short period of very flexible, um, like learning where we can like observe the
00:22:55.700 world as it is and say, okay, I'm going to, you know, adapt to that.
00:22:59.580 And as we get older, we kind of become more rigid and set in our ways and things like that.
00:23:06.040 And I, I'm pretty sure that if people lived for a much longer period of time, um, they
00:23:12.660 would both lose their motivation to like work hard and do something because death is a huge
00:23:19.120 motivation for doing things.
00:23:21.120 If you feel like you have all the world in front of you, um, you're going to be less active
00:23:26.040 and the, the period of time when you're flexible is shorter too.
00:23:31.700 So I think a longer lifespan would not be good for civilization.
00:23:35.800 It would probably destroy civilization.
00:23:37.660 If you think of aliens would live a thousand years, um, I don't think they would develop
00:23:43.000 at nearly the speed that we develop at.
00:23:45.440 So that's one, um, one argument from a purely materialist standpoint.
00:23:49.860 If we think about like strength, speed, intelligence, well, uh, if we were super fast, super strong
00:23:57.620 compared with all the other organisms on earth, why would we have ever needed to build houses
00:24:03.280 or develop tools or, um, you know, organize in a social manner, it would not be, it would
00:24:10.580 not be required at all.
00:24:11.640 I mean, if you were stronger than a lion or a tiger or any other predator, um, we could just
00:24:17.840 be individuals who walk in the jungle and kill any animal we see and just eat them and, uh,
00:24:23.620 live like Kings.
00:24:24.500 So all of human civilization, all, uh, institutions wouldn't be necessary.
00:24:30.340 So again, if we were very strong, there would be no need to develop further and, you know,
00:24:35.840 be, uh, developed and have those spiritual qualities or even just material qualities of civilization.
00:24:42.660 So that's another thing where increasing our abilities will not necessarily be a good
00:24:47.700 thing.
00:24:47.980 Really.
00:24:48.820 I think the, the human form, uh, and the human level of intelligence is kind of a perfect
00:24:56.800 mean, basically.
00:24:57.620 Like if you're significantly smarter, it starts to cause problems where you can't focus on
00:25:04.800 the immediate things in front of you.
00:25:06.060 And you're like stuck in the clouds where you're like, Oh, what's going to happen a thousand
00:25:10.120 years?
00:25:10.540 Or what if, what if, what if, and you become paralyzed.
00:25:13.980 And we see this because we, we have humans who have IQs of 200 or 300.
00:25:18.320 And these people, uh, when you look at their biographies, they're fucking useless.
00:25:22.820 I mean, sometimes they have great ideas and they produce things, but they're not, um, I
00:25:27.140 mean, uh, our, our world would not be better off if everyone had 200 point IQ.
00:25:32.020 That's that we, we know that because we know how these people live.
00:25:34.880 Well, on the other hand, it's not like the cyber organization would be for the, uh, plebeian,
00:25:41.200 so to speak.
00:25:42.400 No.
00:25:43.100 Um, but I'm just saying that it's not like all, all qualities that humans have.
00:25:49.440 We, as a society and as a race and as a, whatever you want to call it, we wouldn't necessarily
00:25:55.220 be improved as a whole by increasing our, our specs, basically just making us stronger,
00:26:00.680 faster, smarter, greater memory, all that.
00:26:04.180 Um, I, I think that there's some kind of perfect mean and we're probably not like all of that
00:26:11.180 perfect mean by any stretch.
00:26:12.540 I mean, I think we can refine and improve and, but I think that the improvement is going
00:26:16.800 to be in, uh, kind of getting to that, that perfect ideal state rather than just going wild
00:26:23.440 and experimenting with wildly different models because, um, it seems that there's an equilibrium
00:26:31.060 that leads to this, this thing we call civilization and, um, social harmony and all those good
00:26:37.940 things.
00:26:38.880 And, um, it's not at all clear that all of that is going to improve dramatically by just,
00:26:44.960 uh, making people smarter or whatever.
00:26:47.460 So just, um, just, just, just at that material level, I think the transhumanist ideal is completely
00:26:54.880 like off base.
00:26:56.780 But doesn't that still imply that, um, you have some sort of, how should I put it, uh, collective
00:27:04.380 thinking, collective loyalty, because I mean, purely in the individualistically, how is that
00:27:11.220 not a good thing?
00:27:12.800 Um, oh yeah.
00:27:14.420 Yeah.
00:27:14.900 And that's the kind of situation we have here.
00:27:16.680 We're all atomized.
00:27:17.720 We all, uh, we, we have lost all organic connections.
00:27:22.040 We don't have families.
00:27:23.140 We don't have, uh, you know, we, we have no traditions.
00:27:27.620 We have no, you know, religiosity.
00:27:29.700 We have no nothing like that.
00:27:30.840 We just have this, this very shallow materialistic, materialistic in the correct sense and just
00:27:39.680 very empty modern existence, right?
00:27:41.700 Yeah, but what I'm saying is, uh, if, say humanity splits in two, like, or white people
00:27:47.500 split in two, and there's one that's going to be like individualistic demigods or super
00:27:52.160 smart, super strong.
00:27:53.520 And the other one is going to be a collectivist, you know, normal people group.
00:27:57.400 I think the collectivist normal people are going to annihilate the individualistic demigods
00:28:02.760 basically because they're not going to be able to cooperate and they won't care about
00:28:07.020 the greater scheme of their society and things like that.
00:28:10.140 Um, that, that's basically what I'm saying.
00:28:12.140 It's not, it's a dead end.
00:28:14.020 Right.
00:28:14.600 If I might, if I might, please make two, two notes.
00:28:17.700 Number one, your body is not an avatar.
00:28:20.020 That's an, especially from a Christian point of view, it's an early heresy where you are
00:28:26.060 your body and your mind together.
00:28:27.760 You are, you are the entirety of you.
00:28:30.240 And second of all, if, if people are really, really interested in the idea of transhumanism
00:28:35.840 and dealing with it on a basis of traditionalism, there, there was an author who dealt with this
00:28:42.380 long before any of us could actually have conceptualized it.
00:28:45.260 It's a, it's a trilogy of books by C.S.
00:28:48.000 Lewis called the Space Trilogy.
00:28:49.560 And they were, they were primarily concerned with science and the advancement of human knowledge
00:28:56.720 beyond years.
00:28:57.620 And their idea was preserving the consciousness of these great geniuses beyond their earthly
00:29:05.020 given years.
00:29:06.720 And the, the idea, it's juxtaposed, C.S.
00:29:10.540 Lewis is a masterful writer.
00:29:11.680 Even when he writes, um, things that are pretty much blatant, blatant, um, or want a better
00:29:18.460 word, ironic works, um, their parodies, their allegories at the best, he still writes them
00:29:26.320 masterfully and you'll, you'll develop a sense of what exactly our world is missing.
00:29:31.040 It's written from the point of view of a professor and you'll understand how much professor,
00:29:36.620 professors and great academics have waned in our age.
00:29:40.560 But in another sense, you're, you're never an avatar of something.
00:29:44.100 Your body is not just some sort of, um, CPU to run your operating system on.
00:29:51.240 Um, it's so much more than that.
00:29:52.620 You are the body you live in.
00:29:54.180 You are your, your person.
00:29:56.440 Capus corpus.
00:29:58.520 Basic principle of English common law.
00:30:01.080 You are your body.
00:30:02.920 So whatever your body is, your personal identity is associated to that.
00:30:06.700 That's why if somebody sees you committing a crime, you can't just say, oh, well, I wasn't
00:30:11.580 identifying with my body that day.
00:30:13.120 Well, maybe you were identifying as a woman that day.
00:30:17.740 Ah, yes.
00:30:18.280 Thank you for that.
00:30:19.600 No, but I think that, Zyger, that was an extremely astute observation.
00:30:23.780 Um, and I think that you get down to the real issue here, which is that, you
00:30:28.640 know, human success is defined by our unique anthropology, which is directly
00:30:35.780 derived from our environment.
00:30:37.060 You know, it wasn't created in a vacuum.
00:30:39.180 There is a very complex and subtle interplay between our inherent,
00:30:45.200 unchanging nature and the inculcation of that nature by our environment,
00:30:50.560 and then how that's expressed in different environmental settings.
00:30:56.900 And I think that you correctly isolate that this is basically the point: if
00:31:02.040 you were to change human nature, if you were to change human anthropology through
00:31:07.380 the application of technology, we have no idea at all what would happen,
00:31:12.520 because all of our politics, our social life, our spiritual life, our knowledge
00:31:18.760 is built upon a consistent understanding of the human form, of human nature.
00:31:25.900 You know, you even made the very astute point just in terms of the
00:31:30.680 lifespan, right?
00:31:31.900 You know, we don't have any examples of people living past 120.
00:31:36.540 Um, you know, and even in a relatively developed place, even in
00:31:42.540 a first world country, most people don't live past 85.
00:31:44.620 And so our whole civilization, across the entire world, is built
00:31:52.520 upon this presupposition that there is a fairly fixed limit to human existence and that people
00:31:58.700 will pass away after that.
00:32:00.380 I've got a bit of a thought here. Correct me if I'm wrong, but the old Greeks considered
00:32:05.620 death basically the separation of body and soul, right?
00:32:10.260 Now, if you modify your body this much, right, if you, I don't know, replace parts of your
00:32:16.620 brain with a CPU, or add artificial limbs, all this kind of stuff, wouldn't that essentially be
00:32:23.060 the same thing?
00:32:24.080 But instead of the soul leaving your body, it's essentially your body leaving your soul.
00:32:30.220 Interesting.
00:32:30.900 Potentially. Well, that's a complicated question, but probably
00:32:36.380 it's the brain that would have the biggest impact.
00:32:39.200 If you start messing with the brain, it's possible that your soul is not, uh, going to be able
00:32:43.100 to interface with it anymore.
00:32:44.960 Well, would it really?
00:32:46.280 Because, I mean, we, we're, the body is more than the brain, right?
00:32:50.580 And there is a unity of the, the spirit or the soul and the body.
00:32:54.860 We've all agreed on this right now.
00:32:56.840 And then you can't simply reduce the body to the brain, right?
00:33:00.980 That would be kind of a soft version of this, sort of.
00:33:04.540 No, no, it's just a practical issue that, um, you know, we have people who have lost limbs.
00:33:11.140 We have people who have lost internal organs and things like that.
00:33:14.040 And, uh, while I'm sure there is.
00:33:16.120 We've had hemispheric ectomies before.
00:33:18.160 Yeah.
00:33:19.040 Also, so it's unclear that any specific part is going to disconnect the soul.
00:33:27.560 But, um, we also know that people who lose their limbs
00:33:34.140 have the phantom limb thing, where they still feel the limb.
00:33:37.840 So there might be, like, I don't know, there might be, in the soul, a sense in which the whole
00:33:42.980 body might still be there, and we can still feel it as though it was still there.
00:33:46.880 So it might not be relevant, but there might be a question of connection,
00:33:52.680 where if you replace enough of it, the identity is going to change, Ship of
00:33:57.540 Theseus and all that.
00:33:58.420 Yeah.
00:33:58.960 But we don't know.
00:34:00.620 We'll have to experiment with that, but maybe we shouldn't even try.
00:34:04.160 Well, yeah, you're not an avatar.
00:34:06.200 You are the sum of your parts. It's not like you're an operating
00:34:12.140 system running on a computer.
00:34:14.280 You're more than that.
00:34:15.240 And your, your parts are indeed part of your whole.
00:34:18.080 Now, correct me if I'm wrong again, but I want to add that there have actually
00:34:24.460 been personality changes in people who, for example, have had to
00:34:29.440 replace a liver or kidneys, that kind of stuff.
00:34:32.820 Um, have you heard that as well?
00:34:35.480 Because that's something I recalled a while ago.
00:34:37.940 Um, you're saying that people who got a replacement liver have personality changes?
00:34:43.280 Is that what you're saying?
00:34:44.160 I have a, I have a memory of hearing that at least.
00:34:46.940 Um, I have a memory of seeing that in movies and shows, but I don't remember seeing that
00:34:51.660 in real life.
00:34:52.460 It might be true.
00:34:53.360 I wouldn't really be surprised.
00:34:55.240 It's a bit fuzzy.
00:34:55.820 I haven't heard of it.
00:34:57.820 Yeah.
00:34:58.040 Yeah.
00:34:58.380 I agree.
00:34:58.680 Bizarre there.
00:34:59.600 Yeah.
00:35:01.480 And so, to get back to it.
00:35:05.280 This is what it comes down to: we run into
00:35:12.120 very dangerous, incredibly dangerous territory when we begin to chop off
00:35:21.000 parts of the human form
00:35:22.940 and we begin to say, this part is not essential to human identity.
00:35:26.460 Now these questions are raised naturally enough.
00:35:30.180 And, as Rude brought up, you know, people lose limbs and so on.
00:35:33.280 And, you know, we have to, um, you know, prosthetics have existed for thousands and thousands
00:35:39.320 of years for just for this reason.
00:35:41.620 Right.
00:35:42.180 But this is always understood in the context of something depriving
00:35:48.360 one of the fullness of their organic being.
00:35:50.900 Right, right.
00:35:53.680 Because we've never hit the point where the replacement was better in whatever way than
00:35:58.820 the original.
00:35:59.700 So we've never had to ask that question.
00:36:01.860 It was to make you whole once again; it was never to improve, to augment, never the idea
00:36:07.680 of cutting off your nose, not to spite your face, but to improve it.
00:36:12.600 We've never, we've never operated off of that.
00:36:14.780 Nobody would willingly cut off his hand to take a hook.
00:36:18.200 Exactly.
00:36:18.940 Exactly.
00:36:19.380 And so we're, you know, we're dealing with it.
00:36:21.760 And that's the thing: the technology is secondary to the philosophical application
00:36:26.660 of it.
00:36:27.000 Right.
00:36:27.540 This is what it always comes down to: the technology is fundamentally a
00:36:32.340 neutral thing.
00:36:33.980 Advanced prosthetics obviously have enormous capability to aid people who
00:36:40.880 have lost limbs, even, with artificial wombs you could say, in instances where
00:36:45.340 there's risk of miscarriage and premature birth and so on.
00:36:49.380 Right.
00:36:50.100 To fulfill the promise of human life that is cut short by the corruption that's endemic
00:36:55.920 to our existence.
00:36:56.780 But that ability comes from power, which can obviously be misused.
00:37:05.400 And anybody who's listening to this podcast, um, you know, probably does not trust those
00:37:10.560 who are in charge of, uh, Western civilization who are in charge of the, uh, sectors which
00:37:16.100 are developing this high technology.
00:37:17.580 The men who make the decisions there do not have our best interests in mind and certainly
00:37:21.900 have no serious conception of ethics.
00:37:24.700 All right.
00:37:25.640 But the only problem I can see with this whole idea,
00:37:30.940 and I agree with your point that technology in and of itself is neutral and all that kind
00:37:34.840 of stuff, is this: at what point is a temptation in its very nature too large for man to handle?
00:37:40.860 Yeah, that's a good question, but that's a
00:37:47.680 very, very large question, David. You know, it's difficult to just drop that on
00:37:51.480 the discussion table.
00:37:53.000 Yes, because that, that's, um, it doesn't, you know, it doesn't really matter.
00:37:55.840 We have to deal with what we're given.
00:37:57.100 You know, you can make the argument that nuclear weapons are too large a temptation for man to
00:38:00.260 be able to handle, but, you know, uh, assuming that they're not just a psyop, um, we have
00:38:05.700 to deal with their existence.
00:38:06.760 Yes, if there were permanent human answers, then we wouldn't actually require
00:38:11.060 more conciliar accounting, conciliar being a useful word here in the sense of conciliar
00:38:17.940 understandings of Christianity.
00:38:20.160 We wouldn't need more councils to understand things if we were granted sufficient understanding
00:38:25.160 of all things that were going to come from the point of our, our birth and to the end when
00:38:34.020 Christ comes back.
00:38:34.840 Like, if we knew all of these things, we wouldn't need to deliberate on them and consider them
00:38:39.840 carefully as they come.
00:38:43.180 Did you want to make a comment, Zag?
00:38:46.120 Um, yeah.
00:38:47.580 I mean, I personally think our current level of power is just too much for
00:38:55.000 what we can really handle, in many ways.
00:38:59.600 Um, the only motivation that I can think of that, that makes sense is to, you know, go
00:39:07.320 to other planets, basically to explore the universe and we're, we're not there yet, but
00:39:12.920 if we're going to just stay on earth, um, yeah, I, I already think that all of the computers
00:39:19.500 and all of the things that.
00:39:25.420 You, uh, you just cut out there.
00:39:26.520 Do you want to just repeat your last bit after "computers"?
00:39:35.960 Well, anyway, we can't, uh, you said cut out, Zagar.
00:39:38.660 So when you come back, hopefully, um, we'll be able to hear you.
00:39:41.800 No, I mean, I think, I think you make a, you make a good point.
00:39:44.800 Yeah, I'm right now to be sustainable.
00:39:46.600 Sorry, can you hear us, uh, can you hear us, Zagar?
00:39:49.560 Zagar, hello, can you hear us?
00:39:51.280 Humanity's stable.
00:39:52.820 I'm, what?
00:39:53.580 What did you say?
00:39:53.960 Sorry, you just, you just come, you just cut out for about 30 seconds there.
00:39:57.960 Oh, damn.
00:39:58.820 It's okay.
00:39:59.240 Do you want to resume what you were saying after computers?
00:40:02.600 Yeah, I said that with computers and all the machines that we have right now, I
00:40:08.480 mean, we're already far beyond the level where our technology is sustainable,
00:40:13.720 because basically technology makes our lives easier, more comfortable, and that makes us
00:40:18.180 softer and weaker, and it leads to the destruction of civilization.
00:40:22.700 And, you know, even going back thousands of years, um, you had Romans or Greeks who were
00:40:28.980 worried that their current level of comfort was, um, unsustainable, that they were worried
00:40:33.620 that their civilization was going to collapse because of it, and it did.
00:40:36.520 So now with all of our machines, I mean, we're in a very dangerous place.
00:40:41.920 You see what I mean?
00:40:42.780 Like, I mean, if you look at intelligence (well, not IQ, but just
00:40:49.240 the ability to memorize things and knowledge of the past), our ancestors were
00:40:54.880 much better than us, because books were not easy to get and people had
00:41:00.560 to memorize large amounts of information.
00:41:02.420 And now, with the internet and with the, you know, very easy availability of information
00:41:08.240 technology, we don't have to memorize a lot of things.
00:41:12.200 So most people don't know anything about anything.
00:41:15.220 We just look it up.
00:41:16.200 Me included.
00:41:17.840 I mean, I'm not blaming people.
00:41:19.160 It's just the fact that we've replaced large parts of our minds with prosthetics: prosthetic
00:41:25.240 knowledge, prosthetic memory, prosthetic understanding of our history and our connection to
00:41:32.120 other people. It's all prosthetic.
00:41:34.000 And that means that key parts of the human being are already atrophied because we don't
00:41:41.080 use them anymore.
00:41:42.700 Yeah, exactly.
00:41:43.860 There was even this Roman senator who basically said we should spare Carthage,
00:41:49.240 because if we eradicate Carthage, we'll be too comfortable.
00:41:53.820 But the whole thing about going to other planets and that kind of stuff: if the current level
00:41:59.620 of technology is in itself unsustainable, then how does it get better to just
00:42:06.920 add to the momentum, so to speak?
00:42:09.000 Well, I'm saying that if there's a grand goal like that, then we could return to hardship,
00:42:14.480 basically, because then we could say we need to sacrifice everything.
00:42:18.080 We need to work extremely hard and like put those technologies to practical use in order
00:42:23.840 to reach that goal.
00:42:25.340 Uh, right now we have all this technology and it's going nowhere.
00:42:28.900 So all the technology is used to just make our lives more comfortable.
00:42:32.740 Uh, if we have an, uh, an external goal, like a greater goal, then the technology can be put
00:42:38.480 to a use outside of human comfort and we can, we can make people, um, uncomfortable again,
00:42:44.800 let's say that's, that's what I'm saying.
00:42:46.520 As long as there's an external goal, it changes things in a way that we don't have at all right
00:42:51.700 now.
00:42:52.020 So yeah, indeed, indeed.
00:42:56.220 Now, think of those who left the shores and left all comfort. Actually, I'll tie this back
00:43:02.780 to my own experience: I spent most of my youth running extracurricularly as a sport.
00:43:07.960 And when you run, after a certain amount of distance, your brain starts screaming at you
00:43:12.620 that you are dying and you need to stop.
00:43:15.620 And in a certain way, it's suffering, and learning to suffer, that teaches you to persevere.
00:43:20.840 And the people who have actually fasted for serious amounts of time will understand the
00:43:26.460 same sensation where your body starts actually rebelling against you.
00:43:29.640 And this is actually the, the holy form of transhumanism.
00:43:32.540 This is conquering your own body through willpower, not conquering your own body through
00:43:38.980 technology.
00:43:40.080 Right.
00:43:42.080 Well, and ultimately these practices make you more human, because they subordinate your
00:43:47.220 physical faculties to the spiritual, and the spiritual is higher than the physical in
00:43:52.480 terms of hierarchy.
00:43:54.000 It is superior to it, in a capital-S sense.
00:43:59.000 Now, of course, that superiority and hierarchy does
00:44:03.600 not necessarily imply uselessness, or that the inferior party, the one that is commanded
00:44:11.140 by the superior party, should cease to exist or should strive to be other than
00:44:15.760 what it is.
00:44:16.380 It exists for its purpose;
00:44:24.000 it has its own purpose for a reason, is what I'm trying to say.
00:44:27.880 And so it does not, it doesn't need to be eliminated.
00:44:30.040 And that's what, in that sense, call it Christian
00:44:33.480 transhumanism if you want, these sorts of ascetical practices are designed to do.
00:44:38.380 They're designed to make us more human, because our personalities take greater
00:44:43.820 control over our irrational passions. What it really comes down to is that the
00:44:49.520 nature of living in this world is that there's corruption and we are
00:44:53.860 led astray by, uh, vices, basically, you know, lust for power, pride, gluttony, lust, right?
00:45:00.320 Any of these things we, we, if we don't discipline ourselves, discipline our souls, we become servants
00:45:06.020 to them.
00:45:06.760 You know, somebody with 200 IQ and an addiction to heroin is just a 200 IQ servant of heroin.
00:45:14.400 Their whole will exists to satisfy this passion.
00:45:17.880 And so this is what that mortification, the imposing of the spirit over the body,
00:45:23.640 is about.
00:45:24.260 It's about steeling and disciplining ourselves so that we can live life in a rational, ordered
00:45:29.060 way that brings us, you know, ultimately the greatest happiness, uh, because we're able
00:45:33.840 to pursue what really matters and really satisfies over the long run, eternal life and all that.
00:45:39.660 And that's, that's essentially the, the essence of Christianity, right?
00:45:43.120 I mean, your free will exists to choose what you're supposed to choose.
00:45:47.180 Yes, exactly.
00:45:52.100 So, yeah, this is what it comes down to: a lot of
00:45:57.700 the post-Enlightenment ideologies that we so vehemently oppose are based on this
00:46:02.940 faulty conception of human nature.
00:46:05.740 Um, you know, enlightenment ideology is based on this tabula rasa, right?
00:46:10.480 That human beings are a blank slate and that their environment can inculcate whatever properties
00:46:14.920 they want in them.
00:46:15.700 We know that this is bullshit, but even communism, Marxism rather,
00:46:21.100 and many of these other similar ideologies are based on this presumption.
00:46:26.920 They don't give recognition to the human person and to a certain
00:46:35.460 set of attributes which are inherent to it, a norm or an ideal of human existence.
00:46:42.860 And so this is why they can be used to legitimize such incredible evil: because
00:46:50.220 they exclude any conception of human anthropological good.
00:46:56.200 And so if human nature is not fixed, if we're not the way we are because
00:47:01.000 of some real, necessary, good reason, if we are human beings because, you
00:47:09.280 know, of a process of flux or random environmental conditioning,
00:47:18.240 of behavioral training, then ultimately there's no real meaning to that programming or that
00:47:23.720 way in which we operate. Human families, civilization, life, flourishing, eudaimonia, right?
00:47:31.040 These don't mean anything.
00:47:31.960 Yeah.
00:47:32.840 These are nominal.
00:47:34.280 These are just the words that we give to pleasurable outputs
00:47:41.920 of human activity.
00:47:43.560 And, you know, that's ultimately, this is basically what Marxism is about, right?
00:47:47.580 It wants to create this kind of Star Trek style, you know, post-scarcity future where
00:47:51.520 everybody can, you know, um, go on house.
00:47:55.440 There is something quite freaky which I
00:48:03.460 read a while ago, somewhat related to this.
00:48:05.440 I talked about it in another episode.
00:48:07.460 It's this concept of rational egoism, which was found among the Russian nihilists
00:48:16.600 in the 1800s, right?
00:48:22.420 It's all meaninglessness, right?
00:48:23.640 That everything that exists is basically your will to, to, to power essentially.
00:48:29.400 And the, the best thing you can do, the, the thing that's in your actual interest as an
00:48:36.360 egoist is to try to improve the life of those around you, right?
00:48:41.840 Now, this doesn't mean that you should give them some sort of existential meaning.
00:48:46.380 It means that you should, you know, provide for them.
00:48:49.580 And what you get, you get this, this sense of power over them, right?
00:48:53.640 You're over them.
00:48:55.060 You're their God, right?
00:48:57.280 Because if God is dead, you're their God.
00:49:00.300 And this kind of ties back to one of the parts in The Brothers Karamazov, the
00:49:05.740 Grand Inquisitor.
00:49:06.480 It's one of the most famous parts, I've heard.
00:49:08.500 It's the best chapter.
00:49:09.720 Yeah, it is.
00:49:10.180 It is.
00:49:10.920 It's where this Grand Inquisitor guy paints this picture, right,
00:49:15.320 of this small elite, right,
00:49:16.760 of essentially rational egoists, you could say. He says that they are suffering,
00:49:21.260 but I don't really believe him.
00:49:22.860 Um, and they are basically fooling the people all around them and they're saying, it's all
00:49:29.140 fine.
00:49:29.640 There is no God.
00:49:30.600 You can do whatever you want.
00:49:32.480 Essentially, we allow you to sin, right?
00:49:35.700 And these people love them because they allow them to sin.
00:49:39.720 And there is another part where they take this scripture passage
00:49:45.160 about how you wouldn't give stones to your son if he asked for bread, right?
00:49:49.180 And the Grand Inquisitor, he and his cause, so to speak: to those who want the spiritual
00:49:56.760 bread, that is God, Christ, meaning in life, the church, you know, the Eucharist, the body of Christ,
00:50:03.320 to them, they give the stones, right?
00:50:05.280 They give the physical stones to those people.
00:50:07.260 And to the people who want the physical bread, well, they provide bread for them.
00:50:12.020 Those who asked for the spiritual bread are given the stones instead.
00:50:16.260 And you have this sort of anti-Christian utopia, no God, no existential meaning, just a sort
00:50:23.620 of materialistic, intellectualist, mechanized, rational egoism.
00:50:28.760 God is dead, and you're the God now.
00:50:32.300 The only grand architect that exists is man himself.
00:50:36.400 Perhaps you had a better translation than I did.
00:50:38.840 Um, I, I read the Brothers Karamazov in college, and of course, this was a parable told to the,
00:50:46.260 the youngest brother, Alyosha, who was a, a Orthodox monk by his, um, older and, um, atheist
00:50:55.660 brother, Ivan.
00:50:57.380 Dimitri was the middle brother, I believe, and the troublemaker, but, um.
00:51:01.540 I think so.
00:51:02.320 And the premise of this was Christ's three trials, which we're a little bit
00:51:08.120 late to discuss in a timely way.
00:51:09.600 But, um, the, the basic premise is, the devil offered Christ the means to save humanity through
00:51:15.160 his own power, to which Christ said, well, if I did that, they could not be saved, because
00:51:21.260 they would not have understood anything I've done for them.
00:51:24.040 And you can reread this; this passage has been translated by better
00:51:29.280 people than, um, Kolevsky or Kolosevsky.
00:51:31.600 I don't know; the translations are very problematic and very difficult, but
00:51:37.600 you'll understand when you read them.
00:51:40.600 And the, the idea behind it is, what, what profit do you have for understanding what is
00:51:47.420 going on if you do not act righteously?
00:51:49.340 And I say this because the understanding of Christ was: if I disobey God, I am being
00:51:58.640 offered the ability by the devil to change the world for the better.
00:52:02.160 If I defy what is right in terms of utilitarian servitude, I can make the world a
00:52:08.980 better place; or I can continue with the plan of my Father and further this plan, in the sense
00:52:15.240 that people will suffer, everyone will suffer, but they will suffer for a reason.
00:52:21.400 And there will be actual salvation at the end of this suffering.
00:52:24.740 There will be pain, but there will be actual, actual salvation.
00:52:29.440 And the decision came down to Christ saying: yes, even though I see all these things, even
00:52:37.840 though I see the legions of people who will march to hell.
00:52:40.520 I will do this because this is the only rational way and all other things are lies.
00:52:45.980 I seek only truth and all compromises are lies.
00:52:50.400 Yeah, exactly.
00:52:51.900 Plus there, there is another kind of interesting thing as well.
00:52:55.260 We're getting a slight sidetrack, but I hope it's okay.
00:52:58.940 It's all about this, you know, this Christ coming back to the world, right?
00:53:03.960 And Christ, you know, he's suffering again.
00:53:09.100 And this Grand Inquisitor guy, he's kind of acting like the Pharisees, right?
00:53:13.720 He's pointing at all the good things they've done.
00:53:15.700 Well, good in quotation marks.
00:53:17.340 And you kind of explained the mentality behind that thing, Rude.
00:53:21.320 But the thing is here that they are merely, they are merely using this sort of rational
00:53:26.560 egoism, right?
00:53:27.060 They detach God completely.
00:53:29.400 And when they do that, this egoistic system is all that makes sense to them.
00:53:33.340 I mean, if you read, I think it's the Phaedo, or maybe it's the Symposium.
00:53:41.240 You get this, this idea, right?
00:53:44.180 That beauty, in and of itself, you can see it with your eye, right?
00:53:48.520 You don't need to explain it.
00:53:49.500 You can see it because the object you are gazing at is filled with this beauty, which
00:53:54.420 you cannot make sense of with your brain, right?
00:53:57.460 You can just see it with your inner eye.
00:53:59.400 Right?
00:54:00.020 And this is kind of how you have to act toward reality, right?
00:54:04.860 You can't just, with your mind, make intellectual sense of it and give it an existential meaning,
00:54:10.120 like, I don't know, some Gnostic thing, I suppose.
00:54:13.620 You just have to obey it.
00:54:15.780 You need to humble yourself to the mystery.
00:54:17.500 And if you don't do that, well, then essentially all you have is this sort of will to power.
00:54:22.860 And Dr. Johnson has given numerous talks about this topic, and he's well worth reading too.
00:54:29.580 Precisely.
00:54:30.020 And if I could conclude with one thing, if you remember how the parable ends, it ends...
00:54:34.860 I'm sorry, we cut a little bit there, but we're going to go to a break now, and when we come
00:54:41.080 back, we're going to discuss the spiritual element of transhumanism and the theological
00:54:46.780 underpinnings that support it and cover Kali Yugen News.
00:54:50.460 So stay tuned, listeners.
00:55:20.460 [Break music: a Swedish song in which a king addresses his assembled officers, asking who is for and who is against fighting under him with their blood; its refrain is "One word, written in blood: Revolution!"]
00:59:10.100 Welcome back from the break
00:59:20.880 This is Mysterium Fasces episode 25, Transhumanism, part 2
00:59:26.500 Before the break, we, I think, laid the foundations of what we're talking about
00:59:31.500 Touched on a lot of different areas
00:59:32.760 And we discussed why ideologically, politically, transhumanism is an issue for us
00:59:38.000 Because it alters the core of human anthropology
00:59:40.240 And this alteration ultimately always has cascading worldview effects
00:59:46.020 It cannot help but be ideological
00:59:48.320 Because it ultimately will determine the way that human beings interact with one another
00:59:53.140 On an individual scale
00:59:55.520 Interact with themselves
00:59:56.940 And then interact on a social scale
00:59:58.840 It determines human organization
01:00:00.320 Now, as you might expect on Mysterium Fasces
01:00:07.600 You know, we try
01:00:08.720 We get pretty esoteric and autistic with our analysis
01:00:12.100 We try to get deep into the spiritual meaning
01:00:14.740 Behind certain movements
01:00:17.480 Political messages
01:00:19.960 Etc.
01:00:21.380 Because ultimately, you know, human beings are, as we've said, a holistic entity
01:00:25.060 You can't separate your ideology from your theology
01:00:29.180 Or vice versa
01:00:30.340 These are going to have effects on one another
01:00:32.140 Whether you like it or not
01:00:33.400 And one of the interesting kind of characteristics of modern ideology
01:00:38.320 Is that many modern ideologies come prepackaged with metaphysical frameworks
01:00:44.560 Built right into them
01:00:45.820 You know, philosophical assertions about the nature of reality
01:00:52.220 Human beings, their origins, and their final destination
01:00:55.340 And transhumanism is kind of no different
01:00:57.720 It is a sort of materialist mythology
01:01:03.300 We're going to get into that in a minute
01:01:04.940 But it is a profoundly, profoundly, spiritually
01:01:11.840 I guess you could say implicit philosophy
01:01:16.240 There are so many different aspects of the theological life
01:01:22.260 That it affects without even being conscious of it
01:01:26.460 So, we talked before about what the core
01:01:30.960 What I believe is the core theological underpinning of transhumanism
01:01:35.980 That's Gnosticism
01:01:36.980 Right?
01:01:37.980 So for those of you not familiar
01:01:38.900 Gnosticism was considered to be the first of the Christian heresies
01:01:42.200 Legendarily said to have been pioneered by Simon Magus from the Book of Acts
01:01:48.600 But it essentially was a variety of eclectic beliefs
01:01:52.900 Promoted by various different first and second century heretical Christian teachers
01:02:00.600 But they have certain commonalities
01:02:03.400 And one of the big commonalities was there was this animosity between spirit and matter
01:02:07.620 That they were opposed to one another
01:02:08.980 This dialectical tension
01:02:11.420 And that human beings
01:02:13.280 And there were various schemes
01:02:14.840 But typically the material world was bad
01:02:17.020 It was created by an evil god
01:02:18.380 Human beings are spirits trapped in an evil body
01:02:22.440 And ultimately the human identity resides solely in our spiritual component
01:02:27.960 And the physical body is at best an impediment to achieving our true spiritual gnosis
01:02:35.240 Transcendence
01:02:36.520 Or at worst is an evil
01:02:39.140 It's a parasite
01:02:39.800 It has to be done away with at all costs
01:02:44.120 And so what happens
01:02:45.620 Is when the human being
01:02:47.800 No longer has this principle of habeas corpus applied to him
01:02:50.700 That he is his body
01:02:51.840 That that's an inextricable part of his own identity
01:02:54.440 That it is this non-local
01:02:59.940 Basically the ego
01:03:01.400 The ego becomes the only
01:03:04.060 The non-physical ego becomes the only determination of identity
01:03:08.320 Is that means
01:03:09.960 That
01:03:11.500 That means basically anything goes
01:03:14.020 It reduces very very quickly like to sterner
01:03:16.500 Max sterner style
01:03:17.620 Egoism
01:03:18.160 Moral nihilism
01:03:19.060 You know what
01:03:21.720 Nothing is as agnostic as sodomy and suicide
01:03:25.020 Right
01:03:27.100 Well yeah
01:03:28.220 Just
01:03:29.100 We've got to try to keep it a little bit focused here
01:03:31.460 And not get into
01:03:32.320 All of these different tangents
01:03:34.940 We had to kind of address this one at a time
01:03:36.480 But that's what happens
01:03:37.740 Is when you make this bifurcation
01:03:39.680 Between the spiritual and the physical reality
01:03:42.620 It means that whatever you do with your body
01:03:45.460 Your ethical actions
01:03:46.660 As a result of your energies in the material world
01:03:49.780 Typically speaking
01:03:51.120 Don't have any moral weight
01:03:53.500 And furthermore that
01:03:55.140 You can do whatever you want to yourself
01:03:57.460 And if you yourself have no core human nature
01:04:01.940 Then other people don't either
01:04:03.660 They're perfectly malleable
01:04:05.660 So these are all of the implications
01:04:08.720 So I'm interested in Hans, Rude, Ziger as well
01:04:12.140 You know what your thoughts are on Gnosticism
01:04:13.640 And kind of how that
01:04:14.740 That ties into this
01:04:16.260 Because I think that this is really
01:04:17.560 At the core of a lot of
01:04:19.460 Post-Enlightenment
01:04:20.900 Philosophy
01:04:22.340 That's just kind of left unexamined
01:04:24.240 Well we mentioned it before
01:04:26.060 Are you a machine
01:04:27.260 That your spirit inhabits
01:04:28.980 If you think this is true
01:04:31.140 Then you are
01:04:32.600 With agnostics
01:04:33.700 You believe something that has not
01:04:35.660 Been Christian tradition
01:04:37.420 For X amount of time
01:04:38.860 You are
01:04:39.300 You are the sum of your parts
01:04:40.440 And your spirit
01:04:41.220 You are
01:04:41.740 Something that is a
01:04:43.420 It is a mystery of parts
01:04:47.240 And sum
01:04:48.800 It is something that cannot be
01:04:50.620 Adequately described
01:04:51.700 In mathematical terms
01:04:53.380 The spark of life
01:04:55.120 Is not something that can be
01:04:56.660 Can be given
01:04:57.840 Or taken away
01:04:58.860 By mathematics
01:04:59.680 Medicine
01:05:00.220 Or anything like that
01:05:01.160 It is something that transcends
01:05:03.000 All of these things
01:05:03.720 You
01:05:03.960 You can't measure
01:05:05.340 Measure human life
01:05:06.900 You never can
01:05:07.760 You never will
01:05:08.260 The electrical impulse
01:05:09.260 Isn't going on in your brain
01:05:10.320 We see those
01:05:11.160 Brain dead
01:05:13.140 We
01:05:13.380 We see them
01:05:14.040 All the time
01:05:14.940 The
01:05:16.060 The basic idea
01:05:17.420 Of our existence
01:05:18.820 Is
01:05:19.200 You are a sum
01:05:20.440 Of everything you are
01:05:21.420 And
01:05:21.800 We
01:05:22.440 It all shows up
01:05:23.980 The phantom limbs
01:05:24.900 The
01:05:25.320 Desire for
01:05:27.120 Completion
01:05:28.620 And it
01:05:29.220 It always
01:05:30.080 Always makes itself manifest
01:05:31.560 There is no way to escape
01:05:32.500 You are
01:05:34.440 All that which is you
01:05:35.820 And that is a
01:05:37.080 It is a tautology
01:05:38.160 But it is a true one
01:05:39.120 Yeah
01:05:39.980 I mean
01:05:41.060 You can even see that
01:05:42.580 In people who try to be
01:05:43.960 Super Gnostic
01:05:44.580 And try to escape it
01:05:45.500 Right
01:05:45.700 I mean
01:05:46.540 Take
01:05:46.820 Take
01:05:47.220 Take
01:05:48.140 Sodomites
01:05:48.720 And transsexuals
01:05:49.640 Especially as an example
01:05:50.580 Right
01:05:50.820 Just by the fact
01:05:53.020 That they get so easily
01:05:54.060 Conquered
01:05:54.940 Triggered
01:05:56.220 Proves that they
01:05:57.020 Instinctively realize
01:05:58.220 That what they are doing
01:05:59.180 Is wrong
01:06:00.260 Right
01:06:00.640 And
01:06:01.820 Essentially
01:06:02.920 It is all
01:06:03.700 Gnostic
01:06:04.100 Fagratory
01:06:04.760 Yeah
01:06:08.980 I get
01:06:09.360 Basically what I just
01:06:10.220 Wanted to get at
01:06:10.980 It is that
01:06:11.380 That's what
01:06:11.940 That's what's going on here
01:06:13.240 And
01:06:13.960 Where
01:06:14.740 With Gnosticism
01:06:15.760 It would be the application
01:06:17.060 Of
01:06:17.720 A specialized
01:06:19.840 Subset
01:06:20.820 Of spiritual
01:06:21.760 Knowledge
01:06:22.300 That would allow
01:06:23.080 One to
01:06:23.900 Gain ascension
01:06:25.580 Transcendence
01:06:26.420 Detach from their
01:06:27.720 Physical body
01:06:28.620 Escape the corruption
01:06:30.080 And death of the
01:06:30.760 Physical world
01:06:31.460 What transhumanism
01:06:32.820 Offers instead
01:06:33.540 Is a mechanical
01:06:34.700 Solution
01:06:35.160 A material solution
01:06:36.060 Instead of some sort of
01:06:37.420 Hidden spiritual
01:06:38.320 Knowledge
01:06:38.880 One needs
01:06:40.000 The hidden
01:06:40.740 Esoteric
01:06:41.600 Technical knowledge
01:06:43.360 Then they need
01:06:44.480 The knowledge
01:06:45.300 Of high technology
01:06:46.120 In order to
01:06:47.500 Construct an
01:06:48.320 Artificial
01:06:48.920 Shell
01:06:50.520 Or
01:06:51.340 To implement
01:06:52.400 One's consciousness
01:06:53.100 Then or to
01:06:54.020 Augment one's
01:06:55.140 Own body
01:06:55.960 As if it were
01:06:56.660 A shell
01:06:57.000 In such a way
01:06:57.940 As to overcome
01:06:58.560 The corruption
01:06:59.220 That's inherent
01:06:59.720 To the human form
01:07:00.600 Going back
01:07:01.980 To the definition
01:07:02.780 That we gave
01:07:03.180 At the beginning
01:07:03.560 Of the episode
01:07:04.080 It's a philosophy
01:07:04.920 That explores
01:07:05.700 Human transcendence
01:07:06.660 Above or beyond
01:07:07.460 Organic corporeal
01:07:08.620 Limitations
01:07:09.080 Through technological
01:07:10.300 And philosophical
01:07:11.100 Evolution
01:07:11.700 So that's what
01:07:13.100 This is about
01:07:13.700 In a way
01:07:14.600 There's something
01:07:15.260 Go ahead
01:07:16.780 Yeah
01:07:17.340 In a way
01:07:18.260 I think it's
01:07:19.120 Both Gnosticism
01:07:20.160 And then
01:07:20.520 Modern
01:07:21.200 You know
01:07:21.980 This gender
01:07:22.900 Bending science
01:07:23.580 Metality
01:07:24.040 And this
01:07:24.340 Transhumanism
01:07:25.020 Metality
01:07:25.440 I think it's
01:07:26.840 Both essentially
01:07:27.460 The same thing
01:07:28.200 Godless science
01:07:29.000 If I'm going to
01:07:29.840 Quote Spengler
01:07:30.380 Again a bit
01:07:30.880 Spengler
01:07:32.620 Viewed the
01:07:33.640 Civilization
01:07:34.120 That produced
01:07:34.720 All these sects
01:07:35.840 You know
01:07:36.020 Gnostic sects
01:07:36.920 And even the
01:07:37.440 Civilization
01:07:37.780 Who cannot
01:07:38.300 Kick off
01:07:38.720 Christian
01:07:39.020 They in their
01:07:41.760 Very nature
01:07:42.180 Had another
01:07:42.640 Mentality
01:07:43.160 So they saw
01:07:43.920 It as a
01:07:44.360 Very
01:07:44.540 How should
01:07:44.880 I put it
01:07:45.340 Mysterious
01:07:46.660 And metaphysical
01:07:47.420 Mentality
01:07:48.040 So their
01:07:49.000 Science
01:07:49.520 Would in
01:07:50.100 Its own
01:07:50.840 Nature
01:07:51.300 Be more
01:07:52.500 Oriented
01:07:53.040 Towards the
01:07:53.940 Mystical
01:07:54.480 And the
01:07:55.100 Metaphysical
01:07:55.600 Right
01:07:55.820 So you get
01:07:56.200 Alchemy
01:07:56.720 And all this
01:07:57.180 Kind of stuff
01:07:57.620 Right
01:07:57.900 Whereas the
01:07:59.300 Faustian civilization
01:08:00.060 That is our
01:08:00.580 Civilization
01:08:01.060 Would be
01:08:02.520 More oriented
01:08:03.180 Towards the
01:08:03.700 Practical
01:08:04.140 Right
01:08:04.440 So what we
01:08:06.080 See in both
01:08:07.000 This sort of
01:08:09.300 Gnosticism
01:08:09.940 And in modern
01:08:11.120 Science
01:08:11.740 Is an
01:08:13.360 Understanding
01:08:14.160 Maybe flawed
01:08:17.020 Or not
01:08:17.460 Of the
01:08:18.420 World
01:08:19.000 Without
01:08:20.680 An absolute
01:08:22.040 Meaning
01:08:22.420 Without God
01:08:23.280 Right
01:08:23.920 And if you
01:08:24.600 Detach
01:08:25.200 The understanding
01:08:26.360 Of the
01:08:27.200 Creation
01:08:27.600 Of the
01:08:28.140 World
01:08:28.400 Of the
01:08:28.700 Natural
01:08:28.900 World
01:08:29.320 From God
01:08:30.740 Which gives
01:08:31.300 Its
01:08:31.600 Existential
01:08:33.000 Meaning
01:08:33.320 So to
01:08:33.620 Speak
01:08:33.900 Then all
01:08:35.360 You get
01:08:35.780 Is this
01:08:36.420 And it
01:08:36.740 Doesn't
01:08:37.160 Matter
01:08:37.660 If we
01:08:38.140 Call it
01:08:38.420 Gnosticism
01:08:39.020 If we
01:08:39.300 Call it
01:08:39.740 Scientism
01:08:41.080 If we
01:08:41.400 Call it
01:08:41.740 Astrology
01:08:42.380 If we
01:08:42.620 Call it
01:08:42.940 Alchemy
01:08:43.340 If we
01:08:43.620 Call it
01:08:44.140 Voodoo
01:08:45.640 Magic
01:08:45.980 It doesn't
01:08:46.420 Matter
01:08:46.600 What you
01:08:46.920 Call it
01:08:47.320 If this
01:08:48.540 Is what
01:08:48.840 You're
01:08:49.020 Doing
01:08:49.280 With it
01:08:49.800 Regardless
01:08:50.560 If it's
01:08:51.420 True or
01:08:51.780 Not
01:08:52.080 Then it
01:08:53.740 Is always
01:08:54.620 Without
01:08:55.080 Exception
01:08:55.660 Destructive
01:08:56.540 Florian
01:08:59.000 Maybe you
01:08:59.360 Can
01:08:59.560 Help me
01:09:00.140 Out
01:09:00.320 Here
01:09:00.560 I
01:09:00.740 Don't
01:09:01.120 Understand
01:09:01.720 If
01:09:02.260 The
01:09:04.040 Gnostics
01:09:04.680 Believe
01:09:05.080 That the
01:09:05.480 World
01:09:05.760 Is evil
01:09:06.260 And the
01:09:06.660 Body
01:09:06.900 Is evil
01:09:07.420 But the
01:09:08.080 Spirit
01:09:08.340 Is good
01:09:08.880 Why don't
01:09:10.240 They kill
01:09:10.640 Themselves
01:09:11.240 What's the
01:09:11.800 What's the
01:09:12.280 Like
01:09:13.160 What's the
01:09:13.980 Ideology
01:09:14.660 Or the
01:09:15.300 Method
01:09:15.800 Why is
01:09:17.340 The
01:09:17.540 Spiritual
01:09:18.320 Some of
01:09:19.160 Them did
01:09:19.440 Yes
01:09:19.780 Nothing
01:09:21.000 Is just
01:09:21.340 Not
01:09:21.540 To be
01:09:21.780 Suicide
01:09:22.160 But
01:09:22.360 Many
01:09:22.780 Did
01:09:22.980 This is
01:09:24.760 An easy
01:09:25.540 Critique
01:09:26.140 Different
01:09:26.860 Teachers
01:09:27.240 Had
01:09:27.500 Different
01:09:27.780 Responses
01:09:28.300 To this
01:09:28.600 Question
01:09:29.000 Some
01:09:30.100 Would
01:09:30.320 Say
01:09:30.640 That
01:09:30.900 This
01:09:31.100 Is
01:09:31.280 Part
01:09:32.240 Of
01:09:32.720 The
01:09:33.480 Suffering
01:09:34.000 That
01:09:34.160 We
01:09:34.280 Undergo
01:09:35.300 As
01:09:35.540 Souls
01:09:35.760 Trapped
01:09:36.020 In
01:09:36.100 This
01:09:36.220 Prison
01:09:36.460 Is
01:09:36.640 Part
01:09:36.840 Of
01:09:36.960 To
01:09:38.460 Strengthen
01:09:38.920 Us
01:09:39.120 To
01:09:39.520 Prepare
01:09:39.760 Us
01:09:40.020 To
01:09:40.140 Become
01:09:41.200 Gods
01:09:42.080 When
01:09:42.740 We're
01:09:42.940 Reunited
01:09:43.500 With
01:09:43.800 The
01:09:44.300 Monadic
01:09:44.840 Energies
01:09:45.780 And
01:09:46.140 Our
01:09:46.780 Soul
01:09:47.160 Is
01:09:47.460 Reaminated
01:09:48.140 From
01:09:48.640 The
01:09:51.540 One
01:09:52.000 In
01:09:52.380 A
01:09:52.560 Glorified
01:09:53.160 Or
01:09:53.300 Givonized
01:09:53.660 State
01:09:53.980 There's
01:09:54.480 That
01:09:54.760 There's
01:09:55.560 Different
01:09:55.820 Other
01:09:56.220 Explanations
01:09:59.920 That
01:10:00.100 Are
01:10:00.180 Given
01:10:00.460 It's
01:10:02.660 A
01:10:03.000 Core
01:10:03.860 Existential
01:10:04.460 Issue
01:10:04.940 Gnosticism
01:10:07.000 Was
01:10:07.160 Always
01:10:07.360 Pickybacking
01:10:08.440 Off
01:10:08.620 Of
01:10:08.760 Christianity
01:10:09.280 And
01:10:09.820 So
01:10:09.900 The
01:10:10.020 Gnostics
01:10:10.380 Would
01:10:10.520 Always
01:10:10.700 Place
01:10:10.940 This
01:10:11.080 In
01:10:11.180 The
01:10:11.300 Context
01:10:11.740 Of
01:10:12.100 Christian
01:10:12.960 Soteriology
01:10:13.760 So
01:10:14.320 It
01:10:14.480 Say
01:10:14.600 Jesus
01:10:15.160 Christ
01:10:15.540 Was
01:10:15.740 Really
01:10:15.980 He
01:10:16.200 Wasn't
01:10:16.420 A
01:10:16.580 Man
01:10:17.580 Come
01:10:17.760 In
01:10:17.840 The
01:10:17.940 Flesh
01:10:18.280 But
01:10:18.800 He
01:10:18.920 Was
01:10:19.180 The
01:10:19.580 Avatar
01:10:20.080 Of
01:10:20.840 The
01:10:21.700 Supreme
01:10:22.320 Personality
01:10:22.980 Of
01:10:23.320 The
01:10:23.480 Monad
01:10:23.980 Who
01:10:24.420 Had
01:10:24.560 Come
01:10:24.840 To
01:10:25.180 Free
01:10:25.900 Souls
01:10:26.460 From
01:10:26.680 Their
01:10:26.980 Prison
01:10:28.060 Of
01:10:28.240 Flesh
01:10:28.560 Basically
01:10:29.120 And
01:10:29.900 If
01:10:30.020 You
01:10:30.120 Really
01:10:30.460 You
01:10:30.700 Really
01:10:31.080 Want
01:10:31.300 To
01:10:31.380 Get
01:10:31.500 Into
01:10:31.740 It
01:10:31.920 He
01:10:32.360 Was
01:10:32.580 They
01:10:32.780 Viewed
01:10:33.020 Him
01:10:33.140 As
01:10:33.260 The
01:10:33.380 Bodhisattva
01:10:34.060 Which
01:10:34.540 You
01:10:34.780 Can
01:10:34.920 You
01:10:35.100 Can
01:10:35.240 Trace
01:10:35.480 The
01:10:35.620 Influence
01:10:36.080 Through
01:10:37.080 Through
01:10:37.180 The
01:10:37.680 East
01:10:38.060 Onwards
01:10:38.800 But
01:10:39.040 Actually
01:10:39.680 There
01:10:39.820 Was
01:10:39.920 A
01:10:40.100 Really
01:10:40.340 Funny
01:10:40.860 Sect
01:10:41.320 Of
01:10:41.800 Gnostics
01:10:43.680 Who
01:10:43.960 Used to
01:10:44.780 Beat
01:10:45.180 Travelers
01:10:45.920 With
01:10:46.240 Sticks
01:10:46.840 In
01:10:47.060 The
01:10:47.180 Hopes
01:10:47.440 The
01:10:47.640 Travelers
01:10:48.080 Would
01:10:48.220 Martyr
01:10:48.560 Them
01:10:48.800 Because
01:10:48.980 That
01:10:49.100 Was
01:10:49.200 The
01:10:49.300 Only
01:10:49.400 Way
01:10:49.600 They
01:10:49.700 Could
01:10:49.820 Actualize
01:10:50.880 Themselves
01:10:51.220 That
01:10:51.420 They
01:10:51.640 Could
01:10:51.840 And
01:10:52.520 Yes
01:10:52.940 It
01:10:53.180 Takes
01:10:54.000 It
01:10:54.280 Takes
01:10:54.600 An
01:10:54.760 Understanding
01:10:55.160 Of
01:10:55.340 That
01:10:55.520 Level
01:10:55.780 Of
01:10:55.960 Buddhism
01:10:56.220 To
01:10:56.480 Make
01:10:56.660 Sense
01:10:56.980 Of
01:10:57.140 A lot
01:10:57.480 Of
01:10:57.760 Early
01:10:58.700 Gnostic
01:10:59.340 Heresty
01:11:00.020 It's
01:11:00.960 So
01:11:01.120 Fun
01:11:01.300 To
01:11:01.420 Hear
01:11:01.560 People
01:11:01.820 Say
01:11:02.040 Christianity
01:11:02.780 Is
01:11:03.040 So
01:11:03.260 Compatible
01:11:03.960 Buddhism
01:11:04.520 No
01:11:05.140 It
01:11:05.240 Isn't
01:11:05.420 We
01:11:05.680 We
01:11:05.900 Fought
01:11:06.420 This
01:11:06.620 Out
01:11:06.800 In
01:11:06.900 The
01:11:07.000 First
01:11:07.200 Two
01:11:07.380 Centuries
01:11:07.780 No
01:11:08.040 We're
01:11:08.160 Not
01:11:08.420 And
01:11:09.240 These
01:11:09.960 People
01:11:10.600 These
01:11:11.340 Bodhisattva
01:11:12.320 Seeking
01:11:13.400 People
01:11:13.840 They
01:11:14.480 Were
01:11:14.840 All
01:11:15.040 Killed
01:11:15.400 To
01:11:15.580 The
01:11:15.680 Man
01:11:15.960 Because
01:11:16.300 They
01:11:16.640 Sought
01:11:17.000 Honestly
01:11:18.040 Their
01:11:19.060 Gnosticism
01:11:20.000 The
01:11:20.440 Worst
01:11:20.760 Gnosticism
01:11:21.360 Is
01:11:21.500 That
01:11:21.700 Which
01:11:21.980 Is
01:11:22.120 Hypocritical
01:11:22.680 And
01:11:22.820 Pursues
01:11:23.300 Itself
01:11:23.740 And
01:11:24.020 Survives
01:11:25.040 Despite
01:11:25.560 Confrontation
01:11:26.800 With
01:11:27.060 Its
01:11:27.260 Own
01:11:27.600 Sort
01:11:27.980 Of
01:11:28.080 Ideas
01:11:28.600 And
01:11:29.080 That
01:11:29.220 Is
01:11:29.500 Arianism
01:11:30.280 And
01:11:30.540 That
01:11:31.060 That's
01:11:31.360 Another
01:11:31.580 Method
01:11:32.080 For
01:11:32.280 An
01:11:32.420 Entirely
01:11:33.060 Different
01:11:33.380 Podcast
01:11:33.820 But
01:11:34.180 Yeah
01:11:34.400 It
01:11:35.040 Is
01:11:35.920 Nonsensical
01:11:36.680 And
01:11:36.840 It
01:11:36.980 Is
01:11:37.300 Self
01:11:37.640 Eradicating
01:11:38.160 Thankfully
01:11:38.820 Self
01:11:39.200 Removing
01:11:39.680 Heresy
01:11:40.140 Is
01:11:40.360 The
01:11:40.480 Best
01:11:40.720 Heresy
01:11:41.160 Yeah
01:11:42.000 Agreed
01:11:42.480 And
01:11:43.800 That's
01:11:45.960 Is
01:11:46.180 Like
01:11:47.020 It
01:11:47.160 Was
01:11:47.560 Never
01:11:47.880 A
01:11:48.540 Like
01:11:49.160 Certain
01:11:49.340 Sort
01:11:49.540 Of
01:11:49.660 Organized
01:11:50.200 Platform
01:11:50.700 Or
01:11:50.880 Religion
01:11:51.140 It
01:11:51.380 Was
01:11:51.640 A
01:11:51.840 Whole
01:11:51.960 Bunch
01:11:52.180 Of
01:11:52.300 Teachers
01:11:52.560 Deviations
01:11:53.400 You
01:11:54.800 Know
01:11:54.920 From
01:11:55.340 Orthodox
01:11:56.160 Christianity
01:11:56.740 Yeah
01:11:57.460 In
01:11:58.500 A
01:11:58.640 Small
01:11:58.880 O
01:11:59.060 Sense
01:11:59.300 So
01:11:59.580 Yeah
01:12:00.880 This
01:12:01.360 Is
01:12:01.540 And
01:12:01.980 This
01:12:02.100 Is
01:12:02.220 What
01:12:02.300 It
01:12:02.380 Comes
01:12:02.520 Back
01:12:02.780 To
01:12:02.960 Is
01:12:03.120 That
01:12:03.260 This
01:12:03.500 Is
01:12:04.120 Exactly
01:12:04.520 The
01:12:04.740 Same
01:12:04.900 Mentality
01:12:05.380 It's
01:12:05.980 The
01:12:06.120 Application
01:12:06.720 Of
01:12:06.940 The
01:12:07.060 Secret
01:12:07.400 Techn
01:12:07.780 Technology
01:12:08.560 Comes
01:12:08.900 From
01:12:09.060 The
01:12:09.160 Greek
01:12:09.360 Which
01:12:09.840 Means
01:12:11.200 Craft
01:12:11.920 Or
01:12:12.800 Knowledge
01:12:13.260 And
01:12:13.420 As
01:12:13.580 We
01:12:13.700 Did
01:12:13.820 Explain
01:12:14.060 Before
01:12:14.400 This
01:12:14.660 Techn
01:12:14.960 Has
01:12:15.880 The
01:12:16.080 Sense
01:12:16.380 Of
01:12:16.720 Like
01:12:17.000 A
01:12:17.400 Secret
01:12:17.880 Illuminational
01:12:18.720 Power
01:12:19.320 That
01:12:20.040 You
01:12:20.280 Gain
01:12:20.680 An
01:12:20.860 Understanding
01:12:21.360 Of
01:12:21.540 The
01:12:21.640 Order
01:12:21.900 Of
01:12:22.040 The
01:12:22.140 Universe
01:12:22.480 The
01:12:22.840 Logos
01:12:23.500 Which
01:12:24.120 Allows
01:12:24.640 You
01:12:24.760 To
01:12:24.880 Manipulate
01:12:25.360 It
01:12:25.540 At
01:12:25.640 Its
01:12:25.780 Core
01:12:25.980 Level
01:12:26.340 So
01:12:27.040 Like
01:12:27.220 A
01:12:27.320 Blacksmith
01:12:27.860 His
01:12:28.260 Techn
01:12:28.760 His
01:12:29.200 Secret
01:12:29.600 Knowledge
01:12:30.160 About
01:12:30.800 Metal
01:12:31.220 And
01:12:31.420 Metallurgical
01:12:31.940 Principles
01:12:32.540 Allows
01:12:33.340 Him
01:12:33.500 To
01:12:33.660 Manipulate
01:12:34.780 These
01:12:35.120 Incredibly
01:12:36.300 Hard
01:12:36.740 Substances
01:12:37.340 To
01:12:37.840 Create
01:12:38.020 Very
01:12:38.240 Deadly
01:12:38.640 Implements
01:12:39.200 Or
01:12:41.000 Very
01:12:41.420 Useful
01:12:41.860 Implements
01:12:42.320 For
01:12:42.720 Farming
01:12:43.040 Whatever
01:12:43.340 Right
01:12:43.740 And
01:12:44.900 So
01:12:45.100 Our
01:12:45.920 Knowledge
01:12:46.540 Of
01:12:47.020 The
01:12:47.840 Material
01:12:48.180 World
01:12:48.540 Becomes
01:12:48.840 So
01:12:49.080 Advanced
01:12:49.500 And
01:12:49.720 Detailed
01:12:50.220 That
01:12:50.740 We
01:12:50.940 Can
01:12:51.240 Subvert
01:12:52.080 Its
01:12:52.500 Own
01:12:53.080 Inclinations
01:12:54.340 That
01:12:54.600 Is
01:12:54.720 To
01:12:54.820 Say
01:12:54.960 For
01:12:55.080 Human
01:12:55.260 Beings
01:12:55.520 To
01:12:55.700 Die
01:12:56.100 Which
01:12:56.440 Is
01:12:56.660 Clearly
01:12:57.060 Written
01:12:57.840 In
01:12:58.020 Natural
01:12:58.280 Law
01:12:58.660 You
01:13:00.800 Know
01:13:00.980 For
01:13:01.880 Our
01:13:02.420 Own
01:13:02.620 Transcendence
01:13:03.220 Our
01:13:03.340 Purpose
01:13:03.940 Through
01:13:04.340 The
01:13:04.500 Use
01:13:04.700 Of
01:13:04.800 Technology
01:13:05.360 Right
01:13:06.080 And
01:13:06.620 So
01:13:06.680 That's
01:13:06.880 What
01:13:06.960 It
01:13:07.080 Offers
01:13:07.480 Essentially
01:13:08.340 It's
01:13:08.820 A
01:13:08.940 Technological
01:13:09.640 Eschaton
01:13:10.280 It's
01:13:10.840 Technological
01:13:11.500 Salvation
01:13:12.120 It's
01:13:13.420 Fundamentally
01:13:14.060 Transhumanism
01:13:14.700 Is
01:13:14.840 First
01:13:15.080 And
01:13:15.260 Foremost
01:13:15.660 A
01:13:16.140 Religious
01:13:16.680 Belief
01:13:17.180 Because
01:13:18.280 It
01:13:18.460 Presents
01:13:18.860 With
01:13:19.120 It
01:13:19.320 Presents
01:13:19.660 You
01:13:19.820 A
01:13:19.980 Whole
01:13:20.180 Capital
01:13:20.600 Mythology
01:13:21.680 Parallel
01:13:24.320 To
01:13:24.520 It
01:13:24.740 Right
01:13:25.280 It
01:13:25.420 Makes
01:13:25.620 You
01:13:25.860 Usually
01:13:26.280 Transhumanism
01:13:27.300 Is
01:13:27.440 Highly
01:13:27.680 Darwinistic
01:13:28.360 Right
01:13:28.920 It
01:13:29.280 Makes
01:13:29.460 The
01:13:29.600 Assertion
01:13:30.000 That
01:13:30.160 Human
01:13:30.380 Beings
01:13:30.720 Kind
01:13:31.080 Of
01:13:31.180 Emerged
01:13:31.700 From
01:13:32.140 A
01:13:33.020 Primordial
01:13:33.740 Soup
01:13:34.200 In
01:13:35.360 The
01:13:36.520 Far
01:13:36.840 Beyond
01:13:37.360 Years
01:13:38.660 The
01:13:40.120 Dim
01:13:40.500 Primeval
01:13:41.660 Materialistic
01:13:42.680 Flux
01:13:43.140 That
01:13:43.700 The
01:13:43.820 Universe
01:13:44.200 Is
01:13:44.480 Just
01:13:44.720 Sound
01:13:46.320 And
01:13:47.020 Lightning
01:13:47.760 Rumbling
01:13:49.320 Forever
01:13:49.780 And
01:13:49.960 Ever
01:13:50.120 And
01:13:50.360 We
01:13:50.520 Randomly
01:13:51.320 Ended
01:13:51.760 Up
01:13:51.900 At
01:13:52.000 This
01:13:52.180 Point
01:13:52.560 Where
01:13:53.300 We
01:13:53.500 Have
01:13:53.740 Consciousness
01:13:54.500 And
01:13:54.880 Through
01:13:55.120 A
01:13:55.320 Fluke
01:13:55.700 Of
01:13:55.840 Luck
01:13:56.120 We
01:13:56.880 Evolved
01:13:57.240 In
01:13:57.380 Such
01:13:57.560 A
01:13:57.700 Way
01:13:57.880 As
01:13:58.160 To
01:13:58.480 Be
01:13:59.000 A
01:13:59.100 To
01:13:59.380 Manipulate
01:13:59.760 The
01:13:59.920 World
01:14:00.080 With
01:14:00.240 This
01:14:00.380 Level
01:14:00.600 Of
01:14:00.720 Precision
01:14:01.100 And
01:14:01.700 So
01:14:01.780 The
01:14:01.920 Logical
01:14:02.300 Thing
01:14:02.580 To
01:14:02.780 Guarantee
01:14:03.140 The
01:14:03.540 Individual
01:14:04.100 Maximum
01:14:05.100 Happiness
01:14:05.500 Is
01:14:06.180 To
01:14:06.320 Transcend
01:14:07.060 Material
01:14:07.820 Death
01:14:08.220 Because
01:14:08.780 Most
01:14:08.980 Of
01:14:09.060 These
01:14:09.160 People
01:14:09.440 Right
01:14:09.640 They
01:14:09.760 Don't
01:14:09.960 They
01:14:10.160 Don't
01:14:10.380 Believe
01:14:10.780 In
01:14:11.100 The
01:14:11.260 Soul
01:14:11.560 Really
01:14:13.000 Yeah
01:14:15.360 Of
01:14:15.600 Course
01:14:15.780 Anyway
01:14:17.680 Sorry
01:14:18.000 Yeah
01:14:18.240 I mean
01:14:18.480 You
01:14:18.580 Just
01:14:18.780 Take
01:14:19.180 It
01:14:19.260 Off
01:14:19.360 In
01:14:19.460 Many
01:14:19.600 Different
01:14:19.780 Directions
01:14:20.200 So
01:14:21.800 The
01:14:22.200 The
01:14:22.760 Unless
01:14:24.020 Anybody
01:14:24.300 Else
01:14:24.520 Had
01:14:24.640 Any
01:14:24.820 Points
01:14:25.200 They
01:14:25.300 Wanted
01:14:25.480 To
01:14:25.560 Make
01:14:25.760 Not
01:14:28.680 Really
01:14:28.940 No
01:14:29.180 Okay
01:14:29.440 So
01:14:29.680 What
01:14:29.800 I
01:14:29.900 Wanted
01:14:30.080 To
01:14:30.200 Talk
01:14:30.400 About
01:14:30.560 Was
01:14:30.720 The
01:14:30.940 The
01:14:31.940 Critical
01:14:33.020 Thing
01:14:33.300 Here
01:14:33.500 Is
01:14:33.680 The
01:14:33.800 Assault
01:14:34.160 On
01:14:34.300 The
01:14:34.420 Body
01:14:34.720 Right
01:14:35.200 We
01:14:35.360 I
01:14:35.820 Think
01:14:35.940 We
01:14:36.060 Isolated
01:14:36.460 This
01:14:36.640 Several
01:14:36.860 Times
01:14:37.220 So
01:14:37.340 What
01:14:37.500 I
01:14:37.580 Think
01:14:37.720 It
01:14:37.860 Bears
01:14:38.160 Your
01:14:38.280 Pean
01:14:38.420 Again
01:14:38.580 Coming
01:14:38.760 Back
01:14:39.000 To
01:14:39.140 It
01:14:39.300 Is
01:14:40.320 That
01:14:40.600 For
01:14:41.460 Christians
01:14:42.040 And
01:14:42.240 Not
01:14:42.420 Just
01:14:42.600 For
01:14:42.720 Christians
01:14:43.000 But
01:14:43.200 For
01:14:43.360 Ancient
01:14:44.980 Pagan
01:14:45.440 Philosophers
01:14:46.040 As
01:14:46.200 Well
01:14:46.500 And
01:14:46.680 Other
01:14:46.860 Religions
01:14:47.320 Hinduism
01:14:48.880 All
01:14:49.060 This
01:14:49.200 The
01:14:49.480 Body
01:14:49.820 Is
01:14:50.100 Seen
01:14:50.440 As
01:14:50.940 The
01:14:51.840 Microcosm
01:14:52.660 Of
01:14:52.800 The
01:14:52.920 Universe
01:14:53.380 That
01:14:54.680 The
01:14:54.820 Human
01:14:55.120 Form
01:14:55.660 In
01:14:56.000 Itself
01:14:56.540 Is
01:14:57.380 The
01:14:57.600 Apex
01:14:58.060 Of
01:14:58.300 Creation
01:14:58.800 And
01:14:59.380 Embodies
01:14:59.860 The
01:15:00.140 Cosmic
01:15:00.540 Order
01:15:00.800 The
01:15:01.880 Most
01:15:02.300 Poignant
01:15:03.500 Example
01:15:04.200 Of
01:15:04.360 This
01:15:04.620 Is
01:15:05.260 That
01:15:05.380 We
01:15:05.560 Have
01:15:06.000 Rationality
01:15:07.160 Is
01:15:08.440 That
01:15:08.580 Human
01:15:08.920 Beings
01:15:09.320 Alone
01:15:09.760 Of
01:15:10.000 All
01:15:10.220 Of
01:15:10.340 The
01:15:10.460 Other
01:15:10.680 Species
01:15:11.320 Of
01:15:11.460 The
01:15:11.560 Planet
01:15:11.860 Through
01:15:13.600 The
01:15:13.820 Combination
01:15:14.280 Of
01:15:14.480 Our
01:15:14.620 Spirit
01:15:14.960 And
01:15:15.160 Our
01:15:15.280 Advanced
01:15:15.660 Biology
01:15:16.840 Including
01:15:17.560 Our
01:15:17.760 Brain
01:15:18.000 We're
01:15:19.480 Able
01:15:19.720 To
01:15:19.900 Understand
01:15:20.300 The
01:15:20.480 Patterns
01:15:20.820 Of
01:15:20.920 The
01:15:21.020 Universe
01:15:21.460 Build
01:15:24.180 Up
01:15:24.460 A
01:15:25.580 Volume
01:15:26.920 Of
01:15:27.100 Knowledge
01:15:27.420 Necessary
01:15:28.060 To
01:15:28.540 Manipulate
01:15:29.860 And
01:15:30.040 Act
01:15:30.320 Upon
01:15:30.540 These
01:15:30.720 Patterns
01:15:31.200 For
01:15:31.820 Our
01:15:32.000 Long
01:15:32.280 Term
01:15:32.480 Gain
01:15:32.720 And
01:15:32.860 Benefit
01:15:33.300 And
01:15:33.840 So
01:15:33.980 We
01:15:34.140 Can
01:15:34.320 Create
01:15:34.740 Things
01:15:35.100 In
01:15:35.260 Our
01:15:35.380 Own
01:15:35.600 Image
01:15:35.920 We
01:15:36.120 Can
01:15:36.300 Create
01:15:36.620 Tools
01:15:37.040 And
01:15:37.260 Technology
01:15:37.860 Build
01:15:38.520 Civilizations
01:15:39.340 Social
01:15:39.820 Systems
01:15:40.420 Right
01:15:41.520 Organize
01:15:42.280 Ourselves
01:15:42.780 We
01:15:43.300 Can
01:15:43.440 Have
01:15:43.680 A
01:15:43.820 Complexity
01:15:44.420 Of
01:15:44.580 Life
01:15:44.880 Which
01:15:45.280 Is
01:15:45.380 Not
01:15:45.540 Available
01:15:45.940 To
01:15:46.240 The
01:15:46.400 Other
01:15:46.740 Other
01:15:48.280 Species
01:15:48.660 On
01:15:48.800 This
01:15:48.920 Planet
01:15:49.200 Because
01:15:49.880 They
01:15:50.060 Don't
01:15:50.280 Have
01:15:50.540 Access
01:15:51.040 To
01:15:51.500 This
01:15:51.680 Rational
01:15:52.100 Faculty
01:15:52.660 Which
01:15:53.320 Allows
01:15:53.680 Us
01:15:53.840 To
01:15:54.020 Peer
01:15:58.540 Teoteric
01:15:58.920 The
01:15:59.100 Deep
01:15:59.780 Nature
01:16:00.660 Of
01:16:00.980 Our
01:16:01.140 Reality
01:16:01.520 We
01:16:02.060 Can
01:16:02.220 Pull
01:16:02.560 Back
01:16:02.920 The
01:16:03.120 Veil
01:16:03.440 Of
01:16:03.680 Accident
01:16:04.120 Of
01:16:04.260 appearance and investigate the real and the numinous that exists underneath the visual presentation that is given to us.
01:16:12.040 And so what happens is, when... when we... Transhumanism basically doesn't have any conception of this. The human form is not viewed as an icon of divinity, as the manifestation of the cosmic order, as, you know, this kind of kingly creation.
01:16:30.840 It's viewed as just this article of convenience. You know, you're human because you're born as one, and all that really means is that you have this sort of fleshly existence, you know, at some really non-important place in time.
01:16:48.920 There's not a... there's no undergirding value to biological life or to the human structure.
01:16:57.780 You say article of convenience, but it's even further. It's like an article of contempt.
01:17:02.860 Yes.
01:17:03.560 These people call the body meat puppets, or talk about meatspace and things like that, and they clearly hate it.
01:17:09.460 Yes, exactly. Exactly. That's what it is. It's beyond this... passion. It's contempt and sometimes even animosity.
01:17:20.040 And this is what we would see among the ancient...
01:17:24.940 That the body doesn't matter, that the body is not an icon of divinity. Then you get two attitudes. One, you get people who hate the body. They want to do everything in their power to destroy it.
01:17:37.400 Or you get the attitude that what you do with the body doesn't matter at all. So you can totally modify it, you can change it to whatever you think, whatever satisfies you the best. And you can do whatever you want with it, as long as you're satisfied with the actions of your body, right? You're getting pleasure, you're ascending spiritually, right?
01:17:58.380 Then whatever you do... doesn't matter how much, you know, gay ritual sex magic you do, doesn't matter how many children you enslaved, it doesn't matter, you know, how many nations you ruin for your own personal gain with your body, right? Everything goes. No... utility.
01:18:15.580 And I mean, this is basically what Hindu... what Hinduism, what yoga was all about as well. I mean, they would shape their body, which is, you know, this... the symbol of the cosmic order, right? And they would shape it in different ways as to try to make a sort of chemical transmutation of it to fix their own...
01:18:38.800 That's specifically Hatha yoga.
01:18:41.480 Yeah, exactly.
01:18:42.960 The Raja yogis criticized that as being a limited art, and criticized it as being like, well, even if you attain perfection in that domain, you're just a particularly healthy animal and nothing else.
01:18:58.280 I don't know that it's a valid critique.
01:19:01.960 Yeah. Yeah, I know, right?
01:19:06.760 Well, that's the thing, and this is what it always comes back to, is it... Sorry, one second here.
01:19:13.080 Yeah.
01:19:13.640 This is what it always comes back to is that the underlying motivations here are just fundamentally religious, right? What do people who actually subscribe to transhumanism want? Eternal life, freedom from corruption, happiness.
01:19:28.140 It's not very complicated. These are core human existential needs, right?
01:19:36.860 We don't like death. We don't like the fact that we get ill and we suffer, death of... and all that, right?
01:19:48.660 And so that's the thing, is these are the very appetites, the very natural human inclinations
01:19:54.640 That this sort of philosophy appeals to. But rather than directing them towards the cultivation of personal virtue, the striving after service and transcendence through action in the world, the fulfillment of the divine natural archetypes, you know, by exercising the will in service of the cosmic order in the building up of creation.
01:20:18.260 Rather, it inverts it and says that all of the high technology of the universe exists for our personal pleasure, so that we can extend and enhance the inflation of the ego. Right? Because there's no presumption of natural or divine law here. All can be subverted in the name of utility.
01:20:40.880 And this is what we get back to again and again and again. And this is why I bring it up: because this is what our enemies are trying to do to us as a species. They're trying to totally subvert any conception of natural law, of truth, beauty, and goodness, so that they can overthrow the cosmic order that we strive to uphold.
01:21:01.940 That's what we're all about here on this podcast. That's what I hope everybody listening here is about. And so that's what these ideologies are about. That's why they're dangerous. And this is why I think transhumanism is so existentially threatening and important to address, because it's very, very close.
01:21:18.100 There is one part that I read recently as well that's pretty interesting. And I think it was Socrates who argued that the philosopher wouldn't really have a fear of death, right? Because, as we established, death is a separation of the body and the soul, right?
01:21:33.780 But what happens if you are practicing a life of philosophy, that is, you're trying to gaze at God, you could say. Well, you could see the good in its essence, right? You could see beauty in its very essence, and not just the objects which beauty fills, right? That's the idea at least.
01:21:54.040 But let's say you live a super, super, how should I say, super carnal life, very sinful. You only value that which is worldly. You value life for life's sake. You worship power for power's sake. You want money just to be rich, right?
01:22:10.880 Then you wouldn't really find any joy in that. You'll be blinded. You can't see it. You can't handle it. You thirst for the material you want so desperately, but you can't really find it anymore, because you don't have the body, right?
01:22:24.060 So you just wander around thirsty, begging for some water, so to speak. And let's take it like this, right? This person has willingly abandoned his goal in life.
01:22:37.840 So let's say it's kind of like a puzzle piece, right? He has kind of worn out the part that would connect him to another piece of the puzzle, right?
01:22:48.900 So he can't literally be connected to this other puzzle piece, right? So he's just left alone. He's drifting away. He can't really love beauty. He can't love goodness. He can't love any of this, because he can't really connect to it. He doesn't value it. He doesn't even want to value it.
01:23:05.800 And I was going to go to a point with this, but it's very late here, and I can barely think.
01:23:16.160 I understand. Please.
01:23:20.100 Yeah. So, Ruud and Zeiger, please step in with your comments if you have any.
01:23:25.800 Right. And so, yeah, what I think that this is what they're going for, right, is the introduction of the gender ideology, the transracialism, all of this kind of stuff, right?
01:23:41.840 The background of Darwinian evolutionary flux, the idea that there is no set human nature is – I think there's actually like a cabalistic esoteric agenda behind it.
01:23:53.740 I don't think that it's just kind of an accident that we see these kind of philosophies because, as we pointed out, they're nothing new.
01:24:01.780 They, you know, occur again and again and again throughout history.
01:24:07.040 What's going on here is, I think the technology is going to be used in an attempt to reduce human beings in their masses to this kind of a blank state, this tabula rasa, this Ein Sof.
01:24:19.400 Through environmental conditioning, aided by technology, human nature can be devalued to the point where it can be reconstructed in man's own image.
01:24:29.820 The human body can function in a way which is more suitable to man's own ego and the ends that his own ego desires, which is ultimately passion.
01:24:37.780 So after the universe is reduced to these building blocks, from their preordained nature, their preordained stance, after it's reduced to this kind of gray goo, it's reconstituted, given artificial life and order, in service of a mind inferior to the cosmic mind which brought all of it into existence.
01:25:02.840 Here's a question that we've been thinking about for a while, actually.
01:25:06.280 Okay. Couldn't you say that this same mentality could kind of be seen in political messianism, where you just place the ideology as the important thing rather than, how should you put it, God, the church, that which gives existential meaning?
01:25:26.920 Because if you just view the ideology in and of itself, even if, let's say, it's full 1488 National Socialism, then how would you really be any different?
01:25:37.980 Operating systems and, I guess, not software, but hardware?
01:25:43.180 Yeah, I don't really have a good answer for that.
01:25:46.040 You're probably on to something, and yes, this is precisely difficult for us, because in our modern culture, yeah, guess what?
01:25:56.220 Guess what?
01:25:57.160 Death is scary.
01:25:58.260 Death is scary to people because we live in a comfortable reality, which is built on these basic ideas that it's sad when anything that makes you hurt happens.
01:26:08.380 And, oh, guess what?
01:26:09.880 These things tend to happen.
01:26:13.060 Children die.
01:26:14.400 And it's tragic.
01:26:15.800 It is supposed to be tragic.
01:26:17.240 And it's supposed to wreck your life and wreck your world for a certain amount of time.
01:26:21.320 We live in a protected bubble.
01:26:24.860 And you know what the provisions for these things are?
01:26:27.860 They are the sacraments.
01:26:29.380 They are your liturgy.
01:26:31.360 The protection against these things is we've always dealt with this.
01:26:34.380 There's never been a human alive who hasn't felt these things before, and they have not killed other people.
01:26:41.140 The difference is it's been the liturgy, the sacraments, the community.
01:26:44.940 You're dying because you're alone.
01:26:47.560 Yeah, right.
01:26:49.240 And the existence without pain is not worth living.
01:26:55.860 Yeah, of course.
01:26:56.780 Exactly, right.
01:26:57.580 And, I mean, as we kind of said briefly before, right, the philosopher wouldn't really fear death.
01:27:04.920 Because with death, he can finally, so to speak, see the essence of things, right?
01:27:08.680 So, right, why should we be super afraid of that stuff if it will just, you know, bring us to a sort of better place?
01:27:19.980 Maybe I'm babbling.
01:27:21.420 I don't know if I've been babbling for a while.
01:27:22.220 Well, I mean, I understand that.
01:27:23.240 I mean, yeah, ultimately, you know.
01:27:28.180 No, that's good to know.
01:27:29.980 Yeah, well, it's not something that we should be afraid of.
01:27:35.080 We've talked about this many times before.
01:27:37.400 But we can't really address that in long form.
01:27:39.080 Let me just interrupt you a bit.
01:27:41.500 Would that include the death of the ethnicity?
01:27:44.500 Well, okay, but we can't, like, okay, if we want to discuss theologically, like, the meaning of death and our confrontation of this, that's a really, really big topic.
01:27:58.240 Death is, really, could be its own two-and-a-half-hour podcast.
01:28:02.020 So, I mean, that's too large to get into here.
01:28:04.900 There's a lot of issues.
01:28:06.600 What I wanted to just emphasize is that these ideas, you know, the philosophical underpinnings here, are not new, right?
01:28:12.960 And we've seen these come up again and again and again over – when we examine, you know, these different occult and enlightenment philosophies which blossom out of these ways of thinking, we find these patterns of thought reoccurring.
01:28:28.120 The idea of the secret knowledge or the secret technique, the quantification of all of the universe in order to serve man's needs.
01:28:35.980 That man's own titanic will and his manipulation of the secret knowledge, which he has acquired through, you know, the will to power, will allow him to transcend his own ultimately meaningless human limitations.
01:28:50.240 This is the narrative over and over and over again that we always see.
01:28:54.340 And this is the thing: when we oppose our enemies ideologically, politically, when we oppose these impositions upon our society, we're opposing the whole worldview that stands behind them.
01:29:06.740 And as we've been saying at Mysterium Fashis over and over, it's like you can't just have politics.
01:29:14.060 You can't just have your ideology.
01:29:15.820 You need a worldview.
01:29:17.280 You need to be grounded in the transcendent, in ethics, in religion, all of these things in order to be able to counter – to effectively counter our enemies.
01:29:27.760 Because they understand this, and their ideologies are totally wrapped up in all of these things, even if it's implicit.
01:29:35.180 And if we don't explicitly examine it, ultimately it's just going to be filled in for us.
01:29:41.080 And maybe this kind of addresses what your point is, is that, you know, ideology is not our god.
01:29:51.740 We engage in ideological action.
01:29:53.620 We engage in politics because there are higher values than the political that demand that we do so, that motivate us towards these ends.
01:30:02.060 Now, we have a duty to ourselves, to our kinsmen, to our ancestors, to our civilization, our posterity.
01:30:09.220 Right?
01:30:09.740 We must act politically to secure an existence for our people and a future for white children.
01:30:13.960 That's not an option.
01:30:15.980 We have no choice but to engage in politics.
01:30:18.500 But politics is not the end of our lives.
01:30:21.280 Politics is part of how we live, a necessary and important part.
01:30:25.840 And that's the thing.
01:30:26.420 I think I was getting at your point, Riva, is that when we begin to skew from this worldview, this, you know, healthy, integrated, multifaceted way of being, you know, you get into what Hans was talking about.
01:30:41.080 Where, you know, you start to get into this political messianism.
01:30:44.980 Well, here's the thing, right?
01:30:47.220 I understand exactly what you're saying, but here's the little problem.
01:30:51.220 I kind of have it in my mind right now.
01:30:52.940 Maybe it's just silly, but maybe it's not.
01:30:54.660 Maybe I'm onto something.
01:30:55.680 Maybe I'm just babbling.
01:30:56.520 And that is human free choice, right?
01:30:59.960 If you look at Sweden, for example, we're all very atomized.
01:31:03.640 We choose to do retarded stuff and we're too proud to admit that we're wrong.
01:31:09.300 Right?
01:31:09.580 Now, how would you save people like this?
01:31:12.640 Will that be through political action or through essentially trying to make them, I don't know, repent or something?
01:31:19.520 That's the problem.
01:31:20.860 Exactly.
01:31:21.660 Well, yeah, ideally.
01:31:22.480 So, the way you would combat this would not primarily be political.
01:31:29.400 It would be, how should I put it?
01:31:31.460 Spiritual.
01:31:32.080 Legally.
01:31:32.740 Spiritually.
01:31:33.400 Yeah, exactly.
01:31:33.980 Yeah, we've talked about this before.
01:31:35.040 So, this is kind of my point, right?
01:31:39.540 You can't really just have political action and think that the political action in and of itself will really, you know, save them.
01:31:46.480 Yes.
01:31:47.080 Yes.
01:31:48.120 Well, it depends on how you define politics.
01:31:50.960 If you're talking about elections and things like that, then, yeah, that's not going to do anything.
01:31:55.400 Let's say even you have a super great revolution right now and everything will be as you want your political utopia to be, right?
01:32:07.100 Well, politics will simply be violence.
01:32:10.240 You can make people...
01:32:13.400 Politics is the exercise of power.
01:32:15.360 And I had disagreements and I had discussions with powerful, powerful theological people whom I respect greatly and they kept on saying everything I talk about is politics, this and that.
01:32:26.500 But it's the exercise of force.
01:32:28.940 Politics is always the exercise of force and it is imperative that we remember that.
01:32:34.800 Everything involved in politics is what are you willing to see happen and how much of your blood or sweat are you willing to see that done?
01:32:43.700 And it all boils down to that.
01:32:46.820 And the problem with leftists is their blood and their sweat are, they're not willing to give it for much.
01:32:52.280 Right.
01:32:55.360 I mean, in the case of Sweden, if you had an occupying army that came in and then that forced people to behave differently and to, you know, learn different values, I mean, they would do it.
01:33:07.900 Force works.
01:33:08.720 Force, even in religion and spirituality, I mean, you can use force to convert a population to a different spiritual outlook and it's been done many times in the past.
01:33:19.260 Even in the past, ultimately, it was a question of who has the will to do it.
01:33:26.700 And the people who have the will...
01:33:28.340 Because they're even more afraid of other stuff.
01:33:29.920 Well, Greva, I mean, okay, yeah, I mean, here's the thing, Greva, it's like this, okay?
01:33:34.020 I mean, when you, like, if you kill somebody, that's the result of a spiritual action.
01:33:41.800 Right.
01:33:42.200 That's not, that is the physical manifestation of spiritual actions that you've taken.
01:33:47.260 Okay.
01:33:47.780 And so it's not the other way around.
01:33:49.980 And so any spiritual warfare that we do has physical results.
01:33:53.720 Even if you're just praying, well, you know, you have to do things physically when you pray.
01:33:58.540 Right.
01:33:58.960 This is what, you know, when we go to church, we engage in the liturgy.
01:34:01.920 We move around, we do, we speak, right?
01:34:04.460 We ought to give order to our thoughts when we pray.
01:34:07.320 Right.
01:34:07.700 And so this is the thing.
01:34:08.460 There's no, this is getting back to it.
01:34:10.420 Is there's no bifurcation of the soul from the body.
01:34:13.980 They exist together.
01:34:14.980 And as long as we're on this earth, everything that we do spiritually has a physical component, and likewise vice versa.
01:34:21.540 And so, you know, this is what we're just trying to entreat our listeners is to understand that, you know, our enemies are doomed to failure because they don't understand this.
01:34:32.660 Or rather, they do understand this and they subvert it on purpose.
01:34:35.500 And because they subvert it, they're going to lose because they're going against the rules of the game, so to speak.
01:34:40.420 They're playing against nature.
01:34:42.500 But they can still be enormously destructive while they do so.
01:34:45.820 You know, obviously, the Soviet Union was bound to collapse at some point.
01:34:48.980 You know, it would have collapsed much sooner if there hadn't been Western aid.
01:34:51.800 But that doesn't mean it should have been allowed to exist to begin with.
01:34:54.820 Obviously, it did, you know, enormous damage.
01:34:56.920 And so, this is what we're getting at is that we need to have ourselves this integrated perspective.
01:35:03.860 And we need to understand that our struggle is first spiritual, which then manifests in the physical realm.
01:35:09.400 And we need to attend to both of these areas of our existence and fight on both levels symbiotically.
01:35:14.960 And that's what's important.
01:35:16.560 That's how we win.
01:35:17.360 Yeah, I mean, if you have a spiritual worldview and you want to spread it to other people, it's not going to spread by osmosis.
01:35:24.980 It's going to spread through your actions.
01:35:27.240 So, it's always going to end up being action.
01:35:29.940 And pretty much all action ends up being force, ultimately.
01:35:33.740 Even just speaking.
01:35:35.360 I mean, you're influencing people and it's changing them and it's changing the world.
01:35:39.640 And any change is a form of violence in a certain sense.
01:35:44.980 So, ultimately, spiritual warfare and physical warfare are one and the same.
01:35:55.340 Right, exactly.
01:35:56.420 Yeah, that's the thing.
01:35:57.980 It's the application of force to resistance to achieve an end.
01:36:01.400 Right?
01:36:01.600 This is what it's about.
01:36:02.640 So, the question is the technique, right, of application of force and then the end.
01:36:10.520 Right?
01:36:10.660 These are the things that we have to decide.
01:36:13.000 How do we apply ourselves physically and spiritually?
01:36:15.220 In what fashion?
01:36:16.260 And for what purpose?
01:36:18.700 Right?
01:36:19.380 And so, you know, obviously, we try to answer both of these questions on our show.
01:36:22.540 We try to tell you, well, this is why we should be fighting.
01:36:25.140 This is what we're fighting for.
01:36:26.820 And then how do we do this?
01:36:28.000 What are the principles by which we can gain victory?
01:36:30.540 And that's what we need to focus on.
01:36:31.740 And this is the thing, brothers and sisters, is why you can't, you know, don't be dejected.
01:36:37.760 You know, understand, this is not a new struggle.
01:36:39.360 Things are bad now.
01:36:40.400 Yeah.
01:36:41.000 But we're fighting.
01:36:41.840 This is just a particularly pugnacious and refined manifestation of a very, very, very old battle.
01:36:51.560 The fact that 2,000 years ago, we had the same template for the subject we're discussing today, widespread and fought against by the church fathers.
01:37:00.740 You know, Saint Irenaeus, pray for us. It attests to the fact that we are dealing with these same issues.
01:37:06.680 It's just that in the modern time, technology allows them to be exponentially refined.
01:37:13.740 And so that they can become much more destructive, and ideas can be spread virally and so on.
01:37:18.260 Indeed, you know, this is how I'm speaking to you right now.
01:37:20.820 Now, you're going to listen to this recording over the Internet, right?
01:37:25.120 But really, just stay focused on what is in your power to change.
01:37:32.020 Don't, you know, don't let yourself be blackpilled by all this news.
01:37:34.680 You know, whether Donald Trump pulls out of NAFTA or not, it's not going to affect, you know, your own struggle against your passions, right?
01:37:42.760 And your own duty to try and, you know, establish your family and raise them well and contribute to your community and your society and, right, try to break away from the state apparatus and so on.
01:37:52.340 And so it's just, please, don't get wrapped up in vanity and foolishness.
01:38:01.100 You know, it's a temptation, right?
01:38:03.320 It's a temptation, especially when, I know many of our listeners here, you know, we, you know, politics in the news becomes a, you know, a serious hobby.
01:38:12.240 More than just a hobby, you know, it's a way of life.
01:38:14.420 We devote ourselves to this because it's meaningful existentially.
01:38:17.360 Because the results of our actions collectively will determine the success or failure of our race on this continent and in Europe.
01:38:25.980 And so, you know, it's very, very important what we're doing.
01:38:28.380 It's not frivolous.
01:38:29.880 But we cannot get caught up in the temptation, you know, just to invest our whole ego and our whole emotional state in the news cycle.
01:38:44.140 Or in the comings and goings of parliamentary politics, which kind of brings it back to the beginning of the show.
01:38:52.380 It's just, it's like, oof, just take heed, take heed.
01:38:57.680 Right. So, this is the thing.
01:39:03.480 So, I think we kind of come full circle.
01:39:04.920 I want to talk a little bit about the religious elements right before we finish on the subject.
01:39:14.340 And we'll talk about, you know, potential positives of transhumanism.
01:39:17.100 Because obviously today we've spent most of our time talking about how shitty it is.
01:39:21.200 Now, one of the things that I wanted to emphasize was that transhumanism is fundamentally a philosophical proposal, or rather, not philosophical, a soteriological proposal.
01:39:35.060 It's basically about how humans will save themselves from corruption and death using technology.
01:39:40.080 Right. And so, it promises, it makes this, this apocalyptic commitment.
01:39:47.720 It says that human beings will, we will get to the point where we have, you know, this technological singularity.
01:39:53.080 Where AI supercomputers are able to increase the rate of our sophistication and resource extraction exponentially.
01:40:02.580 We'll get to, we'll go to, you know, zero point energy and unlimited resources and, you know, infinite technology.
01:40:08.620 And we'll basically be gods.
01:40:10.740 This is the promise.
01:40:12.600 Given that our physical forms can be modified to whatever we want.
01:40:15.760 We'll be able to do whatever we want.
01:40:17.300 There won't be any physical limitations.
01:40:18.760 We can transcend space and time through the careful application of technology and achieve divinization.
01:40:28.140 Theosis.
01:40:30.420 Right.
01:40:31.140 And so, this is what it's about.
01:40:33.100 It's what it's about at the end of the day.
01:40:35.060 I mean, this is what Christianity promises as well.
01:40:36.700 You shall be as gods.
01:40:38.520 But, you know, it promises this process, you know, through suffering, basically, through ascetic warfare, through the emulation of the life of Christ and the power of the Holy Spirit and all that.
01:40:47.860 The resurrection of the last day.
01:40:49.060 We don't have time to get into that.
01:40:53.100 But that's what's going on here.
01:40:55.300 Right.
01:40:55.580 And so, it's just kind of laughable.
01:41:03.680 It's kind of laughable.
01:41:04.640 Because that's how shallow, that's how shallow our opponents are.
01:41:08.680 That's what we're dealing with.
01:41:09.640 So, Zyger, Hans, anything to say on this before we get into maybe a few of the positive potentials for such technology?
01:41:21.940 Yeah.
01:41:22.540 Well, yeah, I mean, it really depends on your view of, you know, what's the meaning of life and stuff like that.
01:41:31.560 So, of course, from a Christian perspective, it's obviously not positive in any way.
01:41:39.980 And from my perspective, it's not positive either, for the most part.
01:41:44.840 Like I've said earlier, I do think that there is a potential.
01:41:49.840 It's like, it's a question of equilibrium points.
01:41:52.020 I think that the human experience can have kind of equilibrium points where the technology is kind of in harmony with both our environments and our natures.
01:42:08.080 And we can live meaningful, sustainable existences.
01:42:12.060 But they're like plateaus.
01:42:13.020 Like, for example, probably living simple countryside agricultural existences is probably one of those plateaus where, you know, it's sustainable and it's possible to be healthy, both spiritually and physically at that level.
01:42:30.520 I think, obviously, at the current level, it's not sustainable.
01:42:34.380 It's very unhealthy and going in the direction of this transhumanist thing is only amplifying this imbalance and this, you know, headlong frenzy towards self-destruction.
01:42:51.640 But I do think it's possible to have, you know, a higher plateau where we could live balanced existences.
01:43:02.120 But it's not really possible to imagine at this point.
01:43:07.460 Right.
01:43:07.820 Right.
01:43:08.240 But here's the thing, though.
01:43:10.660 Here's the thing, though.
01:43:11.540 I think that this insane technology progress that we, I don't know, install CPUs in our brain.
01:43:17.860 I think this technological progress, the technological power in and of itself is by its very nature so powerful as to completely overpower human weaknesses and to enslave these human weaknesses for this desire of power.
01:43:33.380 Therefore, you can't really escape this desire for power.
01:43:36.920 And therefore, it is in itself a bad thing.
01:43:41.860 It's a sort of satanic temptation, you could say.
01:43:44.040 Yeah.
01:43:44.860 Well, like, think of science fiction works like Dune, for example.
01:43:49.900 And there are interesting parts of that book and that story that are very different from other science fiction works.
01:43:57.720 In the sense that in Dune, they have some high technologies, but there's also technologies that are completely absent and banned.
01:44:05.400 For example, computers are banned completely.
01:44:08.060 Hold on, you can't interrupt people.
01:44:12.280 Please.
01:44:13.000 Please.
01:44:14.040 Hold on, Ziger.
01:44:16.040 No.
01:44:16.860 So, it's a kind of, the social structure is kind of medieval and balanced over thousands of years.
01:44:27.420 And they've done that by banning computers and only having manual tools, basically.
01:44:32.880 And, you know, I'm not saying that that's what we need to do.
01:44:34.960 I'm just saying that there are potentially ways to use high technologies without destroying our society and sabotaging the spiritual value of our existence.
01:44:47.820 But that requires wisdom that obviously we don't have at this point.
01:44:52.480 Yes, that's the critical, yes, that's the critical, the criticalizing factor.
01:44:55.100 And the reason we don't have that wisdom is because the worst elements of society are at the top, while the best elements of society, the people who could potentially have wisdom, are at the bottom and are completely disempowered.
01:45:07.140 Right.
01:45:08.140 So, obviously, it's not going to happen right now.
01:45:10.520 Any innovation that exists or that happens now will be put in the service of evil.
01:45:17.980 So, it doesn't have to be this way, but it's going to be this way for the foreseeable future.
01:45:25.220 Okay, so, okay, let me see if I understand you correctly then.
01:45:28.520 You propose a sort of technological aristocracy.
01:45:31.920 No, I think that what Ziger is just trying to say is that, you know, technology is a neutral thing.
01:45:39.620 It would, if we had a rational society, like, let's, you know, space Nazis or whatever, we had, like, a serious civilization that was run by people with, you know, a sense of, with wisdom and sanity and order in mind, the common good, that we could reach a healthy equilibrium that, you know, with high technology that's conducive to the values that we want to see, right?
01:46:06.800 Yes, but that's the thing, you could say that about everything, but it doesn't work anyway.
01:46:11.800 No, well, if you take that attitude, then you would be in favor of banning hammers, because hammers are also a tool, and they can also lead to, you know, a damaging of the social fabric, depending on how you use it.
01:46:26.320 Yeah, but the difference is that the hammer, by its own nature, doesn't have the same power as transhumanists, I don't know, cyborgs.
01:46:35.560 Yeah, exactly. So, therefore, the hammer gives a power, which, in its very nature, is adapted to the human.
01:46:42.680 Right, well, but this is, yeah, I understand what you're saying, but it doesn't matter.
01:46:46.380 It doesn't matter. We're in the titanic age, so we can't, we can't ignore these problems, right?
01:46:52.180 And, you know, what I'm saying is, if we use the example of nuclear weapons, right?
01:46:55.900 You know, so, if we, you know, nuclear weapons, we cannot avoid the reality of them.
01:47:04.580 I know Zyger has some interesting ideas on this, but if we just assume that they exist, it's not a big psyop, you know, we can't just, you know, wish that we didn't, these weapons didn't exist because of their terrible power and potential for the destruction of the Earth's surface.
01:47:18.740 We have to just grapple with this existential threat and move on, and it might even mean that we have to have, you know, our own nuclear weapons.
01:47:27.520 Now, I'm not saying that we should, you know, adopt transhumanism so we can beat up leftists or whatever, right?
01:47:34.120 But I'm merely pointing out that there can be rational application of high technology, which is what I think Zyger is saying as well.
01:47:41.520 Yeah, I mean, we can go back in the past, I think it was about 500 years ago in France, some inventor came up with a way of having a fully automatic crossbow, which basically worked like a machine gun.
01:47:56.520 And he brought that to the king of France, I don't remember which one it is, and he had the guy executed and the machine destroyed because he thought, you know, that would have a very negative effect on the social, you know, the nature of warfare and the social equilibrium and things like that.
01:48:18.520 So it's just an example of even back then with primitive technology compared with anything we have today, you can still make devices and techniques that would be extremely destructive to the way that society works.
01:48:33.620 And if you don't have wisdom, you're lost.
01:48:38.180 I mean, you could go back to like Stone Age level.
01:48:41.520 And I'm sure that if you're creative enough, and if you have knowledge enough, you could come up with Stone Age level tools and techniques or knowledge that would be extremely destructive.
01:48:51.380 For example, if you think about techniques, fire can be hypnosis.
01:48:58.160 Hypnosis works.
01:49:00.060 And if everyone is very good at hypnosis, maybe that's a knowledge that would be socially destructive.
01:49:06.860 And, you know, if you go back to a few thousand years ago, people who knew hypnosis and like openly talked about it, you know, these people were persecuted and sometimes killed.
01:49:18.680 So I don't think that there's any way to say, oh, you know, we can then, you know, you can't, you can't say that hypnosis is a very simple technique.
01:49:28.720 But it's not like high tech, but it can still, you know, have socially destructive impacts.
01:49:34.940 There's no way to like put an arbitrary bar somewhere at a certain level of technology and say, you know, if we don't go beyond this level of sophistication, like electricity or whatever, we're safe.
01:49:45.300 No, it's impossible to be safe.
01:49:47.340 Nothing can save us except wisdom and compliance with the cosmic order.
01:49:52.500 There's no trick, only virtue and, you know, the application of, you know, common sense and, you know, correct thinking to our lives.
01:50:03.300 And there's no trick.
01:50:04.780 And that includes transhumanism.
01:50:06.760 And obviously the stakes are higher, but it doesn't change the fact that it's not a problem that can be solved easily.
01:50:14.940 Because we've always had that problem.
01:50:16.440 Right, exactly.
01:50:17.640 And this is, you know, the Pope banned the use of crossbows against, by Christians against Christians in the Middle Ages, the papacy did.
01:50:26.180 It's like the weapons control, you know, treaties are like not new things.
01:50:34.320 These problems are perennial, right?
01:50:35.860 They'll continue to assault us.
01:50:37.000 But what happens is with high technology, as things get more powerful, these problems get kind of faster and more refined.
01:50:42.520 These historical cycles become more intense.
01:50:46.960 The stakes become higher.
01:50:49.160 And so that's, you know, what we have to contend with is that the stakes now are higher, right?
01:50:54.920 But it's the same things, same shit we've been dealing with forever.
01:50:59.680 Nothing new under the sun.
01:51:01.760 Nothing new.
01:51:03.080 Yeah, and so that's the thing.
01:51:03.900 Just to kind of end on a little bit more of a positive note, just with the technologies that, yeah, you know, there are, you know, massive positives to biotechnology.
01:51:19.160 You know, obviously the ability to restorative medicine is, you know, an obvious and huge component, right?
01:51:24.900 That, you know, people who are crippled or who, you know, suffer from a debilitating illness through the application of technology can be restored to the fullness of life.
01:51:34.000 You know, that war vets, you know, could actually, you know, in exchange for their service, even if they're wounded grievously, could live a reasonable life.
01:51:43.540 You know, you know, could have lost organs and limbs replaced.
01:51:47.220 You could even have, you know, nervous structures, nerve endings and these sort of things recreated.
01:51:56.340 The, you know, for women, mothers, the one that's suggested a lot is with the artificial womb, that like early births, right, could be gestated artificially in these wombs.
01:52:13.020 And new life could be brought to fruition that normally would have perished.
01:52:17.740 And that's not always a good thing, because as we know, corruption has an important kind of eugenic effect upon society, right?
01:52:26.160 Suffering tends to keep us honest, right?
01:52:29.660 Extreme suffering tends to keep us extremely honest.
01:52:33.900 Because we can't ignore the reality of human corruption.
01:52:37.640 These hard limits of the cosmic order serve as strong rebukes to our own personal delusions.
01:52:44.680 And so there is, this is one of the problems that we run into, as Sager suggested, is that whenever we have this, whenever we can separate the human condition from the natural environment radically with technology, we run the risk of decadence.
01:52:59.640 Because of human beings' tendency towards corruption, if we don't have any checks that bring us back to reality constantly, we need it all the time, you know, we will descend into madness, into delusion, and ultimately die.
01:53:13.480 That's what it's about. We will die.
01:53:14.640 Yeah. So, ultimately, I think the wise application of science and technique and knowledge is not in, it's ultimately in allowing us to live in more hostile environments.
01:53:31.220 Like, extreme, like, far north, underwater, in space, in other planets, and things like that.
01:53:38.560 Not in making our lives in our current environment easier, but rather in, like, expanding the domain of possibilities for humanity, but not making our lives easier.
01:53:49.480 So, that's how I can conceive that technology can be good in that sense.
01:53:54.580 Right. Exactly.
01:53:55.320 You know, certainly, you know, I'm not opposed at all to space travel and Earthsea exploration.
01:54:00.520 I think these are very good things, but, you know, do you even begin to consider them, obviously.
01:54:05.200 Yeah, we've come to the end of our show, so I think we're going to hit Kali Yuga News.
01:54:10.280 Let's hit the good parts.
01:54:11.860 And finish up our episode.
01:54:13.460 So, keeping it topical, there was a piece of Kali Yuga News recently that was extremely, extremely pertinent to our discussion today.
01:54:23.180 So, I'll begin with this one from Russia today.
01:54:26.700 Artificial wombs successfully passed first test.
01:54:30.720 Human trials could begin within three years.
01:54:34.220 Scientists have successfully developed and tested an artificial womb capable of supporting prematurely born lambs for periods of up to four weeks in a landmark development that could dramatically reduce the risks posed by premature births in humans.
01:54:48.140 Lambs born at the equivalent of 23 weeks into the human gestation period have been kept alive in a transparent vessel, or bio bag, that serves as both a womb and an incubator for periods of up to four weeks after their initial premature birth.
01:55:02.660 The research was conducted by a team of doctors and scientists led by Alan Flake, a fetal surgeon at the Children's Hospital of Philadelphia.
01:55:09.500 The team's findings were published in the journal Nature on Tuesday.
01:55:11.940 All right.
01:55:15.760 So, we've just spent the last two hours talking about this.
01:55:19.380 But just to demonstrate to our listeners that if you haven't been paying attention, this technology is here.
01:55:25.300 We're not talking about a future problem.
01:55:27.160 We're talking about a right now problem.
01:55:29.980 Okay?
01:55:30.260 A current problem.
01:55:31.280 Just like automatization, all of the necessary prerequisite technology in order to do these sort of things exist.
01:55:37.640 And we've covered other stories where we've talked about, like, you know, sexbots, right?
01:55:42.840 And I've made a comment that I think what we're going to see is we're going to see, like, sexbots with artificial wombs.
01:55:48.280 Ugh!
01:55:49.420 Yes, yes!
01:55:50.280 It's just that's what it's going to be.
01:55:52.020 You know what?
01:55:52.040 This is hilarious.
01:55:52.800 This is super hilarious.
01:55:54.060 Well, yeah.
01:55:54.420 You're going to get your custom-printed, you know, anime 2D waifu sexbot, right?
01:55:58.900 With the artificial womb, you know, with the X chromosomes of your choice, right?
01:56:04.000 And so, you know, you put in the ovum in the womb, right, from a woman that you want, and, you know, bang your sexbot.
01:56:10.140 That's what it's going to be, for sure.
01:56:11.460 And this is rather pathetic as well, because, I mean, let's say a relationship, right?
01:56:16.760 A normal health relationship, that is also supposed to make both the parties good, right?
01:56:21.940 It's supposed to be like two pillars leaning on each other and supporting each other, right?
01:56:25.940 I mean, Plato had this idea of not a piece of the puzzle, right?
01:56:29.120 And if you, you know, put them together, you become whole again.
01:56:32.660 And he ironically had this mentality that, you know, that was a sort of titanic fall, but whatever.
01:56:37.940 The point is that it's supposed to lead you towards that, which is good, towards God, right?
01:56:45.440 But the sexbot thing, it's just so perverse, because this isn't even another human being.
01:56:49.860 It's just technology, right?
01:56:51.280 And it's, it doesn't even have a soul, right?
01:56:55.480 And you're getting this erotic love for an object which does not have a soul.
01:57:01.000 And therefore, how is this, in a way, more different than sodomy?
01:57:06.180 I mean, in a way, it is sodomy, in a way.
01:57:09.040 On the other hand...
01:57:09.940 You're not making love with death itself.
01:57:11.760 On the other hand, thoughts blown the fuck out.
01:57:17.840 Yes, there's that.
01:57:19.800 That's the only other people who are...
01:57:21.240 Yeah, you'll see the two people bitching about it is us, and the thoughts, because they know they're BTFO-ed, right?
01:57:27.120 It's an interesting...
01:57:28.140 But legitimately, there are, like, feminist scholars who have written, you know, lengthy diatribes against, you know, sexbots and transhumanism.
01:57:36.220 And these sort of artificial womb technologies.
01:57:38.120 Because they realize that it removes their sexual marketplace value control over men, right?
01:57:44.780 Is that when, you know, when the pussy power is gone, to be vulgar, you know, women lose a lot of their coercive ability over men.
01:57:54.240 Yeah, definitely.
01:57:55.120 Yeah, it's just straight, straight up, right?
01:57:57.620 You know, and so that's the thing is, like, if guys are already opting to jerk off to 2D anime porn, and, you know, instead of dating actual women,
01:58:07.320 because of their own iniquity and the garishness of, you know, the modern thought, right?
01:58:12.700 And how much more will the coming class of technocratic elite prajeet needs desire to have their Japanese schoolgirl, you know, lowly waifu with their premium frozen female eggs to fertilize, right?
01:58:30.500 I mean, it makes sense.
01:58:32.440 They're responding to these environmental pressures.
01:58:36.540 And, I mean, let's be serious.
01:58:38.820 How many of them will even have a womb?
01:58:40.740 Because people don't even want that responsibility, right?
01:58:44.380 They just want, you know, to empty themselves and have a good time, and, you know, calm is calm.
01:58:53.480 That's the mentality of the day, right?
01:58:55.200 Speaking of having a good time.
01:58:56.180 We don't want families.
01:58:57.640 We don't want anything wholesome.
01:58:59.620 All we want is to drown in this pleasure.
01:59:03.400 And then, of course, blame God that we're not happy.
01:59:06.840 Yes.
01:59:07.140 Oh, boo-hoo, it's God's fault for giving me free will and then not intervening as I continuously run into this brick wall.
01:59:14.800 Oh, boo-hoo-hoo, poor me, poor me.
01:59:18.680 I want a tyrant instead.
01:59:20.520 Oh, this Antichrist guy, he seems nice.
01:59:22.620 He's forcing me to be a kind of decent-ish person.
01:59:26.780 I should follow him and worship him as God.
01:59:30.780 Griva, why don't you select the next article and read it to our listeners?
01:59:34.060 How about I don't for a moment and you do it because I'm too tired to read them all?
01:59:39.740 I understand.
01:59:41.120 Your, uh, weariness has reduced you to illiteracy.
01:59:46.380 Mm, yeah, kind of.
01:59:48.920 Now.
01:59:49.640 Well, I can read the next one.
01:59:51.240 Please, go ahead.
01:59:52.820 Uh, I guess I'll go with the first one.
01:59:55.360 Yeah.
01:59:55.720 Let's see.
01:59:59.720 Oops, there's a delay.
02:00:03.100 Okay.
02:00:04.060 When globalization brings brain-invading worms,
02:00:08.740 the parasite that causes rat lungworm disease is now endemic in the southeastern United States,
02:00:14.660 and it's expected to spread northward.
02:00:18.320 There's a long, grim history of infectious diseases crisscrossing the globe aboard giant ships.
02:00:24.020 Explorers looking to set up new colonies carried smallpox, measles, and other deadly viruses with them to distant lands.
02:00:32.340 Even the vessels and blah, blah, blah.
02:00:34.960 Anyway, the point is...
02:00:36.340 The rat worm has long been prevalent in parts of Asia and the Caribbean.
02:00:42.900 The first human case of the disease was recorded in Taiwan in 1944, but only recently has it been identified routinely in the United States, including in Hawaii, California, Alabama, Louisiana, Florida, and elsewhere along the Gulf Coast.
02:00:58.080 Hmm, let's see, what's causing this?
02:01:00.960 So, it's a worm infection introduced into North America through globalization, said Peter Hotez, the dean of the National School of Tropical Medicine at Baylor College of Medicine.
02:01:12.240 Hmm, some suggest that it's due to snails or slugs in the ship ballasts, ships coming from Asia and going through the Panama Canal.
02:01:22.740 Transmission to humans often occurs when people eat intermediate hosts.
02:01:26.960 A tiny, translucent slug might be imperceptible on a leaf of lettuce that wasn't adequately washed, for example.
02:01:34.240 Even the slime left behind by an infected slug carries a transmission risk.
02:01:39.080 Hmm.
02:01:40.640 Yeah.
02:01:41.160 So, what does this thing do?
02:01:44.500 Hmm.
02:01:45.660 I have parasitic meningitis, said Tricia Minar, a Maui woman diagnosed with the disease.
02:01:54.540 The parasites are in the lining of my brain moving around.
02:01:58.320 Hmm, that sounds very pleasant.
02:02:00.420 So, it's basically some sort of brain-eating disease or something like that.
02:02:06.380 Yeah, well, she says,
02:02:07.180 Minar described her pain from the disease as worse than childbirth, saying it feels like somebody opens up the top of my head, sets a hot iron inside my brain, and then pushes the steam button.
02:02:17.720 Hmm.
02:02:18.520 Sounds pleasant.
02:02:19.600 Yeah, indeed.
02:02:20.540 Well, it's...
02:02:22.020 I mean...
02:02:22.900 Globalization, Goem.
02:02:24.520 Globalization, indeed.
02:02:25.360 I mean, these are the...
02:02:27.200 This is the product of chaos, right?
02:02:29.060 And, I mean, this is...
02:02:30.900 Right.
02:02:31.120 This is not, again, you know, new problems...
02:02:33.720 Old problems.
02:02:34.780 Old problems.
02:02:37.280 Yeah.
02:02:37.640 Well, it's the availability of, like, oil and how cheap it is to transport things.
02:02:45.220 I mean, it makes no sense to, like, buy lettuce grown on the other side of the world.
02:02:50.400 It's only because oil is so cheap and, like, we have so much power we can use for stupid reasons.
02:02:57.440 I mean, it makes no sense at all to carry, like, all of our stuff around the world when we could just make it, like, locally.
02:03:04.020 Um, it's really, I think, um, cheap energy is probably the biggest problem that we have.
02:03:12.820 It's causing all these other, uh, globalization-based problems.
02:03:17.640 Yeah.
02:03:17.860 If we didn't have that, we could avoid a lot of our problems.
02:03:20.460 Well, that's right.
02:03:21.320 Because it causes us to behave irrationally, right?
02:03:23.920 Because it destroys these, uh, environmental barriers to, um, like, our wants.
02:03:28.580 You know, and so it's like, um, you know, Mexico, uh, and some of these other places in the world,
02:03:33.820 have three or four growing seasons.
02:03:35.200 They can grow food all year round.
02:03:36.580 And so they're constantly in season.
02:03:38.840 And so you could import avocados from Mexico all year.
02:03:43.160 Um, you know, but it might not necessarily be a prudent idea.
02:03:48.520 Um, but if the, as you say, if the energy is cheap enough, then the transportation costs are very, very low.
02:03:54.420 There's no reason not to do it if you've got the money, right?
02:03:56.940 Aside from the fact that it's hugely inefficient.
02:03:58.680 Well, it's kind of funny.
02:04:01.320 I mean, we can't even grow our own food in Sweden anymore.
02:04:04.380 We're totally dependent on trade, on import, on globalization.
02:04:09.800 And you kind of create this sort of, um, what's it called?
02:04:14.660 Trade net, right?
02:04:16.460 And if you kind of try to plug out from this trade net, well, you're kind of screwed, right?
02:04:20.820 Because now we don't even have industry for our own people to get food, right?
02:04:26.980 Yeah.
02:04:27.780 The system has created a situation where we're very weak and very vulnerable.
02:04:34.780 The, the, the environmental pressure of having, uh, expensive energy means that we become robust and tough, you know, to adapt to that situation.
02:04:45.120 But cheap energy makes us weak and, um, you know, very vulnerable.
02:04:50.380 And that's the problem.
02:04:51.780 Right.
02:04:51.960 Well, in a sense, it's, if you, the fasting is, does this on the micro level, right?
02:04:55.680 When you fast, you deprive yourself of personal energy.
02:04:59.820 You, you remove the cheap energy of the surface of food from your life.
02:05:04.540 And you condition yourself to, your, your soul to behave, uh, to operate in a body that's on a limited energy.
02:05:12.460 Which is, you know, how we are, uh, in nature.
02:05:16.200 We have a limited, um, good.
02:05:21.640 So, yeah, it's just, uh, really, you know, not really much to say to it.
02:05:26.560 Is this going to continue to happen?
02:05:28.300 We're going to continue to get, you know, brain invading parasites from strange, dark places of the world.
02:05:33.580 That we have no defense or protection against, um, as long as we continue these, uh, globalist trade practices.
02:05:40.280 Obviously, there's still issues from a national perspective, but there's a little bit higher level of control over it, right?
02:05:47.320 Uh, yeah.
02:05:48.300 And so on.
02:05:49.720 So, the kind of, feeding right into this, the next article I wanted to cover was, uh, from Russia Today.
02:05:56.160 Agricultural mega-merger could make China leading GMO producer.
02:06:02.260 A $43 billion takeover deal that would merge Chinese state-owned agriculture company ChemChina and Swiss-owned seed company Syngenta is expected to turn the world's second largest economy into a biotech titan.
02:06:16.120 In recent weeks, the deal has been approved by EU and U.S. authorities.
02:06:20.980 Once closed, it will be China's biggest overseas acquisition.
02:06:24.440 It will create the world's largest farm business oligopoly, concentrating agricultural power in the hands of the three countries, the U.S., Germany, and China.
02:06:34.140 Now, yeah, so basically, this is, um, I think this is, you know, this is important on many, many, many different levels.
02:06:43.820 On the one hand, it demonstrates very eloquently, um, the nature of globalization, where, you know, international food production begins to be monopolized by,
02:06:52.520 they use this interesting, the, uh, oligopoly, the oligopoly, the, the food production begins to be monopolized by the oligopoly.
02:07:02.300 And so, you know, national sovereignty is, by definition, removed.
02:07:06.780 I mean, because if you're beholden to somebody else for your bread, for your people not to starve, you know, you're not autarkic, obviously, you know, they can do whatever they want to you, right?
02:07:14.740 But more than that, you know, we, there's also the element of biotechnology as well, the, these kind of, um, you know, plant-based resources.
02:07:21.960 You know, we see this a lot in the United States with, with Monsanto and these other, um, biotech corporations where they're trying to, um, like, copyright the intellectual capital, you can call it that, like, for seeds.
02:07:35.940 Like, they try to copyright seeds and clamp down on genetic capital that's available for, for farmers.
02:07:42.360 Yeah, I, I heard a funny story about that, by the way.
02:07:44.600 If you, uh, as a farmer, have this little GMO seed that, um, blows with the wind and lands on your ground, that's a sort of copyright infringement.
02:07:57.400 And basically, you'll get sued and lose your farm.
02:08:00.940 Uh, yeah, exactly.
02:08:02.120 There's, um, some interesting documentaries about this in the United States where there's these crazy scenarios where, um, you know, farmers will be recycling their seeds as you might expect them to do.
02:08:12.680 And, you know, if they have one, um, you know, copyright Monsanto GMO seed, they can be sued hundreds of thousands of dollars because they violated their contract that they signed when they bought the seed.
02:08:23.800 Even if they never, the farmer who's recycling the seed never bought from Monsanto, right?
02:08:28.520 The seed blew into their, their, um, their field somehow, right?
02:08:32.660 And so on.
02:08:34.020 Yeah, it's obviously an absurd situation.
02:08:37.000 Uh, and, you know, the people responsible rightfully should be executed.
02:08:40.680 Truly, without a doubt.
02:08:41.820 Well, it's just, and this is the thing, right?
02:08:44.920 It's about control, right?
02:08:46.580 If you can control these basic building blocks of human life, this, the, the, the genetic capital of our agriculture, then you can, um, you have an enormous, enormous amount of control over human activity.
02:09:00.100 Because, you know, even beyond oil, you know, food is our, our, our core energy, right?
02:09:03.880 We need it for personal activity and living.
02:09:05.560 And so if you have this, this, this grip on that production, right?
02:09:11.100 Then obviously you have, uh, a terrible and, uh, in great power.
02:09:15.900 Still, I mean, what, what's the worst-case scenario that could happen, right?
02:09:19.980 The, uh, sort of this, this insanely titanic totalitarian anti-Christian system take over.
02:09:26.040 Or you do not take its marks or you starve to death.
02:09:29.140 And, well, then you're in heaven, right?
02:09:30.980 So we shouldn't be too worried.
02:09:33.540 Yeah, well, yeah.
02:09:34.300 I mean, if you're a Christian, that's the case.
02:09:35.740 But obviously, you know, that's not something we want to see brought about.
02:09:39.280 I mean, if you're a Christian, you would consider other views errors.
02:09:45.600 So why does that matter to begin with?
02:09:48.660 So there you go.
02:09:50.180 Or maybe I'm just babbling because I'm tired.
02:09:51.800 Well, you know, that's, that's ridiculous, right?
02:09:54.480 I mean, because, you know, we have to try and avoid evil.
02:09:58.320 Yeah.
02:09:59.200 This is not even speaking of the fact that, um, like, these people have no idea what they're
02:10:04.620 doing.
02:10:05.240 Right.
02:10:05.820 The, the, the biotech, like the genetic engineering.
02:10:08.620 I mean, I'm not completely opposed to it on principle, but I think we're like several
02:10:13.340 hundred years too early to be actually putting this stuff into application.
02:10:17.820 Like we have no idea how any of this stuff works.
02:10:20.400 I mean, it's, it's like we're, we're playing with, um, alien technology.
02:10:25.860 That's thousands of years more advanced than us.
02:10:27.820 And we're kind of just, uh, you know, taking chunks from it and, you know, duct taping it
02:10:33.340 to other things like these, these, um, genetically engineered seeds.
02:10:37.380 They're not like carefully crafted, uh, you know, by architects who understand how genetics
02:10:42.740 work there, they're, they're just copy pasting stuff from various organisms together and
02:10:47.720 seeing like what sticks.
02:10:49.540 It's completely insane.
02:10:51.300 Um, we, we should not be dealing with this stuff.
02:10:54.120 I mean, it's fine to maybe experiment with it, I guess.
02:10:56.760 I mean, obviously it's very dangerous, but I mean, the fact that we're already putting
02:11:01.640 this into practical application.
02:11:03.300 I mean, this is, it's unbelievably irresponsible.
02:11:07.220 Well, this is the thing, and this is what it comes back to.
02:11:09.280 And I was thinking about this, right?
02:11:11.100 You know, so we asked, you know, best case scenario, we get what we want.
02:11:14.140 We have, um, you know, a serious regime.
02:11:16.960 What's the real difference between, uh, you know, us invest this version of us and our enemies?
02:11:23.240 Well, it's that, you know, there's a rational government, right?
02:11:27.080 Wisdom, there's a, uh, at least an attempt, you know, to govern and make policy decisions
02:11:33.000 and guide the course of the nation with like the common good in mind, you know, utilizing
02:11:38.920 reason in order to achieve that, right?
02:11:42.820 That's a goal at large.
02:11:44.400 And so that's the thing is these, you know, our, our enemies are like, you know, they're,
02:11:48.740 they're taking the, uh, you know, like the, the paintbrushes of the gods and doing finger
02:11:53.140 painting with them basically, you know?
02:11:55.980 And we had, it's just, it's, uh, it's like out of a science fiction, but really it is,
02:12:00.960 you know, it's, um, incredible, incredible.
02:12:06.180 Yeah.
02:12:06.700 Again, like the, the, the, the biology, like, uh, physical bodies of animals and plants and
02:12:12.980 humans and all that, um, you really have to understand that it's basically like, uh, technology
02:12:18.740 that's thousands and thousands of years more advanced than what we can do now.
02:12:24.240 I mean, it's not, that's, that's another thing about transhumanism that, um, we haven't
02:12:29.300 talked about is that there's this silly notion that we can make like metal, uh, and like
02:12:37.580 silicon parts that will be better than our own biological parts.
02:12:43.920 And that is an absurd like statement that, that completely, uh, that completely underestimates
02:12:50.200 how, uh, powerful our own bodies are.
02:12:54.320 Like we can't, we can't make a metal arm that's stronger than a human arm.
02:12:59.200 Like that's not, that's far beyond our technology.
02:13:03.100 Um, plus, I mean, if you would, let's say you chop off your own arm and try to replace
02:13:08.340 it with another arm.
02:13:09.620 I mean, that, that is literally a mortal danger because you're separating your body from your
02:13:13.580 soul.
02:13:13.920 But if you're, if you're like, you know, had an accident and you try to sort of step back
02:13:20.480 up to the ideal body that reflects the soul, then that would actually be different.
02:13:25.880 So, I mean, Florian has a point there, I suppose.
02:13:28.740 Yeah.
02:13:29.220 What I'm, what I'm saying is like, you make an arm out of metal, uh, it's going to be
02:13:34.640 much heavier than a normal arm.
02:13:37.020 So like the, the power to weight ratio is going to be much worse.
02:13:42.040 Um, the, the energy consumption is going to be far worse than what we have.
02:13:46.700 Like our, the animal bodies that like biology is like extremely energy efficient, like far more
02:13:52.940 than anything we can make.
02:13:54.480 And so like the overall performance is like really incredible, far beyond anything we can
02:14:00.020 do.
02:14:00.640 And really some of the most advanced technologies that we're working on really take inspiration
02:14:06.320 from biology.
02:14:07.760 Not, uh, we're, we're thousands of years, uh, too early to be making improvements on, um,
02:14:14.420 on the human body or even the animal bodies.
02:14:17.160 Yeah, exactly.
02:14:18.480 And I mean, this is not, you know, these, these sort of experimentations.
02:14:21.740 I mean, if you read the book of Enoch, you read other, um, it's not just confined to like
02:14:27.560 a Christian scripture, but if you read like the Vedic texts, there's lots of, um, references
02:14:33.680 to, you know, whatever you want to call it, the antediluvian civilization, past high civilizations
02:14:38.680 in the history of humanity that got into these same things and it turned out to be their downfall.
02:14:43.580 Right, the chimera.
02:14:46.800 Right, the, indeed.
02:14:47.800 And so this is the thing.
02:14:48.560 I really do not think that there's anything new under the sun.
02:14:51.560 I don't think that these problems are novel.
02:14:54.080 I think that we're encountering the, the same bullshit that we've always run up against.
02:14:58.000 And so I don't think that, uh, these things should surprise us.
02:15:02.560 Should not surprise us.
02:15:03.620 It's, we just have to understand it, and it has to be a kind of motivation for us, a commitment
02:15:08.360 to ourselves, that in our households there's sanity and wisdom and order,
02:15:14.000 and in our communities and so on, where we can actually implement it.
02:15:17.840 It's what it's about.
02:15:19.000 Always focus on this.
02:15:20.840 Yeah.
02:15:21.460 You know, and this is what it comes down to, and Zyger, you said it perfectly.
02:15:24.820 There's no, um, there's no magic bullet.
02:15:27.660 There's no like secret technique to, uh, replace virtue.
02:15:30.760 That, that's what it comes down to.
02:15:32.920 We just have to be virtuous.
02:15:33.960 And that's it.
02:15:35.300 Yep.
02:15:36.160 And it's just very difficult.
02:15:37.380 Okay.
02:15:37.640 Right.
02:15:37.960 Like the cultivation of virtue, you know, requires a lifetime to gain even a small amount, a small
02:15:44.340 amount of consistent, uh, consistently applied virtue, a very, very precious thing.
02:15:49.960 So it does not mean that we should not strive after excellence.
02:15:53.860 Now, I'm going to hit the last article.
02:15:56.100 I'm coming into the universe.
02:15:56.540 The link is broken.
02:15:58.060 Yes, I know.
02:15:58.620 I put a new link in the, in the mumble here.
02:16:00.500 If you've got it.
02:16:02.900 Yeah.
02:16:03.600 So, uh, we're going to hit the last article.
02:16:05.280 I'm going to finish up here.
02:16:07.800 It's from yahoo.news.com.
02:16:10.040 Syndicated in many different places.
02:16:12.860 Basic Instinct director, Verhoeven, to make film about lesbian nuns.
02:16:18.080 Uh, Basic Instinct director, Paul Verhoeven, you know, he directed Starship Troopers, is
02:16:24.880 making a film about a lesbian nun with miraculous powers who falls in love with another sister,
02:16:31.820 his producer said, Wednesday.
02:16:33.280 The veteran Dutch filmmaker, whose controversial twisted rape thriller, Elle, starring Isabelle
02:16:41.560 Huppert, was a huge critical hit, is working on a drama called The Blessed Virgin.
02:16:48.620 The producers said that it was based on the life of a 17th century Italian nun, uh, Benedetta
02:16:54.440 Carlini, who caused a sensation when she claimed to see visions of Jesus Christ.
02:16:59.200 I mean, I could go on, but it's basically, it's just, it's very, it's straight up, right?
02:17:04.320 It's just, um, you know, they're, they're making ugly what's beautiful.
02:17:08.400 They're defiling what's pure.
02:17:09.800 It's not complicated.
02:17:11.700 So what about this nun?
02:17:13.740 Um, what visions was it that she saw in real life?
02:17:18.220 Well, I didn't, I didn't penetrate into the mysteries too deeply, right?
02:17:22.100 But this is the thing.
02:17:23.240 It's because it's what it's about.
02:17:24.880 It's about creating in people's minds the idea that there can be
02:17:29.620 very high levels of sanctity and personal holiness, uh, that, you know, people can achieve,
02:17:35.260 uh, uh, you know, uh, mystical levels of communion with, uh, the Godhead and enjoy kind of miraculous
02:17:43.720 powers while at the same time having no personal ethics, like we'll engage in, in, in homosexual
02:17:49.440 acts.
02:17:50.940 Right.
02:17:51.480 And this is not, it's a very nice thing as well, because that really cuts God out of
02:17:56.860 the whole idea.
02:17:57.700 Exactly.
02:17:58.020 Exactly.
02:17:59.040 You don't need anything like that.
02:18:00.640 All you need is you to feel good about yourself and have this will to power mentality.
02:18:07.680 Oh, I want spiritual gifts because that will make me cool.
02:18:11.160 I mean, you essentially see the same thing in, uh, Pentecostals, right?
02:18:15.560 You know, this whole babbling.
02:18:16.940 I mean, everyone has gone to one of those meetings, you know what I'm talking about,
02:18:20.000 and it's just absurd to look at.
02:18:22.400 It's scary.
02:18:24.300 And they have this, this real disturbing desire to basically LARP like apostles, right?
02:18:34.160 That's essentially what it is.
02:18:35.280 They want to LARP as apostles, to reenact the apostles and say, oh, I want to LARP like
02:18:40.060 this.
02:18:41.000 I identify myself as an apostle.
02:18:43.300 And then they go on and on about that.
02:18:46.180 It's, well, prelest is a good word.
02:18:49.720 Spiritual deception.
02:18:50.880 Yeah, and that's what it's about, man.
02:18:52.980 And this is, this is what, um, these people are trying to do is, is, is the, it's inversion,
02:18:58.520 right?
02:18:58.840 It's the perversion.
02:19:00.180 They're taking these cosmic, these divine archetypes and they're twisting them on their head, right?
02:19:07.600 They're, they're saying that, you know, one can be unethical, kind of low virtue, but
02:19:13.340 also be, as we say, in communion with the divine.
02:19:16.640 Intimately.
02:19:16.920 I mean, we, we see the same thing here in the Swedish state church as well, right?
02:19:20.880 You know, all these priests who support homosexuality bishops as well.
02:19:25.780 I mean, one bishop is a lesbian.
02:19:28.840 Let that sink in, by the way, bishop being a lesbian.
02:19:33.260 It's a feeble, by the way.
02:19:35.840 And all this stuff.
02:19:36.760 And I, I spoke to, to, um, an Orthodox priest a while ago, and he said something along the
02:19:42.900 lines of, uh, now I lost my train of thought.
02:19:46.940 I had it clearly.
02:19:50.700 Just give me a second.
02:19:51.760 Well, Paul Verhoeven, like, uh, has experience, like, inverting morality, you know, his, his
02:20:00.020 film, Starship Troopers, like, it's based on a book that was a pro-fascist book and
02:20:05.640 he flipped it around and made it, uh, an anti-fascist, uh, movie.
02:20:09.620 So.
02:20:10.080 Yeah.
02:20:10.400 It's, it's so absurd.
02:20:11.200 He hated it.
02:20:11.640 He didn't even read the book.
02:20:12.960 No, he didn't read it.
02:20:14.240 Right.
02:20:15.140 Hey, that's the thing.
02:20:16.260 You know, I mean, it's just kind of a funny movie.
02:20:17.960 Um, but it's just, it's, it's ridiculous.
02:20:21.220 Starship Troopers is one of my favorite novels and the movie is just, it's like a joke, you
02:20:25.780 know?
02:20:27.640 Um, but yeah, I mean, what's your, uh, your reading?
02:20:32.460 Oh, right.
02:20:33.140 I, I, I just, I just remember what he said again.
02:20:35.100 And he, he met this, this, this, this guy that, you know, had tons of sex outside marriage
02:20:40.960 and this kind of stuff.
02:20:42.060 And, you know, when he came up to take, you know, communion, he said, have you confessed
02:20:45.880 all your sins?
02:20:46.700 Do you want to go to confession?
02:20:48.500 And the guy said, no, I don't want to go to confession.
02:20:51.700 So the priest just said, nah, I'm not going to give you communion.
02:20:54.640 Get out of here.
02:20:55.400 Get out of here.
02:20:56.580 Get him out of here.
02:20:57.600 You know, that, that was pretty funny.
02:20:59.340 Um, but yeah, I mean, these, these movies are just, it's just horrible.
02:21:04.580 You know, we don't really have any sort of instinctive decency anymore.
02:21:08.020 Right.
02:21:08.860 Yeah.
02:21:09.120 But these movies, they're, um, psychological warfare, basically.
02:21:12.960 Well, that's waged against the population, right?
02:21:14.200 Yeah.
02:21:14.380 Um, no, that's the thing, no, no, I think that's the thing that's accurate.
02:21:17.220 When you don't have the connection to the organic truth anymore.
02:21:18.900 Right.
02:21:19.300 Well, exactly.
02:21:20.380 Well, this is the thing, right?
02:21:21.320 And when you only have the psychological, you can easily deconstruct it like this.
02:21:24.740 Right.
02:21:25.060 Exactly.
02:21:25.260 So people, you know.
02:21:25.980 And that's what's so scary.
02:21:26.380 That's it.
02:21:26.880 Don't interrupt me, Florian.
02:21:28.560 Sorry, Hans.
02:21:29.020 Anyway, I'm done now, so now you can go ahead.
02:21:31.340 Thank you, my friend.
02:21:32.240 Thank you for, uh, releasing the authority to me.
02:21:35.980 No, I think that, yeah, this is the thing is, you know, most young people, they've never
02:21:38.800 met a nun in real life.
02:21:40.520 They've never been to a convent.
02:21:41.560 They've never met a monk.
02:21:42.880 Right.
02:21:43.040 They don't know anything even really about Christianity.
02:21:45.680 So when you make movies like this, you insert into people's minds the image of monasticism
02:21:51.020 as basically being these lesbos, right, which is just, it's absurd, right,
02:21:57.000 on its face if you know anything about Christianity, right?
02:21:59.880 And it's just, um, and that's what it's about.
02:22:03.720 It's about deconditioning and destroying the traditional expressions of European mysticism
02:22:09.140 and the spiritual, uh, uh, pursuit of the spiritual way, right, where, you know, people
02:22:15.280 who in the past who were dedicated, who were interested in, uh, obtaining true transcendence,
02:22:20.200 they wanted to actually, um, move beyond the limitations of their human form, obtain
02:22:25.140 immortality, beat corruption, you know, dedicated themselves to monasticism, to this pursuit
02:22:30.260 of the spiritual life, communion with the divine, would be a holy, laudable, and noble thing,
02:22:36.020 um, which there certainly isn't enough of.
02:22:38.160 But this is what these, this movie tries to do is it takes this, right, which is already
02:22:43.080 so foreign and detached from most people's minds, and it, you know, drags it through the muck
02:22:47.420 and presents its own kind of juniorized version of this story, uh, to, uh, modern audiences.
02:22:54.560 Exactly.
02:22:55.140 I've, I've heard another funny story.
02:22:56.480 There was this movie team, and they were going to, they were going to try to do something
02:23:00.900 like that at a monastery, like, oh, these, these monks, they're all homos.
02:23:04.640 It doesn't feel good to say that, but the point is, so what happened, right?
02:23:08.700 The, um, uh, the monks, they invited the team in, right, and basically now the team was forced
02:23:15.980 to stay there and, you know, hear their prayers and, you know, participate in liturgy and do
02:23:21.100 all that stuff.
02:23:22.680 So basically what happened was that while the TV team just wanted to try to paint the
02:23:29.020 monks in the monastery as a sort of dirty house, what happened was that the TV team was forced
02:23:35.500 to see something pure and wholesome for once, and they didn't really find the filth they
02:23:40.220 thought they were going to find.
02:23:42.060 Yes.
02:23:43.060 Well, we've, thank you, Ruda, you've rejoined us, that's fortuitous, we've come to the end
02:23:47.220 of our podcast for tonight.
02:23:48.920 So, um, to all of our listeners, thank you for joining us.
02:23:52.840 I'm your host, Florian Geyer.
02:23:55.180 It's a pleasure to be back as always.
02:23:56.580 Christ is risen.
02:23:58.300 Indeed, joining me this evening, I had my co-host, Greva Hans, a very tired Swede.
02:24:04.160 Thank you, my friend, for joining us.
02:24:06.640 Yes, good to be here.
02:24:07.680 Also, Death to Gnostics again.
02:24:09.940 Ha!
02:24:11.020 Joining me as a special guest, back on the panel, Rude Rathont, Rude Pace, for popping
02:24:15.860 on right at the end to, uh, graciously give your, uh, your exit.
02:24:19.960 Thank you, and God bless you, God bless you, all, all listeners.
02:24:23.900 Excellent.
02:24:24.860 And once again, on the show, we had Zyger.
02:24:27.580 Uh, Zyger, thanks for your analysis.
02:24:29.200 It's always good to have you with me.
02:24:30.220 I do really enjoy discussing this kind of stuff with you.
02:24:32.480 Good, great insight.
02:24:33.060 Yeah, thank you.
02:24:34.660 I had a good time.
02:24:35.920 A real pleasure.
02:24:37.280 To all of our listeners, thank you for listening.
02:24:41.080 Shalom.
02:24:41.400 Hey, ja, ho, where then was the nobleman?
02:24:56.380 Hey, ja, ho, spears ahead, hey, up and at them, set the red rooster on the monastery roof.
02:25:07.440 Hey, ja, ho, where then was the nobleman?
02:25:09.700 Hey, ja, ho, ja, ho, many a one leapt over the blade.
02:25:29.040 Hey, ja, ho, ja, ho, ja, ho, ja, ho, ja, ho, ja.
02:25:32.320 Hey, ja, ho, ja, ho, ja.
02:25:33.460 He, ja, ho, ja, ho, ja.
02:25:35.720 Hey, ja, ho, ja, ho, ja, ho, ja, ho, ja, ho, ja, ho, ja, ho, ja.
02:25:39.140 Hey, ja, ho, ja, ho, ja, ho, ja, ho, ja.
02:25:48.840 Come on, come on, go on and go on
02:25:52.460 Put the church on the roof on the roof
02:25:56.260 Come on, come on, go on and go on
02:26:00.340 Put the door on the cross, the Herrn Kaplan
02:27:45.800 Hey, ja, ho, smear die Leiger on the church door
02:27:53.380 Hey, ja, ho, spears ahead, hey, run and at them
02:28:01.180 Then we'll meet again, Gehalte hin dann
02:28:42.920 Beaten, we still march out
02:29:01.740 Hey, ja, ho, ho, and the grandsons will fight it out better
02:29:17.440 Hey, ja, ho, ho, ho, ho, ho