RadixJournal - April 10, 2021


Where Are They?


Episode Stats

Length: 41 minutes
Words per Minute: 163.976
Word Count: 6,724
Sentence Count: 432
Misogynist Sentences: 4
Hate Speech Sentences: 14


Summary

In this week's episode, the boys discuss the dangers of contact with alien life, and whether or not we should be worried about it. Plus, a discussion about vaccines and chocolate and the dark side of the moon landing.


Transcript

00:00:00.000 Michio Kaku, that's his name.
00:00:01.820 Michio Kaku, I have a book by him here.
00:00:03.640 He thinks we will discover alien life in the next 100 years
00:00:07.540 because he's very, very optimistic.
00:00:10.380 He writes books with titles like The Physics of the Impossible.
00:00:13.440 But he thinks that if we do so,
00:00:14.840 we should probably be quite cautious about it
00:00:17.860 because they might wipe us out.
00:00:19.320 And he gave the example of Cortez
00:00:21.080 and how Cortez was welcomed by the king of the Aztecs.
00:00:24.080 And then Cortez just displaced the king of the Aztecs
00:00:27.260 and they all died of disease and whatever
00:00:29.680 and were generally conquered.
00:00:31.400 So the Aztecs didn't do very well
00:00:33.000 from coming into contact with a superior civilization
00:00:35.560 and nor might we.
00:00:39.160 Oh, of course not.
00:00:40.200 No.
00:00:40.940 And that's what's got us thinking about this, folks.
00:00:50.240 All right, Ed, how are you?
00:00:54.040 Yes, all right.
00:00:55.140 Did you have a good Easter?
00:00:56.760 I did, yes.
00:00:58.060 I went to church.
00:00:59.680 You know, I'm an Episcopalian, as you know.
00:01:02.840 Kind of the ultimate Episcopalian
00:01:04.460 in the sense that we don't actually believe.
00:01:08.700 No, the ultimate Episcopalian simply believes in WASPs.
00:01:12.560 Yeah, exactly.
00:01:13.400 He believes in and worships WASPs.
00:01:15.960 That's it.
00:01:16.440 Yeah, exactly.
00:01:17.500 We are WASP gods.
00:01:19.500 No, it was very nice, and I ate way too much candy.
00:01:23.180 I'm kind of starving myself for the next couple of days because I ate so much candy
00:01:29.720 that by 5 p.m.
00:01:31.880 I was on the verge of total collapse.
00:01:35.660 Right.
00:01:36.140 But my experience of American chocolate is that it doesn't have any sugar in it anyway.
00:01:42.060 So I don't suppose that it would have had that much of an effect, would it?
00:01:46.280 Well, we did have Cadbury cream eggs.
00:01:48.720 You would be happy.
00:01:49.040 Oh, do they have those in America? And do they taste like Cadbury cream eggs?
00:01:51.940 Oh, okay.
00:01:52.660 Well, I think so.
00:01:53.920 I love them.
00:01:55.000 They're only available around Easter time, but I love those.
00:02:00.020 Yes, they are.
00:02:00.600 They are nice.
00:02:01.380 There was a boy on my road when I was about 10, and we used to call him
00:02:05.940 Cadbury cream egg because he had brown hair, but with like a white circle of hair at the top.
00:02:11.280 So his nickname was cream egg.
00:02:13.800 That's nice.
00:02:14.800 Poor chap.
00:02:15.980 So have you been vaccinated yet?
00:02:18.520 No.
00:02:19.940 Have you?
00:02:21.060 I've taken my first dose.
00:02:23.200 Yes.
00:02:23.320 Oh, no, no.
00:02:23.900 We don't have any of this up our way because we don't have much Corona up our way.
00:02:28.700 So there's no sort of pressure to get vaccinated.
00:02:32.220 I haven't received anything about this at all.
00:02:34.480 I don't know anything about it.
00:02:35.140 Interesting.
00:02:36.760 I took the first dose.
00:02:39.280 I think I did the Moderna.
00:02:40.820 So I was a little bit sore in my arm, kind of weirdly.
00:02:46.240 Never been sore from a shot for that long.
00:02:49.420 But I mean, not that it was a major thing.
00:02:51.900 But yeah, no side effects.
00:02:55.240 Well, except, of course, Bill Gates now knows everything you think.
00:02:58.900 Well, I mean, if we want to reach a stage three civilization, microchips in our heads controlled by overlords is step one.
00:03:09.620 I mean, your former fellow Montana resident, Mr. Kaczynski, who, of course, lived outside Lincoln, Montana, would not be happy about this at all.
00:03:24.640 I mean, this is exactly his nightmare coming true.
00:03:27.600 And the only solution would be for him to send parcel bombs to random caretakers.
00:03:33.280 Yeah, absolutely.
00:03:34.400 Well, as you know, once you get a vaccination, Windows 95 basically plays in your consciousness.
00:03:42.480 Now, some people think that's a side effect.
00:03:44.560 I think it's a feature.
00:03:46.500 It's excellent to wake up in the morning and know it's time.
00:03:51.980 Anyway, should we move away from this nonsense and move on?
00:03:57.200 All right.
00:03:57.600 So this is going to be an offbeat, off the beaten track discussion.
00:04:04.200 Or maybe it's not.
00:04:05.420 Maybe it actually gets to the heart of what we care about.
00:04:08.600 But so you sent me an interesting article about this scientist.
00:04:13.940 And he's really a public scientist.
00:04:15.440 He's kind of a Carl Sagan of our day.
00:04:18.540 Michio Kaku, whose book you're holding.
00:04:24.640 I have actually never read one of his books.
00:04:26.420 I probably should.
00:04:27.320 I've seen a lot of his interviews and videos.
00:04:30.000 And he seems like a, you know, enlightening and also generally kind of fun guy as well.
00:04:37.300 I do consume a lot of popular science books, but I haven't read one of his.
00:04:42.020 But he basically gave both optimism and a warning, which is that we likely will meet intelligent life out there of some kind.
00:04:56.400 But actually, we should be careful what we wish for.
00:05:01.200 So you can go on this and then I have a couple of different things I want to talk about.
00:05:06.960 Fermi's paradox and then stages of civilization.
00:05:10.320 But why don't you go on this?
00:05:12.100 Well, first of all, if he's saying this is going to happen within our lifetimes or whatever, which is what I inferred from the article,
00:05:24.260 then it won't be us doing the reaching.
00:05:27.500 I mean, our intelligence is declining.
00:05:30.080 We're not seriously sending people into space or even contemplating trying to find intelligent life anymore.
00:05:36.600 We've become decadent and focused, therefore, on harm avoidance and on equality and things on Earth.
00:05:42.080 So it would have to be the other civilization finding us.
00:05:45.320 They would have to be much, much more intelligent than us, which is extremely dangerous because they might see us as dispensable and not worth bothering about.
00:05:57.220 And the parallel he gave was when Cortez found the king of the Aztecs.
00:06:04.280 OK, I mean, they were in the ruins of a more advanced civilization, but they weren't a particularly advanced civilization themselves.
00:06:10.420 And as far as he was concerned, they were just rubbish.
00:06:13.280 The king of the Aztecs was quite friendly to him, and he was perfectly content
00:06:19.120 to simply displace him, take over, tear down their shrines, spread all kinds of diseases they weren't used to, and kill them.
00:06:26.220 So it didn't go well, because there was a substantial difference in civilization, and therefore they looked down on them.
00:06:33.460 So I would think that that's going to be tenfold more if we're talking about a species that's capable of inter-solar-system travel.
00:06:44.080 Well, I mean, but have you considered the fact that we invented feminism and they're going to need that?
00:06:50.500 So, you know, they would come with advanced interstellar technology,
00:06:54.580 and they will come away with feminism.
00:06:57.560 So you're welcome.
00:07:00.480 Well, that brings me on to my second point.
00:07:07.200 First of all, for them to be able to do this, it's very likely
00:07:11.400 they would have to be far more intelligent than us.
00:07:15.740 I mean, maybe they would have an average IQ of 120 or 130.
00:07:20.800 Maybe if we had the technology at our disposal that we had in the 60s or now, and we had the same IQ that we had when we began the Industrial Revolution,
00:07:31.100 then maybe we would be doing things like going to Venus, where you couldn't land.
00:07:37.000 It's too viscous. But, you know, just going there.
00:07:43.060 We did have that, but we don't anymore.
00:07:45.600 So that's how much more intelligent they would be than us.
00:07:49.820 And that is a huge difference.
00:07:52.140 Thirty IQ points. That's the difference between us and, you know, primitive tribes in Africa.
00:07:59.160 So that's the first thing.
00:08:01.460 So how would they see us?
00:08:02.380 That doesn't mean they would see us as dispensable, but when we were in our exploration phase,
00:08:08.700 when we were in the phase where we were not decadent, when we were in the autumn of our civilization,
00:08:15.780 when the fruits of the civilization were coming in, we did see ourselves as superior.
00:08:22.740 We did see ourselves as better.
00:08:24.560 We did see these other people as dispensable and beneath us.
00:08:27.460 We had our racial theories and stuff like this.
00:08:29.920 We were into eugenics and all that kind of thing.
00:08:32.380 And so that we would what we approached Africa with that in mind.
00:08:38.420 And that being the case, we were happy to exploit them.
00:08:42.000 We were happy to just take over their lands.
00:08:44.180 We were happy to enslave them, although it should be emphasized
00:08:46.820 that they also enslaved each other.
00:08:48.620 But there we are.
00:08:49.980 We were perfectly happy to do that.
00:08:51.820 We were happy to just treat them as, not a subspecies, but a sort of lesser kind of human.
00:08:59.240 And these were people that were genetically closely related to us, that were genetically similar to us, relatively speaking, as part of a religion which preached that we're all equal in the eyes of God or whatever.
00:09:12.960 I mean, it's later in the 1950s that the Archbishop of Canterbury in England stated when he came back from Africa that all are equal in the love of God, but not in the eyes of God.
00:09:22.960 And he said that with reference to these Africans.
00:09:25.860 So, if they were in that phase of civilization, they would be high in group-oriented values, high in ethnocentrism, high in cold rational logic and classification and all this stuff.
00:09:39.020 And so they'd be perfectly happy to enslave us, or at least to exploit us and take us over and take charge of us, as we did to people with that kind of IQ difference.
00:09:49.440 But they are a different species.
00:09:51.660 So they have no genetic similarity to us at all.
00:09:56.500 Right.
00:09:56.680 One of the things that predicts being nice to people and being kind to people and bonding with people is genetic similarity.
00:10:02.680 So in that sense, we shouldn't compare our relationship to them to the Victorian white man's relationship with black people.
00:10:11.280 It's more comparable to our relationship with dolphins or something.
00:10:15.700 I mean, yeah, when it comes to bees, we don't have any trouble basically taking them out of their natural habitat, putting them into giant boxes and then harvesting their honey.
00:10:29.880 No one has any moral qualms with that outside of the most extreme.
00:10:34.400 I was just guessing on that IQ difference that would be necessary.
00:10:38.680 I mean, it's probably more.
00:10:41.040 It might be more and it might just be different.
00:10:43.780 I mean, okay, here are some – because I want to add something about the intelligence.
00:10:49.180 I actually don't – obviously, intelligence is an indispensable quality of this, but I don't think it's actually sufficient for this kind of adventure.
00:10:58.460 But let me just mention a couple of things.
00:11:02.320 And this is basically Fermi's paradox, which is – can be summed up in the line, where are they?
00:11:08.560 So there are an estimated 100 billion galaxies that we can see out there.
00:11:17.040 But we're separated from them by such distances that for some of them it would actually take millions of years to get there, even if we were traveling at close to the speed of light.
00:11:30.980 So now, might there be some other form of travel that we haven't quite grasped, you know, a kind of Dune-like folding of space or something like that?
00:11:46.060 Maybe, but that is obviously beyond us.
00:11:49.520 But there are – in the Milky Way, there are basically 500 billion stars.
00:11:55.160 So the Milky Way itself, our own galaxy, is huge.
00:12:00.440 And then the estimate from scientists – and so obviously take all of this with a grain of salt, but this kind of gets us, you know, in the ballpark.
00:12:10.620 The estimate is that there are 100 million stars that could have planets resembling, say, our solar system, and that could thus,
00:12:25.160 you know, have life.
00:12:27.320 Now, if only a tiny percentage of these did actually develop through this miracle of life generating on a planet, then there should be people out there.
00:12:43.180 And they've had enough time to evolve and to conceivably begin a colonization project.
00:12:50.580 So it does – Fermi's Paradox is a kind of, where are they?
00:12:53.780 Are we alone in the universe?
00:12:57.000 Because, you know, for life to develop, the odds are against it.
00:13:04.820 It's already a kind of miracle.
00:13:06.220 But there are enough chances for it to develop.
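To make the back-of-the-envelope logic concrete, a minimal Drake-style sketch in Python. The 100 million candidate systems figure is the one quoted above; the three fractions are purely illustrative assumptions, not numbers given in the episode.

```python
# Drake-style back-of-envelope estimate using the figure quoted above.
# The three f_* fractions below are illustrative assumptions only.

candidate_systems = 100_000_000  # stars with solar-system-like planets (quoted above)
f_life = 0.01                    # assumed: fraction where life actually arises
f_intelligence = 0.01            # assumed: fraction of those that evolve intelligence
f_spacefaring = 0.1              # assumed: fraction of those that try colonization

expected = candidate_systems * f_life * f_intelligence * f_spacefaring
print(f"Expected spacefaring civilizations in the galaxy: {expected:,.0f}")
# Even with every "tiny percentage" set this low, the result is ~1,000,
# which is exactly why "where are they?" is a genuine puzzle.
```

Note that the two solutions discussed next effectively multiply in one more factor: the fraction of time a civilization is actually up and running, which makes any two civilizations overlapping in time and place very unlikely.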
00:13:08.240 Go ahead.
00:13:08.620 But there are potential solutions to Fermi's Paradox – well, there are two solutions.
00:13:14.620 One is that these different space-going civilizations will never meet, because they will always go through the cycle of the rise and fall of civilization.
00:13:31.240 Even if you get a species that's intelligent enough to have civilization, it won't have civilization for more than fleeting periods of time, followed by long periods of dark age, and then another fleeting period where it might be possible to do something like that or even get close to it.
00:13:49.160 And so they will never come into contact, because for them to come into contact realistically, in anything other than a destructive way, you'd have to have two civilizations that were in exactly the same sort of place at the same time.
00:14:05.300 And that's very unlikely.
00:14:07.560 The second solution to Fermi's Paradox is simply that the evolutionary forces necessary to achieve that kind of high-level intelligence
00:14:21.840 are never able to go far enough for that kind of interstellar travel, because civilization always collapses before we become that intelligent.
00:14:33.040 When we're in the phase where we might be intelligent enough to do something like that, that's when the breakthrough of the Industrial Revolution takes place.
00:14:42.500 And therefore, by the time we have the sort of technology that might allow us to do things like that, we've become decadent, we've stopped bothering, and we start to go backwards.
00:14:51.280 And then I think the third thing which might be relevant is simply that the nature of intelligence precludes this ever happening.
00:14:58.600 Because the level of intelligence that would seemingly be required to do that would be so massive, and it seems that intelligence is maladaptive when it becomes too high, because what you invest in it you take away from other things.
00:15:17.320 And so people that have super high intelligence – I mean, again, I refer you back to your fellow Montanan Ted Kaczynski.
00:15:24.800 This was a person who had an IQ of 170, something like that.
00:15:29.640 He was super, super duper intelligent.
00:15:32.680 And look at the result.
00:15:35.020 These are not the kind of people that seem to be consistent with building up civilization.
00:15:41.080 High intelligence is associated with autism.
00:15:44.720 It's associated with mental illness.
00:15:46.340 It's associated with all kinds of things being wrong.
00:15:49.520 So I don't think we could ever become intelligent enough.
00:15:53.440 Or if we could become that intelligent, it would be in the wake of an appalling ice age.
00:16:00.140 And therefore, at the end of that appalling ice age, we would be sort of hunter-gatherers or farmers or something like that.
00:16:05.680 And it would take us time to get the technology.
00:16:10.440 And by the time we got the technology, our intelligence would be falling.
00:16:14.120 So I think there are a number of reasons why it just can't happen.
00:16:18.180 Yeah.
00:16:18.400 Well, there do seem to be these recurring tendencies.
00:16:23.680 And we can talk about spiteful mutants and so on.
00:16:26.900 But if you look back at world history, there's something like the Bronze Age collapse.
00:16:33.200 Remember, we shouldn't underestimate the intelligence and engineering genius of the distant ancient world.
00:16:45.520 In the sense of these Bronze Age civilizations, Egypt being one of the most magnificent.
00:16:53.060 I mean, the ability to create the pyramids, engage in –
00:16:59.520 But that's not Bronze Age, though, is it?
00:17:01.160 That's Stone Age.
00:17:04.180 Well, I'm talking about the – okay.
00:17:06.540 No, the pyramids are not Bronze Age.
00:17:07.980 The pyramids are earlier.
00:17:09.460 Okay.
00:17:09.800 And that's why – no, it's important, because that's why there is this belief among certain conspiracy theorists.
00:17:18.480 There's a guy called Hancock.
00:17:20.580 And he's done a series of books in which he argues that there was a technological civilization that existed in the Stone Age.
00:17:33.260 And that actually it wasn't the Stone Age at all and that they had like lasers and that they were a high technology civilization.
00:17:39.800 And some people have gone on from this to argue that they were actually aliens.
00:17:44.160 And so aliens have come here and aliens built the pyramids.
00:17:48.380 Okay.
00:17:48.940 Well, okay.
00:17:51.140 All right.
00:17:51.860 I'll leave that there.
00:17:53.440 That's not even opposed to what I was trying to say.
00:17:56.260 But what I was saying is the Bronze Age collapse is one of the most mysterious events.
00:18:04.660 And it does seem to lend itself to a kind of collapse-of-complex-societies thesis, in which societies get to a point where the intelligence is not high enough to maintain them.
00:18:17.840 That in some ways the technology they're developing leads to demands that the people aren't capable enough to meet.
00:18:30.440 So with the Bronze Age collapse, you had the collapse of all of these magnificent empires.
00:18:35.960 You had a rapid decline in literacy, a rapid decline of history writing, so that we can't actually know what was happening.
00:18:45.300 And all of this happened over the course of about a generation.
00:18:48.620 I mean, it's a remarkable thing.
00:18:51.380 Then we, of course, had the ancients as we generally think of them in the sense of the development of Greece and Rome.
00:18:59.500 But we had similar collapses of these amazing imperial structures.
00:19:06.000 I think a reasonable hypothesis for it is climate change.
00:19:10.220 So you had a situation where it had been extremely cold.
00:19:13.600 That creates intelligent people.
00:19:16.100 Those intelligent people create the Bronze Age.
00:19:19.720 Then it starts to get warm.
00:19:21.940 That means the intelligence starts to go down.
00:19:24.140 But the population can grow and can grow hugely.
00:19:27.260 And then it starts to get cold again.
00:19:29.540 And when it starts to get cold again, you get an absolutely catastrophic collapse.
00:19:34.100 It seems that there's some evidence it got cold because of a volcanic eruption.
00:19:37.900 So there was a volcanic winter.
00:19:39.760 And this set off all kinds of things, this movement of peoples known as the Sea Peoples and all of this sort of stuff.
00:19:47.080 You don't know who they are.
00:19:48.440 No, we don't know who they are.
00:19:49.760 It's such an amazing – yeah.
00:19:51.780 The speculation of who they are.
00:19:53.300 But we don't know exactly who they were.
00:19:55.240 Pirates.
00:19:55.560 And so you end up with this harried inability to get – the whole system, which was based around bronze, starts to be replaced with iron.
00:20:04.100 And the whole system falls apart very, very quickly.
00:20:09.160 And loads and loads of people die.
00:20:11.860 Huge collapse in population.
00:20:13.300 Once that happens, then all these people whose jobs are based around the city don't have jobs.
00:20:17.380 So they flee the cities.
00:20:18.440 All these people die.
00:20:20.440 They go back to a simpler way of life.
00:20:22.520 800-year dark age.
00:20:23.940 And then towards the end of that dark age, people have no idea how they built these tall buildings, and they believe that giants built them.
00:20:32.300 Right.
00:20:32.480 And you get that again and again throughout history, this belief in giants, because the people become so stupid that they don't comprehend how these things were built until later, when they rediscover their history and work out what went on.
00:20:44.760 Right.
00:20:44.980 And so –
00:20:46.880 And we're worshipping a god that emerged during that period, I'll just mention.
00:20:51.060 Yeah, I think –
00:20:52.280 Isn't that kind of problematic, seriously?
00:20:54.420 I think that that's the period in which this religion emerged, the Bronze Age collapse.
00:21:01.360 Yeah.
00:21:01.980 And presumably the previous collapse, whatever it was – Egypt, 4000 BC; every thousand years or so you seem to get some kind of collapse – would be similar.
00:21:13.780 And you get these giants again and again, even in the Bible, which was written towards the end of the Bronze Age – sorry, around the time of that collapse.
00:21:25.340 And you seem to get this reference, this, like, mythology, whereby when humans were put on Earth after the fall, there were these people there called the Watchers, the Nephilim, and they were giants, and they were gods.
00:21:38.560 They had sex with men, or with women, and all this.
00:21:42.800 So you've got this idea of this older civilization, which is believed to be giants.
00:21:47.080 I wonder if that's a reference to the pyramids.
00:21:49.160 I mean, it's them not understanding how they came about.
00:21:51.840 But it's fascinating.
00:21:54.620 But as I say, I think Fermi's paradox can be solved by the rise and fall of civilizations.
00:22:02.000 Scientists have this belief that we can just progress forever, because they don't necessarily understand how these civilizations –
00:22:13.160 Isn't what's holding us back – because, I mean, there's Fermi's paradox, which is about them: why aren't there other species of some kind?
00:22:23.100 I mean, maybe even non-carbon-based life forms in the sense that we can't fully understand who they are.
00:22:30.480 There's also another solution to Fermi's paradox, which is that they're already here and we don't know about it, or they visited us in previous times, et cetera.
00:22:40.220 But then there's also the question about us, and I think intelligence is obviously an essential component of rising to a stage where you're not just controlling energy on this planet, but you're actually harnessing the energy of our star, that is the sun.
00:22:59.760 And these are the Kardashev stages, which I was just reading about.
00:23:05.380 He was actually a Soviet scientist who was exploring this.
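For reference, the Kardashev stages are usually quantified by total power harnessed. A minimal sketch using Sagan's continuous interpolation of the scale, assuming the commonly cited wattage benchmarks; the present-day figure for humanity is a rough approximation.

```python
import math

def kardashev(power_watts: float) -> float:
    """Sagan's continuous Kardashev rating: K = (log10(P) - 6) / 10."""
    return (math.log10(power_watts) - 6) / 10

print(kardashev(1e16))  # ~1.0  Type I: the usable power of an entire planet
print(kardashev(1e26))  # ~2.0  Type II: the full output of its star
print(kardashev(1e36))  # ~3.0  Type III: an entire galaxy
print(kardashev(2e13))  # ~0.73 humanity today (rough estimate), short of Type I
```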
00:23:08.280 But there is this question about us, and why can't we do it?
00:23:13.540 I think there are some kind of built-in natural aspects that will create cycles.
00:23:20.880 You can think about, what, climate change, or a complex society being impossible to maintain at some point because intelligence doesn't catch up –
00:23:31.260 the general population's intelligence doesn't catch up to the kind of IQ that is needed to maintain a civilization.
00:23:42.040 But there might be other factors just in the sense of political factors and moral factors.
00:23:48.880 The United States did have a kind of autumn period, which is a good way of putting it, where we were harvesting all of these things.
00:23:57.380 And there were politicians like JFK talking about a new frontier, the space program.
00:24:03.140 Even if you want to view this as kind of competition with the Soviets or vainglory or whatever, it at least was – political leaders were pushing us in those directions.
00:24:15.460 It seems to be now that – I mean, there's first off a push towards private enterprise in space.
00:24:23.900 I think Bezos is involved in this.
00:24:27.420 Elon Musk is more famously involved in this.
00:24:30.280 But there just seem to be these political and moral dimensions which hold us back from completing a stage one civilization, that is, harnessing all of the power of the Earth, and that would include nuclear power.
00:24:47.920 Yeah, I think that what did it was that we became too individualist.
00:24:51.760 And I think there were two things. So you've got to think about the history of the space race, which people don't think about: all the men that were killed in accidents and things like this, in the attempt to get there.
00:25:03.700 And the risk that it would go wrong and people would die.
00:25:07.080 And we're just averse to that now.
00:25:09.100 We're too individualistic.
00:25:10.480 Now, why has that happened?
00:25:11.320 Well, one reason, as you say, is spiteful mutants, or mutation in general, which is a mutation away from being highly group-oriented, which means that we're more individualist at a genetic level.
00:25:21.760 And the second is the collapse in religiousness because of better conditions and low stress – religiousness promotes group orientation.
00:25:31.660 And a third is that these genetically individualist people spread their maladaptive ideas throughout society, for whatever reasons.
00:25:39.340 Eventually, you get to a tipping point, and I think that happened in about 1963, where you tip over into being a highly individualistic society.
00:25:49.200 It's not the first time this has happened.
00:25:50.360 I mean, you had something like this, perhaps, around about the fall of Rome, in which, indeed, Gnosticism was very similar to modern-day multiculturalism in a lot of ways.
00:26:01.360 And so then there's this focus on individualizing against the good of the group.
00:26:05.600 And that's what – it seems to be inevitable that that happens.
00:26:09.040 In other words, that hadn't happened yet.
00:26:11.640 Or it was happening, but it hadn't fully happened yet by the time we managed to get into space, which was, what was that?
00:26:17.220 That was the 50s in the Soviet Union, and then getting to the moon in 69, and then going to the moon again a few times in the early 70s.
00:26:26.160 And then it just –
00:26:26.720 But it's all petered out.
00:26:28.520 Yeah, I mean, there's this man, his last name is Zubrin.
00:26:32.680 He's actually a Jewish emigre who makes very compelling arguments for the conquest of Mars.
00:26:39.960 He makes compelling arguments for harnessing other power sources beyond fossil fuels and so on that have geopolitical implications.
00:26:50.880 Obviously, nuclear power is sitting out there as something that people are irrationally afraid of, and that irrational fear leads to this dependence upon the Middle East – well, I mean, again, America produces a lot of fossil fuel.
00:27:10.340 But still, a focus on the Middle East as this place that we need to care about, a place of huge wealth as well.
00:27:21.100 And there just doesn't seem to be any political will.
00:27:24.760 And I know this is a small thing, but there was a tweet not too long ago from Bernie Sanders.
00:27:32.900 And I have some sympathy towards Bernie Sanders.
00:27:34.900 I think a lot of what he says is very decent.
00:27:39.780 But it was this tweet against Elon Musk of basically, well, space travel is all fine and good, but we've got to take care of these problems down here on Earth.
00:27:49.240 There was a famous song during the space age, Whitey on the Moon, which is like, we're down here being oppressed and all you bastards are flying into outer space.
00:28:02.320 There just seems to be this resentment – and again, I understand the resentment on some level, but it's a resentment which will endlessly hold us back.
00:28:13.420 So it's not just a matter of intelligence, it's a matter of psychology.
00:28:17.560 Yeah, yeah, precisely.
00:28:18.420 But that psychology is – well, it's partly a matter of intelligence in the sense that as intelligence goes down, you trust people less and you become more resentful and nasty and selfish.
00:28:29.160 So it could partly be the actual matter of declining intelligence.
00:28:32.180 But secondly, yes, it's not just intelligence.
00:28:33.960 It's the crumbling genome, which means greater individualism, which means less trust, which means less cooperation, which means more focus on these individualizing values of equality, runaway individualism and general madness.
00:28:45.580 And there's another factor which we shouldn't forget as well, which is that what you had with the space race was the most intelligent minds of the time getting together in a meritocracy, basically, and coming up with these brilliant ideas and putting them into action and doing them.
00:29:01.740 And that was in the period after our highly group-oriented system, the one that held sway until, I don't know, 1850 or something.
00:29:10.620 You didn't promote the best under that system.
00:29:14.320 You had the religion, and you promoted people on an almost religious basis.
00:29:20.020 So the aristocracy was upheld by the religiosity and people would get positions in society not because they were the best but because they were nobles or something like that.
00:29:28.800 Now, then this system collapses.
00:29:32.020 You have a period of meritocracy where you believe in truth and you promote people because they're the best.
00:29:37.980 And now we have a new religion, individualism.
00:29:40.600 And so even if we were intelligent enough to be able to come up with these things, which I doubt, we no longer encourage genius.
00:29:48.960 We suppress it because of individualism trumping it.
00:29:51.780 And we don't promote the best anymore.
00:29:53.980 We've gone back to a system of promoting the aristocracy, but it's the new inverted aristocracy of women and black people and homosexuals.
00:30:01.740 And so even if we were intelligent enough, which we're probably not, still we've got that problem as well, which is that we're not even trying to be the best anymore.
00:30:11.900 I mean, you see it even in all areas of life, even in commercials on TV.
00:30:16.740 They've stopped trying to persuade people to buy things.
00:30:20.800 I mean, how are you persuading people if what you're after is scoring woke brownie points?
00:30:25.840 How are you persuading women to buy a product if you're getting a load of fat, ugly women in their pants and saying, oh, they use this soap?
00:30:33.840 So should you?
00:30:34.760 Well, they're going to think, I'm not a fat, ugly woman.
00:30:36.800 People want to be associated with success and beauty.
00:30:39.940 And so I think that it's all of these things come together.
00:30:44.080 And the period in which you might have a combination of the necessary intelligence and the necessary group orientation and the necessary martial values to do things like this and possibly reach another civilization is such a narrow window that in the expanse of space,
00:31:04.380 the possibility that you're going to get far enough to meet another remotely advanced species – you know, maybe animals or something, but a similarly advanced species – is just vanishingly small.
00:31:19.880 So I suspect...
00:31:22.380 I mean, again, there are answers to it, but these answers make too much sense in a way.
00:31:30.020 They're obvious, but they're also seemingly impossible.
00:31:34.820 I mean, one of the answers is eugenics, in the sense that we have passed through a singularity where it's no longer just natural selection from the environment holding sway.
00:31:47.620 And things that used to hold in natural selection have actually been reversed.
00:31:55.040 Up until the Industrial Revolution, the most intelligent, the most successful were outbreeding the least successful and the aristocrats were outbreeding the peasants and they were doing it by a lot.
00:32:07.680 I think – this is my little pet theory – that this was the origin of the Prima Nocta myth, in the sense that the aristocrat was always breeding, even with your own wife.
00:32:20.100 You know, Prima Nocta, yeah.
00:32:22.320 But Prima Nocta never actually occurred throughout history; the myth, though, was always there.
00:32:28.000 It gave us operas like The Marriage of Figaro, so it was worth something.
00:32:31.540 But we've gotten to a point where really the reverse is the case.
00:32:37.160 The most intelligent are not breeding successfully.
00:32:40.400 The least intelligent – those more likely to be on welfare, more likely not to be self-sufficient or productive – are outbreeding the productive.
00:32:48.640 It is horrifying.
00:32:50.000 But that's not, it's not new.
00:32:50.640 But we are conscious of it.
00:32:53.020 Yeah, but we were conscious of it in Rome and we were conscious of it in Greece.
00:32:58.760 I mean, they write about it at the time.
00:33:00.360 I know, but still, like, can't it be different this time?
00:33:06.640 I mean, when you're conscious of something, that is how you solve the problem.
00:33:11.840 But A, the eugenics program is absolutely necessary.
00:33:18.120 But B, we might need to have a new type of religious paradigm that doesn't emerge from a, you know, scattered, wandering people after the Bronze Age collapse.
00:33:32.920 And that might need to focus on the sun, in the sense that getting to a second stage of civilization will in some ways mean harnessing the power of the sun without destroying it.
00:33:44.540 And thus, a new kind of solar religion would be the one that would help us get there.
00:33:53.360 Because I think that unless we are reaching stage two or, you know, a galactic civilization, unless we are the them, in the sense that we are the ones traveling beyond our sphere, then what are we even doing down here?
00:34:12.180 You know, we're literally down here to help people avoid harm, or just live their content yet ultimately meaningless lives, rumbling around in the mud.
00:34:27.900 I mean, unless we're advancing, then what are we doing?
00:34:32.060 That's exactly right.
00:34:32.980 But that's why so many people have this malaise of late civilization: because they've just given up and there's no point.
00:34:41.660 And that's how a lot of people, the aristocracy anyway, in late Rome would have felt.
00:34:46.440 And so they would engage themselves with these silly mystery cults and things like this, as a sort of surrogate activity.
00:34:54.820 But ultimately there was, there was no point.
00:34:57.900 And we create this evolutionary mismatch and then we – well, it was actually Ted Kaczynski.
00:35:02.040 I mean, mad and vicious as he was, he did make a number of, I hate to say it, good points, one of which is that you create an evolutionary mismatch and you give people antidepressants to solve the evolutionary mismatch, which you've created.
00:35:15.660 Right.
00:35:16.140 Or which the system has created.
00:35:19.220 But I'm interested in, you know, I'm researching this for the book we're doing on zombies.
00:35:25.760 The zombie apocalypse, the future of this.
00:35:31.660 And what I see is that the more intelligent and the more right-wing will come apart.
00:35:38.300 And they will be group-selected.
00:35:41.520 And so whether they, from their retreat of civilization into smaller areas, will learn – it's possible, I suppose.
00:35:53.160 Well, we need to, and we have to. We're conscious of it.
00:35:55.660 Therefore we need to do it.
00:35:56.660 And I think maybe what we need to do is conquer this planet, in the sense that we've allowed this planet to be out of our control for too long.
00:36:10.140 And we've created this unhappiness and this kind of, you know, medicine that cures the poison – in the sense of, here, let me give you antidepressants for the problem that I've just created.
00:36:20.120 I mean, this is the caduceus in a nutshell. We haven't been in control of our own planet for a long time, actually.
00:36:31.060 And, you know, maybe we do need to kind of get away and go to a redoubt and reform.
00:36:37.520 But at the end of the day, we're going to have to force all of these people to our will in order to advance our mission on this planet, which is interstellar travel.
00:36:49.080 Sorry, as usual, I'm being bombastic, but I usually get there around the 30 minute mark.
00:36:55.180 Well, you've taken 40 minutes, all right.
00:36:58.200 It's all that chocolate slowing down your metabolism.
00:37:02.220 But yes, I think that's it: you need to have a mission.
00:37:07.720 And normally that mission is just to get enough food.
00:37:12.000 And once you've got that, sex.
00:37:15.080 And those are the most basic missions.
00:37:17.140 And as long as you've got those, then you can see how the malaise sets in.
00:37:22.820 And it doesn't set in if you feel you're on the up.
00:37:25.360 And that's what we felt for a very long time.
00:37:27.220 We're on the up.
00:37:27.960 It's getting better.
00:37:28.660 It's getting better.
00:37:29.300 We've got into space now.
00:37:30.360 We've got to the moon now.
00:37:31.500 And then there's this change.
00:37:34.200 And even when we were going into space, you had people campaigning, moaning about,
00:37:38.080 we should concentrate on black people's rights.
00:37:41.200 You know, Whitey on the Moon.
00:37:44.560 And the next step just needs you to keep going.
00:37:49.660 And I just think that it's almost impossible unless all of this current woke stuff is reversed.
00:37:54.680 It will reverse itself in that those people, those individualists, don't breed.
00:37:59.100 But the problem is that clever people also don't breed.
00:38:01.360 And they're more likely to be sucked in by whatever the dominant ideology is.
00:38:05.580 And so I think I can only offer a sort of gray pill.
00:38:08.900 It's not the white pill.
00:38:10.380 There's going to be a collapse in intelligence.
00:38:12.320 There's going to be a movement backwards, but it might not be as bad a collapse
00:38:16.860 as before, as more intelligent people move out to certain refuges
00:38:26.880 in parts of America and whatever.
00:38:29.340 I mean, the next stage I'm thinking about as well – that's another thing – is as everything becomes
00:38:33.720 more wokeified.
00:38:34.720 I mean, you're going to get a situation where doctors now in America are going to be taught
00:38:39.140 to be woke, but not actual useful medical knowledge.
00:38:43.000 So the result of that is going to be people fleeing to places that do have proper doctors
00:38:47.300 And the police are taught to be woke.
00:38:49.480 So the result of that is going to be the rise of militias.
00:38:51.540 It's going to be the rise of sort of retinues where you pay for protection.
00:38:55.460 And once you get that kind of thing, then you get effectively
00:38:58.420 separate states.
00:38:59.460 And I think we're moving towards that.
00:39:03.500 And so intelligent people separating themselves off from the stupid
00:39:10.120 people would be a start.
00:39:12.480 Do you think that China, because it is a nation run by engineers, as opposed to a nation run
00:39:20.060 by lawyers and academics the way the United States is, do you think that in China there's
00:39:28.680 at least a potential for the type of technological advancements that are necessary?
00:39:37.500 They've been faster implementing a lot of things because they don't give a damn.
00:39:43.100 They just kind of do it.
00:39:44.640 They have, but they only have a short window as well, because we have data on that.
00:39:48.200 We know they're in dysgenic fertility; intelligence is breeding negatively in China.
00:39:53.820 So we know that's happening.
00:39:56.360 So they're in their autumn, basically, they've got to get on with it.
00:40:00.000 They need us.
00:40:00.900 I think that they can't think originally and creatively.
00:40:03.940 That's not what they're good at.
00:40:05.620 So we would be kind of like Greeks to their Romans, and they would have
00:40:10.680 to bring us in to do things.
00:40:13.220 And then if that were to happen, then yes, perhaps that would work, and they would
00:40:17.580 promote the best as well.
00:40:19.160 And that's another thing – there's a problem there, which is that they are corrupt.
00:40:21.940 So do they promote the best?
00:40:23.200 Not always. But then, they are further behind in the decadence,
00:40:29.520 though the decadence is definitely coming.
00:40:31.420 So if they're going to do something, they have to strike out quickly or they won't be able
00:40:36.680 to do it because it's coming.
00:40:40.800 Well, we'll put a bookmark in on that.
00:40:47.580 Let's see.