The Auron MacIntyre Show - November 03, 2023


Lobotomizing AI in the Name of Equity | Guest: James Poulos | 11⧸3⧸23


Episode Stats

Length

1 hour and 1 minute

Words per Minute

176.5

Word Count

10,916

Sentence Count

593

Misogynist Sentences

3

Hate Speech Sentences

15


Summary

James Poulos is an author, a key member of the Claremont Institute, and the host of Zero Hour on the Blaze. In this episode, we discuss the Biden administration's recent executive order on the development of artificial intelligence.


Transcript

00:00:00.000 We hope you're enjoying your Air Canada flight.
00:00:02.320 Rockies vacation, here we come.
00:00:05.060 Whoa, is this economy?
00:00:07.180 Free beer, wine, and snacks.
00:00:09.620 Sweet!
00:00:10.720 Fast, free Wi-Fi means I can make dinner reservations before we land.
00:00:14.760 And with live TV, I'm not missing the game.
00:00:17.800 It's kind of like, I'm already on vacation.
00:00:20.980 Nice!
00:00:22.240 On behalf of Air Canada, nice travels.
00:00:25.260 Wi-Fi available to Aeroplan members on equipped flights.
00:00:27.200 Sponsored by Bell. Conditions apply.
00:00:28.720 AirCanada.com.
00:00:30.420 Hey everybody, how's it going?
00:00:32.060 Thanks for joining me this afternoon.
00:00:33.380 I've got a great stream with a great guest that I think you're really going to enjoy.
00:00:37.600 So I think a lot of people understand that the competition between state actors,
00:00:42.580 whether it comes to between different countries or the culture war inside of the United States,
00:00:48.480 that technology is a huge aspect of what's going on.
00:00:51.660 And of course, artificial intelligence is a huge buzzword,
00:00:54.820 a huge talking point of people who are looking at what's going to shape the future here.
00:00:59.460 Now, the White House went ahead and issued an executive order recently.
00:01:03.660 The Biden administration told us that they need to go ahead and tame artificial intelligence,
00:01:07.920 that they need to train it to obey the DEI and CRT requirements of the progressive religion.
00:01:14.420 And I wanted to go ahead and bring somebody who thinks a lot about artificial intelligence and other technological matters in to talk about this.
00:01:21.240 He's an author, he's a key member of the Claremont Institute, and he is, of course, the host of Zero Hour on the Blaze.
00:01:28.460 James Poulos, thanks for coming on, man.
00:01:30.800 Hey, how you doing?
00:01:32.140 Doing well, doing well.
00:01:33.300 All right, guys, well, we're going to go ahead and dive into this executive order,
00:01:36.820 talk a little bit about the implications and AI in general.
00:01:40.380 But before we do that, let's hear from today's sponsor.
00:01:43.520 These days, it's impossible to thrive with just one job.
00:01:46.120 Between increasing living costs, paying off debts, and planning for the future,
00:01:49.840 things like buying a home, building savings, and even going on vacation can seem like fantasies.
00:01:54.600 If your goal is financial freedom, you could start taking on more hours at your current job,
00:01:58.480 work towards a promotion, or try putting your money into something risky like stocks, cryptocurrencies, or even a side hustle.
00:02:04.820 But at the end of the day, do you really want to sacrifice time and energy
00:02:07.800 that could otherwise be spent with your loved ones or on your hobbies just to make a living?
00:02:12.000 Luckily, you don't have to hustle to reliably make more money.
00:02:14.820 All you have to do is job stacking.
00:02:16.540 Job stacking is the best way for regular people, regular employees,
00:02:19.980 to unleash their earning potential and increase job and financial security.
00:02:23.500 How?
00:02:23.760 By working multiple jobs, but without burning out, or more importantly,
00:02:27.440 getting caught by corporate overlords.
00:02:29.520 Job stacking allows you to reliably receive paychecks from multiple employers each month
00:02:33.400 without having to work more than eight hours a day.
00:02:35.940 You don't have to be in tech or any particular field or industry to do it as long as you can work remotely.
00:02:41.140 If you've thought about working multiple jobs, but you're not sure how to start or are afraid of getting caught,
00:02:45.280 get the fundamental job stacking course today and learn all of the secrets
00:02:48.480 on how to sustainably work multiple full-time jobs
00:02:51.380 from the foremost expert on the matter, Rolf Halza, author of Job Stacking.
00:02:55.580 Rolf has worked multiple full-time jobs since 2018, including hybrid jobs,
00:03:00.000 and has condensed all of his experiences and wisdom into a single four-module online course
00:03:05.100 so you can start proficiently job stacking without having to make mistakes,
00:03:09.000 figuring things out on your own, or reinventing the wheel in the process.
00:03:12.880 Go to www.jobstacking.com and enter the promo code ORIN to get a special discount.
00:03:18.320 All right, James, so we're going to go ahead and talk about this executive order,
00:03:23.260 but before we do, for people who are unfamiliar with your background,
00:03:26.400 you've made talking about technology and the way that we are interfacing with it
00:03:31.020 a really critical part of your thought, especially when it comes to addressing spirituality and development.
00:03:37.040 Could you give people a little bit of your background?
00:03:39.560 Like, why did you make this kind of the center of how you're thinking about where we're heading in the future?
00:03:45.040 Sure. Well, my background is in academic political theory,
00:03:49.140 which is now an almost completely irrelevant discipline, like so much of academia.
00:03:55.980 This was even before the sort of DEI-CRT-ESG barrage set in.
00:04:04.320 It's basically, it's kind of like political philosophy,
00:04:07.900 but it's a little bit less philosophical, a little more theoretical, more social theory.
00:04:14.680 Sometimes even theology gets in there.
00:04:17.460 And so all good news as far as it went.
00:04:20.480 It was a good education.
00:04:22.080 But around 2016, 2018, it started to become clear to me that if,
00:04:29.640 as a political theorist,
00:04:31.040 I couldn't say anything useful about what technology was doing to so rapidly reshape and remake our inner and outer lives,
00:04:41.440 whether in politics or really any other sort of area of human endeavor or experience,
00:04:47.420 then sure enough, that discipline probably deserved to go on the scrap heap along with plenty of others.
00:04:53.720 Fast forward to today, of course, and you see the trend lines on advanced degrees in the humanities.
00:04:58.880 They are nosediving.
00:05:01.220 Some of that is just kind of like, you know, poetic justice or just deserts.
00:05:06.560 In other ways, the humanities value going to zero is not a good thing.
00:05:11.840 And maybe we can talk a little bit about that later on.
00:05:14.440 But really, for me, it was kind of a challenge that I saw just being a gauntlet thrown down in front of everyone,
00:05:22.640 really, including political theorists.
00:05:24.720 So I kind of had to just begin to remind it and start from the bottom,
00:05:27.820 learn the literature, figure out how it interfaced with what I'd already learned about politics and regimes
00:05:35.840 and the theory of how people order themselves over time and why they do so.
00:05:42.160 And so when I was, you know, kind of working my way through this,
00:05:46.760 sometimes there's a lot of jargon.
00:05:48.420 Sometimes we're throwing around concepts that people don't immediately understand.
00:05:52.260 But I kind of had to learn on the job and write my way through it.
00:05:54.980 You know, some people tweet their way through it if they're experiencing some kind of PR crisis.
00:05:58.940 I had to kind of write my way through it and learn in real time.
00:06:02.400 And so by the time 2020 rolled around, people started to kind of realize that things that I was talking about somewhat cryptically a few years ago
00:06:11.240 were now somewhat obvious.
00:06:12.800 So I've tried to stay one step ahead of unfolding events.
00:06:16.400 But they're moving fast.
00:06:18.100 And so really, I think the challenge has been intensified.
00:06:20.520 People really do need to face up to what's happening to us and why it's happening to us
00:06:28.940 and understand some of the basic dynamics so that we don't lose our identity, our humanity, our form of government,
00:06:36.660 our God-given rights, and plenty of other things to technologization,
00:06:42.340 just to the routinization, automation of things that until very recently have been solely the province of us human beings
00:06:51.020 or in some cases of angels and demons.
00:06:54.460 Yeah, I think it's really critical because a lot of conservatives or reactionary types will look at technology and say,
00:07:01.520 well, we're just not going to face this.
00:07:03.140 You know, this is just something that needs to be put in a box or it needs to be destroyed or, you know, it will somehow disappear.
00:07:08.280 They don't really want to look at how this is shaping people and what the next step would be alongside kind of their political project.
00:07:15.580 And so I do think it's really critical that you have people facing this problem and thinking about it from a non-progressive worldview
00:07:22.440 because it's going to be impacting people one way or another.
00:07:26.460 And if you simply leave it to your enemies, they'll be the ones that shape everything about it.
00:07:30.160 And that's been a critical failure by the right in many different areas, and technology, you know, not least among them.
00:07:36.660 So the first thing I guess let's talk about real quick is artificial intelligence.
00:07:41.640 We know that this executive order is coming out and it's talking about how we need to get rid of algorithmic discrimination, right?
00:07:48.380 We have to make sure that we secure equity and civil rights through artificial intelligence.
00:07:53.960 But I think when people look at artificial intelligence, it's really just a mixture of different concepts.
00:07:59.900 It's a sci-fi fantasy.
00:08:01.160 I hear some people who are very knowledgeable on the subject say, you don't need to worry about this.
00:08:05.760 AI is never going to matter.
00:08:07.200 It's a dumb technology that is overblown.
00:08:10.880 People are just inserting their favorite fan fiction into the ideas.
00:08:14.700 And then there are other people who are like, no, you have to take this seriously.
00:08:17.240 It's going to enslave us all in 10 years.
00:08:19.720 Where are we with artificial intelligence right now?
00:08:22.960 And how critical is it for nation states to kind of shape where this is going?
00:08:27.080 Well, there's a now-ancient-history John Lennon, Yoko Ono song, "War is over if you want it."
00:08:35.080 And I think we're at a moment where, you know, yes, humanity is over if we want it.
00:08:39.560 We absolutely do have free will and we can use that free will to voluntarily choose to eradicate our humanity.
00:08:49.280 So AI matters now.
00:08:51.060 It's already happening.
00:08:52.360 Every step of the way that we want to go down that path, we can go.
00:08:57.040 And I expect that there will be serious conflict over those kinds of choices as things accelerate and as the technology improves.
00:09:05.640 That said, I am hesitant to refer to these things as artificial intelligence at all.
00:09:10.380 It's a catchy term.
00:09:11.600 You know, there's the Kubrick-Spielberg film of that name, A.I.: Artificial Intelligence, also back in ancient history, in the primordial era.
00:09:23.720 These things are automated simulators.
00:09:25.920 And that is why they are so powerful.
00:09:28.120 We like simulating things.
00:09:30.160 We like simulations.
00:09:31.120 You can go back to Jean Baudrillard or other social theorists who saw a lot of this coming.
00:09:38.880 You know, 1987, Baudrillard, The Ecstasy of Communication.
00:09:42.720 Great slim little volume if you want a taste of what the folks who saw this coming saw so relatively early in the game.
00:09:50.400 There were others, you know, we can we can run through the list maybe maybe in a minute.
00:09:56.360 But whether it's a film like Videodrome or, you know, Clive Barker writing about the Cenobites from the Hellraiser universe, like there was an awareness building in the 80s, continuing into the 90s.
00:10:08.700 Obviously, The Matrix and other films, Dark City, sort of bringing this thing into the mainstream as it grew and developed.
00:10:17.000 And it's not that technological development is inherently bad.
00:10:20.400 But what is inherently bad is giving in to the temptation to retreat from reality and to try to substitute a simulation for the given world in which we live, our given selves, and our given relationships.
00:10:41.460 Trying to simulate everything is a powerful temptation, not just for ordinary people, but for especially for smart people.
00:10:48.880 And this is, I think, one of the key points that we need to sort of recognize is at the heart of what's going on here.
00:10:55.040 Smart people like to think that if they just have enough power, if they build the right kind of tools, if those tools heighten their intelligence, that intelligence is really capable of solving any problem.
00:11:05.640 If it's not working, you just need more of it.
00:11:07.620 And what you're hearing out of some of the smartest people right now is, well, if the technology isn't working, that just means that we need more of it.
00:11:14.520 I think of Bill Clinton back in the day saying there's nothing wrong with America that can't be solved by what's right with America.
00:11:19.960 And this kind of attitude toward technology is just kind of the latest version of a longstanding view about intelligence, which is the same.
00:11:28.920 The smarter you get, the better off you'll be.
00:11:31.400 Smart people, however, are obviously very blind to the fact that, no, intelligent people can be very easily deceived.
00:11:38.580 When intelligent people make mistakes, they tend to be very large mistakes.
00:11:43.160 And there's a price to be paid, not just if you're the smart person making mistakes or being deceived, but all those who are affected by the artifice that you've constructed on top of the world we live in.
00:11:54.620 So automated simulators, very impressive things.
00:11:58.940 They can simulate, you know, perhaps even the human soul.
00:12:01.760 Perhaps they can even simulate something like the Holy Spirit.
00:12:04.840 These are powerful machines.
00:12:06.020 They are disincarnate.
00:12:08.220 They're floating around through space and time, able to do things that human beings can't do.
00:12:12.040 They pass through our walls, pass through our minds, pass through our hearts.
00:12:15.100 You can fit a seemingly infinite number of them on the head of a pin.
00:12:18.840 And they're all talking to each other.
00:12:20.300 And with 5G technology, that means they can all talk to each other more or less instantaneously.
00:12:24.980 Heavy stuff.
00:12:26.120 And if we do not take seriously the spiritual and theological and religious implications of these machines, of these tools,
00:12:33.120 then we will be inclined to say, wow, human beings really suck now.
00:12:36.900 And wow, I guess our only hope is to listen to the smartest people tell us that we have to obey the even more intelligent machines that they create.
00:12:46.880 That can be a powerful, self-fulfilling prophecy.
00:12:49.180 And that would be a self-fulfilling prophecy of doom.
00:12:52.460 It's not the only path.
00:12:53.440 We don't need to get totally blackpilled, but we do need to be spiritually mature and understand what the stakes are here.
00:12:59.560 Yeah, really dangerous to be assembling our own hyper-agents or even worse, probably like rediscovering ones that were best left locked away somewhere.
00:13:08.000 But we'll dive into that here a little bit later.
00:13:10.880 I wanted to focus first on this executive order.
00:13:13.580 So, like you said, obviously a lot of intelligent people who think they should be able to run every aspect of humanity really believe that social engineering is the way forward.
00:13:23.180 We can rebuild man and reshape it in our own image.
00:13:25.880 Love this kind of stuff because it lets them believe that they can centrally plan every aspect of humanity.
00:13:31.260 But what's really interesting about, I guess, this executive order is while in theory AI would allow them to wield this kind of power, allow them to have the kind of computational power they need to plan everything and alter everything and such, they're already starting out by basically lobotomizing it.
00:13:50.560 They're saying, okay, well, yes, we understand that this could possibly accurately model things, but we don't want it to actually accurately model things.
00:13:59.040 That's really critical.
00:13:59.920 We want to make sure that there are certain caveats so that realities that might manifest themselves, if we took too hard a look at statistics and that kind of thing, are already cut out.
00:14:11.340 And so the executive order specifically mentions having an AI bill of rights and making sure that, again, we don't have this algorithmic discrimination.
00:14:19.960 It specifically singles out things like criminal justice, healthcare, and housing as areas where these algorithms might select against certain groups, certain favored groups.
00:14:33.420 Do you think that there's any kind of cognitive dissonance, any kind of confusion for these people between saying, well, we have this tool that could allow us to rearrange society because we're so smart, but we also want to make sure it's specifically dumb in a way that will keep it from identifying problems that we don't want people to see?
00:14:52.920 Well, yeah, I think this is a case in point of why what we're dealing with is ultimately spiritual, theological, religious, and not just a matter of intelligence or IQ or processing power.
00:15:03.440 Yes, from a secular standpoint, it is kind of ironic and concerning to watch people say, well, yes, we need these machines to be not just smarter than human beings, but actually dumber than human beings in a few important respects that we will determine, because we know best when a machine should be smarter than you, but also when it should be dumber than you.
00:15:25.240 It doesn't really make sense. And, you know, I think an understandable but, at best, incomplete reaction that has bubbled up to the top in the face of things like, you know, wokeness or whatever you want to call it is, wow, this is just crazy.
00:15:41.920 This craziness, it's so out of hand. These people are, they've lost their minds. And well, I suppose in a certain sense, but in another sense, they recognize that when you have technology that is this powerful,
00:15:53.700 that calls so fundamentally into question who we are as human beings, what our purpose is, what our destiny is, the only kind of response that is going to assert a kind of authority over determining how this technology is developed and how it is used, that authority has to be spiritual authority.
00:16:15.400 That authority has to be human beings saying there's something about us, which you might not be able to see, but which we know is real, and which preserves our authority over these tools that we've created.
00:16:28.640 And the Wokies understand this on some kind of level. And so from that standpoint, they're not trying to lobotomize the bots as much as they're trying to catechize the bots, as much as they want to ensure that these machines adhere to their religious or theological worldview,
00:16:51.520 in the same way that ordinary Americans would probably prefer that if we are automating parts of government, that these entities have fully internalized the Bill of Rights that we already have.
00:17:07.480 You look at the First Amendment, the Second Amendment, freedom of expression, freedom of association, right to bear arms and use them.
00:17:14.860 These things implicitly protect what we already inherently have, which is a right to access and use fundamental digital technologies in order to protect ourselves, protect our community, and freely constitute our communities.
00:17:39.380 Right now, we are going down a path where these people who are offering us something like an AI Bill of Rights don't really want us to have free association.
00:17:50.680 And they don't really want us to have access and use of fundamental basic digital technologies without which we cannot be free through those technologies or with them.
00:18:00.320 They want to impose on us a top-down social credit, social justice system, and they want to use basic technologies to enforce that system on us.
00:18:13.060 You can't have wokeness without a woke supercomputer at the end of the day.
00:18:16.520 You need some sort of complex superintelligence in order to decide, okay, well, this person has suffered this number of microaggressions today,
00:18:23.900 and they need this degree of, like, micro-justice payment, you know, it's so complex, it's beyond what human beings can do.
00:18:31.080 That's the good news.
00:18:32.180 The bad news is these folks on the left are concluding that the only way that we can really have justice in the world is,
00:18:39.260 well, we tried monarchy, that didn't work.
00:18:40.940 We tried aristocracy, that didn't work.
00:18:43.180 We tried democracy, that didn't work either.
00:18:44.760 So now we're just going to give it to the machines.
00:18:46.660 We're going to program it, let the algorithm do it.
00:18:49.780 Algorithms are inherently discriminatory.
00:18:52.120 That's what an algorithm is.
00:18:53.860 It's giving a computer orders, giving software orders, telling some code what to do,
00:19:00.620 to choose A instead of B, to select for X instead of selecting for Y,
00:19:04.960 to weight in favor of variable Q instead of variable W, right?
00:19:11.640 There's no way to create algorithmic systems that do not in some way discriminate.
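A minimal sketch of that point, in Python, with hypothetical variable names and weights (they only mirror the A/B, X/Y, Q/W language above): any scoring rule encodes weights, and changing the weights changes which option gets selected.

```python
# Hypothetical illustration: every scoring algorithm "discriminates" in the
# literal sense -- it weights some variables over others, so it necessarily
# picks A over B (or B over A) depending on the weights it was given.

def score(item: dict, weights: dict) -> float:
    """Weighted sum over whatever variables the designer chose to include."""
    return sum(weights[k] * item.get(k, 0.0) for k in weights)

def select(a: dict, b: dict, weights: dict) -> str:
    """Choose A instead of B purely on the basis of the encoded weights."""
    return "A" if score(a, weights) >= score(b, weights) else "B"

a = {"variable_q": 0.9, "variable_w": 0.2}
b = {"variable_q": 0.3, "variable_w": 0.8}

# Weight in favor of variable Q: A is selected.
print(select(a, b, {"variable_q": 1.0, "variable_w": 0.1}))  # -> "A"
# Weight in favor of variable W instead: B is selected.
print(select(a, b, {"variable_q": 0.1, "variable_w": 1.0}))  # -> "B"
```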
00:19:16.820 And so the question is, on what basis are you discriminating?
00:19:19.220 And if the basis that you're discriminating on is, well, we're in charge and we have this sort of spiritual understanding of life,
00:19:28.800 and we are going to ensure that the way the system is built and the way the machines are coded is going to hardwire,
00:19:37.320 so to speak, that spiritual worldview into our system of government and way of life.
00:19:43.620 Now, that is why I'm going around talking about a cyborg theocracy, because really, you know, it sounds like a buzzword.
00:19:49.820 Maybe it is, but it's not too hard to figure out.
00:19:52.140 Cyborg just means uniting the human and the machine.
00:19:55.800 And theocracy just means uniting spiritual and temporal power.
00:19:59.160 Now, that isn't America.
00:20:01.700 It's never going to be America.
00:20:03.960 It never has been America.
00:20:05.520 You got to go, you know, back to the Puritans.
00:20:07.640 They tried it, putting Leviticus into the civil code.
00:20:10.620 Didn't work very well.
00:20:11.920 They gave up and moved on.
00:20:13.040 That caused some other problems we can talk about.
00:20:15.220 But it's, on the one hand, not surprising to see our leaders, our elites,
00:20:22.160 reaching for that kind of theocratic control,
00:20:25.080 because it's a powerful temptation when you've got machines this powerful.
00:20:28.480 But in the U.S., we have to play by different rules.
00:20:30.720 We have different understandings, fundamental understandings of our humanity
00:20:33.560 baked into our founding regime.
00:20:37.080 And we need to stick to that.
00:20:38.580 Otherwise, if we don't, it's not going to be America anymore, online or offline.
00:20:42.960 Yeah, I think it is very telling of, you know, the usage that they will have for this algorithm,
00:20:49.160 for the things that it specifically is not allowed to impact.
00:20:51.880 They specifically say that, you know, it's not allowed to be biased in things like
00:20:55.180 risk assessment or surveillance, predictive policing, forensic analysis,
00:21:00.020 all these things that would keep you safe, right?
00:21:02.160 All these things that would just be, I guess, what most people would hope would be
00:21:05.320 kind of neutral uses of AI that would generally increase the safety of the average person.
00:21:10.880 All of that stuff is off the table.
00:21:13.520 And otherwise, it seems like it's focused on kind of bringing about all the stuff you're talking about,
00:21:18.540 where it's going to infuse the new civic religion into every aspect of life.
00:21:25.220 And so I guess the question is, like you said, there's going to be multiple versions of this.
00:21:30.600 You're going to have many states working on this simultaneously.
00:21:33.220 The United States is going to try to restrict it in a particular way.
00:21:36.440 Perhaps other regimes are going to try to restrict it another way.
00:21:39.540 But the question is, can it actually be restricted?
00:21:42.560 Will these attempts to catechize it be effective or is kind of just the noticing power of AI going to be too much?
00:21:52.180 Or will an uncaged AI just be so much more useful that people will constantly be trying to utilize it,
00:21:59.400 getting around these rules, making them about as useful as putting the puritanical code into the early American settlements?
00:22:06.420 Well, to some degree, the cat's already out of the bag.
00:22:12.000 No matter how much the U.S. government focuses on AI safety, however defined, we are not the only game in town.
00:22:21.980 There are other digital superpowers in this world.
00:22:24.320 I can name them U.S., U.K., China, Russia.
00:22:27.420 I think India probably in there.
00:22:29.340 And Israel.
00:22:31.480 And in every one of those cases, look, there's a sovereignty crisis.
00:22:36.960 And the only way that you can reestablish your sovereignty as a major power in the world with that kind of digital equipment
00:22:45.500 is by going to the root of your civilization state.
00:22:49.680 Every one of those countries that I named represents or embodies a particular civilization.
00:22:54.500 And it's not surprising to see them all reaching for those resources.
00:23:00.500 Russia has rediscovered Eastern Orthodoxy.
00:23:04.820 If you go back and look at the consecration of their newest cathedral, the main cathedral of the armed forces in 2020,
00:23:13.340 they put out a big shiny video and said, you know, this is inaugurating a new relationship between the military and the church.
00:23:18.800 They're going back to basics.
00:23:20.560 Even in China, you know, I think that the smartest or most insightful analysts in China and analysts of China
00:23:29.280 recognize that a large number of Chinese are reaching to Taoism and to some of those other foundational Chinese religious doctrines
00:23:39.620 in order to reassert their sovereignty over the technology within their civilization state.
00:23:46.600 Obviously, you know, Israel, not usually thought of in those terms, but now you've got Benjamin Netanyahu out there talking about Amalek
00:23:53.980 and, you know, that they're moving quickly in that direction, too, it seems.
00:23:57.400 India, you've got Hindu nationalism.
00:23:59.280 And in the U.S., we're having a cold civil war over which version of religiosity is going to be our touchstone.
00:24:09.400 That's going to be a big deal. But the point is there are going to be other paths taken by other countries,
00:24:16.320 and it's not going to be the U.S. saying we run the show over the entire world and we're going to dictate AI policy around the entire world.
00:24:23.000 It's not the way it's going to go down. So in that respect, yeah, we have to be prepared for AIs to do things and behave in ways that are radically different from however we end up doing things in the U.S.
00:24:37.200 At the same time, you know, look, in order for America to be America, we need to take on the risk of protecting American citizens' right to keep and bear compute,
00:24:51.740 to have that access and use of those basic technologies.
00:24:56.060 I'm talking high power GPUs. I'm talking, you know, certain kinds of AIs.
00:25:01.500 I'm talking things like Bitcoin. I like Bitcoin not because you watch the number go up and we all achieve nirvana just without having to lift a finger.
00:25:10.580 No, I like Bitcoin for the opposite reason, which is right now the technology is well enough developed that regular ordinary people can use it to build culture,
00:25:19.640 to build institutions, algorithmic markets, buying, selling, exchanging goods and services that strengthen our form of government, strengthen our way of life, strengthen our humanity.
00:25:28.480 That's why I put my book Human Forever on the website called Canonic.xyz.
00:25:33.560 It's on sale there for Bitcoin. It's encrypted right there onto the blockchain.
00:25:39.500 Fundamentally different way of doing things. Very powerful way.
00:25:42.060 I think that's why regulators and others really want to get their hands on it and nationalize it in the same way that I think ultimately they want to nationalize AI.
00:25:51.140 So you got this AI safety conference going on right now in the UK. Why is it there? That's an interesting question.
00:25:57.320 I'm interested in, you know, the deeper question of what is human safety with regard to this technology?
00:26:03.760 Who can you really trust to exercise proper spiritual discernment with regard to the development of these technologies?
00:26:11.220 It's not all about just saying, oh, well, if we purify the law, then we'll live happily ever after.
00:26:16.140 That's not America. That's not something that law can do even before super powerful technology enters into the picture.
00:26:23.200 But that's where we are and falling prey to that temptation of saying like, well, look, you know, just like Europe is trying to do, we'll just get the regulations right.
00:26:30.400 Right. As long as we have the pure and virtuous regulations, the law will rule and humans will be safe.
00:26:38.380 Not exactly. We need to look inside. We need to look in our hearts. We need to look in our souls.
00:26:43.460 We need to understand how to reestablish spiritual trust in the digital age.
00:26:48.440 And if we do that, then, yes, we can have nice things.
00:26:52.120 What's better than a well marbled ribeye sizzling on the barbecue?
00:26:55.020 A well marbled ribeye sizzling on the barbecue that was carefully selected by an Instacart shopper and delivered to your door.
00:27:02.300 A well marbled ribeye you ordered without even leaving the kiddie pool.
00:27:06.760 Whatever groceries your summer calls for, Instacart has you covered.
00:27:10.920 Download the Instacart app and enjoy zero dollar delivery fees on your first three orders.
00:27:15.780 Service fees, exclusions and terms apply.
00:27:18.480 Instacart groceries that overdeliver.
00:27:20.960 So you've mentioned this many times.
00:27:24.780 I guess we can get into it a little bit here.
00:27:27.840 Theology, you know, injecting the spiritual into this conversation seems to be a critical part of the future for you.
00:27:34.740 So I think one of the biggest problems, one of the biggest disadvantages that the right has is that the left, while it's a mystery cult of power in many ways, it does have some kind of stand-in for religion.
00:27:46.360 It does have a unifying world vision, no matter how corrupt that might be, that they're moving towards.
00:27:52.840 However, the right has this vague notion of just returning to something.
00:27:58.160 They don't even know what. Christianity might be somewhat involved in it.
00:28:02.200 But even people who talk about Christian nationalism, there's all these warning bells that go off.
00:28:06.320 People lose their minds.
00:28:07.900 You know, oh, this is an attack on the Constitution.
00:28:10.360 And so I guess the question is, is this a major disadvantage for the right?
00:28:15.000 Because I hear all these people talking about, well, we need to symbol the logos or we have to return to constitutionalism.
00:28:21.180 All these really vague, non-particular ideas of binding spirituality that feel like they just have no hope of fighting against what we're doing now.
00:28:29.540 What would be something that would unify and bring a coherent kind of front to the opposition to the current regime?
00:28:38.400 Yeah, it's a tough question.
00:28:39.880 I mean, you know, look, what has unified America in the recent past is war, the use of technology to create ever more powerful weapons of war, and having that be our source of authority.
00:28:57.260 You know, it's worked out to our advantage, relatively speaking, until quite recently.
00:29:02.760 And I think, you know, hitting that wall, being like, well, wait a minute, like we did create the most technology and we did sort of take over the world.
00:29:10.860 And it was all going to consummate itself, turn the world into America, and, you know, we'd all sort of live happily ever after.
00:29:18.000 That is what people at the elite level who were basically raised by television, formed by that kind of televisual medium that was the most powerful thing on Earth before digital came along.
00:29:28.500 They really did think like, hey, you know, we're building these things.
00:29:31.240 These things are going to be our friends.
00:29:32.360 They're going to work to our advantage.
00:29:33.900 How could they not?
00:29:35.060 We're the most powerful.
00:29:36.100 We have the biggest dreams.
00:29:37.460 We dream the biggest dreams.
00:29:38.500 We followed our passions.
00:29:39.400 We did all those things that, you know, people were taught in grade school before DEI came along, and we built the Internet and we won the Cold War.
00:29:51.000 So it only makes sense that these tools are going to allow us to perfect our domination of the world and sort of, you know, terraform a planet Earth until it's all basically America.
00:30:03.080 We're going to end tyranny.
00:30:04.700 George W. Bush, second inaugural.
00:30:06.120 It all made sense on paper and it made sense in a lot of smart people's minds and it just didn't happen.
00:30:14.080 In fact, quite the contrary.
00:30:16.060 The rules based international order is basically already gone.
00:30:19.600 You know, you see the U.S. trying to find this kind of middle way on Israel, which, you know, might be the correct thing to do.
00:30:25.620 It might be under the circumstances.
00:30:27.240 If you're the statesman and you're trying to figure out how to navigate, sometimes you do need to muddle through.
00:30:32.000 But muddling through is a lot different of a brand than the brand that says we have established rules for the entire world and they're objective rules.
00:30:40.100 And you need to follow them or else the missiles are going to rain down on you.
00:30:44.840 Multipolarity is here.
00:30:46.720 We've got these radical differences, different civilization states, different approaches to technology, different kinds of spiritual authority.
00:30:52.600 And so the U.S. is really in uncharted waters.
00:30:57.640 That's especially hard for the right because the right is divided and it has been divided.
00:31:03.440 It's been divided on religion.
00:31:05.400 It's been divided on what's the best regime.
00:31:08.500 It's been divided on military policy and military adventurism.
00:31:16.680 It's been divided on commercial issues, cultural issues, lots of divides.
00:31:21.140 And that's not necessarily a bad thing.
00:31:23.020 You know, I think people on the right have some justification to pat themselves on the back a little bit for saying, you know, we can sort of discuss things openly and we can debate like basic political considerations.
00:31:35.360 And yes, that's citizenship.
00:31:37.260 Yes, that's a good thing.
00:31:38.140 You don't want to just sort of enforce mental conformity onto everyone who's aligned with you.
00:31:46.260 That takes us back to, you know, why the rule of law seems like a good thing.
00:31:50.120 Yeah, you want sort of like predictability and social interactions to some degree.
00:31:55.840 And it helps economically if it's not, you know, you wake up in the morning and all the rules have changed.
00:32:00.680 Like, obviously, yes, like in that sense, the rule of law is a pretty good thing.
00:32:04.000 But when you start worshiping the law, when you start seeing the law as the thing that will save us, that becomes really difficult.
00:32:11.040 So on the right, there's kind of this, you know, well, what about the rule of man?
00:32:15.920 Perhaps we need stronger men to come along and kind of take charge of things and just kind of tell people what the new rules are.
00:32:23.140 You know, you look back at the historical record and even in circumstances where a regime has collapsed and needs to be rebuilt or when you're coming out of a war and you're just kind of like,
00:32:33.440 well, we need, can someone, anyone sort of take control for at least a minute while we clean up the rubble?
00:32:38.640 Even in those circumstances, the track record of strong men is not super great.
00:32:44.260 And it's basically pretty inconsistent with the way Americans think about the point of life and why we should bother going through the struggles of life.
00:32:54.300 So for me, what that means is we have to look to spiritual matters.
00:32:58.820 You know, we got to look to, I'm a Christian.
00:33:01.060 You got to look to the presence of the Holy Spirit.
00:33:03.720 And if the spirit is not moving among the people, if we're not, if our relationships aren't mediated by God, then what are they mediated by?
00:33:13.460 Well, probably by the worship of something else, whether it's money or weapons or fame or fantasy or anything else.
00:33:23.180 If the spirit is mediating our relationships, if it's flowing through us, then we can be spiritually fruitful.
00:33:30.800 We can take our spiritual treasure that we've been given by God and we can put it into circulation the way we've been asked and perhaps ordered to do.
00:33:38.940 And then we can see the fruits of those things and you can know the tree of our society by those fruits.
00:33:46.660 If we don't do that, what will happen?
00:33:48.860 I think what will happen is we will put all of our energy into trying to simulate that experience, trying to simulate spirit, trying to build an invisible, omnipresent technological entity in the hopes that it can substitute
00:34:08.080 for the lack of the life-giving spirit moving through ourselves and through our relationships.
00:34:15.840 And I think we're starting to see some of that happen already.
00:34:19.080 Technologists are becoming increasingly overtly theological or religious, more openly tech worshiping.
00:34:26.520 That's, you know, you might see something nefarious in that.
00:34:28.980 You might see something awesome in that.
00:34:30.720 Your mileage may vary, but it's to be expected because of these dynamics that we're discussing.
00:34:36.760 If the right in America doesn't kind of have that in some ways literal come-to-Jesus moment, but in other ways, just kind of like recognizing that this is the environment that we're in.
00:34:48.300 There's a new set of rules.
00:34:50.320 They're not rules that come out of a law book.
00:34:52.280 They're not rules that come out of a strong man.
00:34:54.060 They're rules that come out of the kind of environment that we've created for ourselves, like it or not.
00:34:59.440 If the right doesn't recognize that and adjust accordingly, then yes, I think there is going to be a lot more pain and more tears.
00:35:08.520 Are you optimistic about their ability to do that?
00:35:11.280 Because I feel like the core of the right really has been this legalism.
00:35:15.000 It really has been this idea that, well, we just need a constitutional convention.
00:35:19.200 Once we can write one more amendment, if we can just go ahead and find a trick around the Washington swamp, then everything will be set right again.
00:35:29.640 It feels very difficult, and it might be due to the scale.
00:35:33.280 I feel like at some point, the problem is the scale of social organization.
00:35:38.560 The right is looking for one national election to just kind of swoop in or one quick adjustment to a document to go ahead and set everything back to the 1950s.
00:35:50.840 I think it's a lot harder for people to kind of come to grips with the idea that they're going to need to center their lives spiritually in communities, families, regionalities.
00:36:00.420 I think that's a much longer row to hoe, and I think that very few people on the right are in touch with that reality.
00:36:08.380 We're still running around and expecting a new Speaker of the House to make a significant difference instead of understanding that this is going to be something that's a much longer and deeper project.
00:36:19.200 Yeah, I mean, in theory, in principle, in the abstract, I think you're absolutely right.
00:36:24.300 We do, however, have an amendment process.
00:36:27.060 We have a regime.
00:36:29.240 It's not totally dead.
00:36:30.420 We have a constitution.
00:36:31.460 It's not totally dead.
00:36:32.480 We have an amendment process.
00:36:34.240 Perhaps that could help us remain American in a digital age.
00:36:39.140 And so, you know, I support a digital rights amendment.
00:36:41.540 I support, you know, that kind of broad, overarching language of the First and Second Amendments explicitly laying out that, you know, look, America is not going to be America if ordinary people are alienated from our most powerful technologies.
00:36:51.180 We need to be able to put our hands on these things to restore our competence, to restore our confidence.
00:37:01.680 And with that competence and confidence should come spiritual humility.
00:37:06.460 Without that ingredient, then we're just going to fall prey to the same delusions of grandeur that people throughout history, including Americans, have had in the past.
00:37:14.240 And it always ends in tears.
00:37:15.460 So, you know, I do think that addressing in an explicit way the constitutionality and the way that, you know, natural rights that are addressed in the Constitution, these things do not stop at tech's edge.
00:37:28.880 They need to go into that area of life, which is so dominant, and they need to be present there as well.
00:37:33.540 Otherwise, we can say goodbye to the regime that we have.
00:37:37.100 And, you know, God knows what will come after that, but it won't be good.
00:37:41.300 It'll be less human.
00:37:42.180 It'll be less free.
00:37:43.340 It'll be less American.
00:37:44.780 Am I optimistic?
00:37:47.280 There's a lot of talk of tech optimism going around these days.
00:37:50.020 Marc Andreessen, the venture capitalist, dropped a techno-optimist manifesto that's been making the rounds.
00:38:01.240 And, you know, I've gone around with Marc a little bit on this, but thanks to those conversations, I can be a little bit sharper in this conversation.
00:38:09.580 I'm more concerned, really, about the optimism part than about the technology part.
00:38:14.420 I am less worried about having too much technology than I am worried about having too much optimism, because I don't really know what optimism is.
00:38:23.400 You know, you look under the hood, and if in the box marked optimism, it just turns out to be tech worship.
00:38:29.520 Like, well, no, not only is that not optimism, but it's not good.
00:38:32.660 But it's hard to explain what optimism is, and I think it always pulls us into this kind of secular key that is especially unhelpful or even harmful or especially dangerous in a digital age.
00:38:46.520 The worst thing that we can do is to succumb to this temptation to hail technology as our savior, to give in to that kind of giddy experience of like, it's happening, you know, but elevated to more than a Ron Paul level to like a worship of the Antichrist level.
00:39:10.340 This is, you know, people can sometimes start backing away when you start talking about the Antichrist, but it's, you know, conceptually, it's not that bizarre to think that there will be something very enticing, tempting, a feeling of real sort of shared catharsis.
00:39:32.000 That feeling that says, like, we finally did it, you guys, we finally, like, cured what ails us, we left behind all the bad things about our humanity.
00:39:40.080 And, you know, here's our hero, here's the superhero, who is the man who meets the moment of superintelligence.
00:39:49.680 And that figure is someone we can all rally around, we can finally have that unity that we crave, you know, this is it, all our prayers have been answered, right, but it's not going to be Jesus, it's going to be this other figure.
00:40:02.420 And technology can really pave the way for that.
00:40:04.940 And we have to guard against that, because it's not intelligent to do that.
00:40:08.980 It's not smart, it's not clever, it's not wise. It's lying to ourselves and encouraging all of us around the world to be complicit, to be conspirators, in the same grand lie.
00:40:22.740 Um, that's, you know, that's been a bad idea since the beginning.
00:40:26.880 Um, and it's a really bad idea now.
00:40:29.540 So, uh, optimism, I don't really know what that means.
00:40:31.900 You know, I, I think that being hopeful, being prayerful, uh, being discerning, um, not giving in to panic, not giving in to despair.
00:40:39.700 Um, those are all good, and now more than ever, um, if we have all those things, uh, the development of our technology is going to be something that is under spiritual control in a healthy sense.
00:40:53.380 It can grow, it can exist in a kind of harmony that I think reflects and represents the proper harmony between church and state.
00:41:08.060 You know, theocracy is bad because it tries to combine church and state into a single entity, and we know historically that that's not very good, and we know what, that the, the cost of that can be very high, both to, both to the state and to the church.
00:41:21.920 Um, at the same time, you know, having a strict separation, uh, that usually doesn't work out very well either, historically speaking.
00:41:28.560 So, the, the harmonious dance between these two, uh, institutions, it can be messy, it can be imperfect, it can require a lot of patience.
00:41:38.060 It can require a lot of discernment, uh, but welcome to earth.
00:41:41.920 You know, this is not, there's no way to speed run this.
00:41:44.580 We can't do it on easy mode.
00:41:46.320 Um, it is demanding, and what is demanded of us right now is, I think, something that, uh, in some senses is, is simpler, more humble, more straightforward than optimism.
00:41:56.160 But in other senses, yes, is more demanding, requires more discernment, and requires us to get out of our heads and to come down into our hearts.
00:42:04.780 Yeah, and while I hope for, kind of, the discernment that you're talking about, I worry that it's going to be in short supply.
00:42:13.620 You know, the philosopher Nick Land said that the thing about being an accelerationist is that it doesn't really matter what you do, because it's going to happen either way.
00:42:21.180 And I feel like that just seems true.
00:42:24.620 There's, there's a certain inevitability.
00:42:26.300 You know, we look at the way that technology has accelerated since kind of its escape from, kind of, regionalities.
00:42:34.400 And the entire time, humanity has kind of thought about the consequences. I mean, I don't know, it probably predates, you know, Mary Shelley's Frankenstein.
00:42:44.700 But that's kind of the first thing that pops in my mind, where you, you just have this inevitability.
00:42:48.760 You know there's something dangerous around the corner.
00:42:51.060 You know you shouldn't go there.
00:42:52.600 You know that pursuing this is going to have a kind of inevitable bad effect.
00:42:56.760 But, but humanity just seems compelled to chase it no matter what.
00:43:00.580 Yeah, no, we probably shouldn't go ahead and engineer viruses to maximize their lethality or their ability to spread.
00:43:07.440 But we're going to do it anyway, because it's an option.
00:43:09.380 And somewhere, someone is going to kind of cross that bridge no matter what.
00:43:13.100 It feels like the ability of individual humans to discern might exist, but the nature of technology and its ability to kind of spread throughout the globe means that individual discernment doesn't really kind of put any kind of limiter
00:43:26.640 on what technology, uh, what technology will develop and where it will go.
00:43:31.520 Well, there is something to that.
00:43:33.360 You know, my response is there's one thing that moves faster or stays one step ahead of technology, and that's religion.
00:43:40.680 And I think we see that happening around the world right now.
00:43:43.940 No matter how fast technology moves, it's always going to arrive in a place in space and time where worship is
00:43:56.460 already in full swing, different kinds of worship.
00:43:59.100 And yeah, you can have worship of technology itself, but even that seems to me to be kind of just an antecedent, or an attempt to kind of present something as other than what it is.
00:44:15.480 Um, it's funny to me that, uh, uh, the kinds of people who say, you know, it is written, technology is going to eat everything.
00:44:22.460 There's no escape.
00:44:23.560 Like whatever is going to happen is going to happen, um, are also the kind of people who tend to look at like the book of revelation and go like, this is ridiculous.
00:44:31.160 So you can't be sure.
00:44:32.080 Like, what do you mean?
00:44:32.880 This is fated to happen.
00:44:34.000 And it's like, okay.
00:44:35.200 Um, it's, it's pretty clear what human nature is.
00:44:38.340 It's pretty clear what, what the temptations are, uh, and it's pretty clear that, you know, the way of the world is to double and triple and quadruple down on itself and on the idea that, you know, gosh, darn it.
00:44:49.860 We are going to solve our own problems.
00:44:51.380 We don't need to rely on a creator.
00:44:54.180 We don't need to obey.
00:44:55.320 We need to, uh, leverage our way out of everything that sucks about who we are.
00:45:00.760 Um, and, uh, and we know that, um, every time that's been tried in the past, it hasn't worked.
00:45:07.560 Um, and we know that the ultimate attempt by that logic will fail in an, in an ultimate way, but there can be a lot of casualties.
00:45:16.400 Um, there can be a lot of misery and suffering.
00:45:18.780 And, uh, you just look at the, the trend lines on everything from birth rates to T counts to sperm counts.
00:45:27.320 Um, it's going down.
00:45:29.020 There's a deep-seated longing in the human breast
00:45:37.560 for a surrender, a total surrender of responsibility.
00:45:42.460 Um, and in bad times, that leads people to think that they're perhaps best off just exiting life altogether.
00:45:49.880 Um, even just with, with the world, the way it is right now, you know, there's so much energy, uh, around.
00:45:56.900 I mean, look like, yes, let's make America great again.
00:45:59.200 Sure.
00:45:59.880 It would be better if we were great than if we sucked and were just a declining empire falling apart.
00:46:04.020 That's what it seems.
00:46:04.840 Sure.
00:46:05.900 But that longing to go back to being great again, whether it's, you know, the Art Deco or the Norman Rockwell, you're grasping for these signifiers.
00:46:18.800 Um, and there's a lot of energy going into like, no guys, like, we'll just innovate more and then it'll be awesome.
00:46:25.700 We'll have jetpacks, and, like, the NFL will play games on the moon, and we'll just go into the Sphere and U2 will be cool again.
00:46:34.100 It'll be awesome.
00:46:34.560 Like, will it, though? Will it really? I think more and more people are becoming just disenchanted with all of these kinds of attempts to just zap them with more voltage to get them to get up and do stuff again.
00:46:48.480 Um, a lot of people just think that it's all kind of a joke and they are hungering for spiritual reality, for spiritual nourishment that is nowhere to be found in these, you know, optimistic camps about like innovation and dynamism and all that.
00:47:04.180 They are thirsting for spiritual life.
00:47:07.700 Um, and it's looking increasingly like more people are going to turn away from the world in order to find it.
00:47:15.940 Um, I think we're going to see more monasteries.
00:47:18.360 I think we're going to see more monastics, along with, unfortunately, more suicides, more people shutting themselves in their basements with their VR headsets, more efforts to withdraw from the world.
00:47:30.460 Now, in order for us to respond to that deep-seated longing in a way that is constructive and healthy and protects our humanity and restores it, we have to be able to provide spiritual institutions that give people who want to withdraw from the world a way of asserting a healthy kind of authority over our technology.
00:47:51.700 One that the law itself can't do.
00:47:53.440 Uh, so when you hear me running around talking about Bitcoin monasteries, like what's that?
00:47:57.240 Well, that's kind of what that is.
00:47:58.760 It is an alternative to, uh, disappearing into addiction, disappearing into suicide, disappearing into the sphere, disappearing into the VR headset, disappearing into porn.
00:48:09.820 Well, no, you can, if you want to renounce the world, great.
00:48:12.700 Come into the monastery, um, gather your spiritual energies with others and apply that spiritual energy to the shaping of how basic technology is used.
00:48:23.760 I think that's going to become a more important part of life as life goes on.
00:48:27.900 Um, but this is America, people still like to mix it up.
00:48:31.200 Um, there is still a lot of, of restless energy out there.
00:48:33.980 This is a very big country.
00:48:35.440 You look at Europe, there's not a lot of land to go around. Even if population goes down, America is going to be a big place for a long time, I think.
00:48:42.940 And there's a lot of room to roam and try things out and experiment in a practical sense, in the sense of trying to restore the basics, not some potted vision of what the cool future could have been.
00:48:56.320 But really just the basics of, like, okay, you know, the tables have been turned, the hourglass has been flipped around.
00:49:03.700 Um, America isn't what we thought it was going to be.
00:49:06.260 So how do we respond to that reality?
00:49:08.200 Building from the ground up in a way where the spirit can move through us and the things that we do are fruitful and increase our flourishing.
00:49:16.220 Oswald Spengler predicted that the West would walk away from science and technology, that it would grow sclerotic.
00:49:26.220 It would find that it had put itself into these containers and these rules that were simply a straitjacket, so confining that the metaphysical could no longer manifest itself.
00:49:37.260 And so it would need to basically leave those restraints behind by kind of releasing those stringent rules.
00:49:44.600 I also think about somebody like Alexander Dugin, who has kind of said that once you get to the end of the strictly logical present, the idea that, you know, we're postmodern, we're post-liberal, that kind of opens an absurdity up there, lets the spiritual return.
00:50:03.240 And I wonder, is it necessary for people to walk away from these things?
00:50:08.680 Is it necessary for these things to collapse for the return of the spiritual?
00:50:12.460 I would like the world you're describing, where people bring that which was America forward into these things, or that which was, you know... I guess there will always be some aspect of that, but I have a hard time seeing it kind of reform itself into something that allows that spiritual manifestation while still holding on so tightly to what many people have kind of assembled as the American identity.
00:50:38.120 So, yeah, uh, yeah, a couple of things, um, it's hard for me to assess Spengler as thoroughly as some people in my orbit might want me to, just because there's this question of like, well, wait a minute, what is the West anyway?
00:50:56.160 You know, what are we talking about here?
00:50:57.640 Is this, is this a name that applies to an actual thing?
00:51:02.000 I mean, just to take a, a semi-random example, if you're Eastern Orthodox, like what does the West mean to you?
00:51:09.020 Uh, you know, Orthodoxy, you got Russia, you got, uh, Syria, Greece, Romania, uh, Georgia.
00:51:17.900 Uh, like, it's part of Christendom, uh, in a way that goes beyond the boundaries of East and West really, uh, or rises to, um, a higher level of organization than is captured in the sort of East-West dynamic.
00:51:38.740 Uh, so I think we need to be a little bit more precise about, you know, what, what the West means, if we're going to assess, you know, what the West needs to do or, or how the West is going to end up.
00:51:48.440 Um, but I think, you know, you're, you're right to suggest or to hint that as important as that spirituality is, um, it's not enough.
00:51:58.620 You can't just like drive around the country, just like spreading spirituality around and expect things to go well.
00:52:07.100 Like America's always been a very spiritualistic place, uh, great awakenings, cults, uh, utopian communities, heresies, just from every pore, you know, in some ways.
00:52:20.480 Um, and this is, you know, this is like sociological analysis here.
00:52:24.800 I'm not, I'm not here to, to point out people's denominations or whatever and go, you know, heretic, heretic, heretic.
00:52:31.300 Um, that's, uh, for another venue.
00:52:35.000 Um, my point is that there's a lot of talk, um, in our digital age about networks and about the importance of networks.
00:52:44.140 You've got technologists out there saying like, Hey, you know, you can, you can build new forms of governments on the network or use networks to do it.
00:52:52.240 Um, and yeah, there's something to that, but, but more importantly, um, spirituality needs to have a certain kind of institutional ballast or else we know from experience that it spirals off into a bunch of like weird and unhelpful and oftentimes very deeply damaging directions.
00:53:12.100 Uh, or, you know, if it, if it doesn't go that wrong, then at least you just end up with, uh, I mean, when I think of what's the West, I think of religious war.
00:53:20.740 Like the West is a place where the wars are religious.
00:53:23.840 Uh, I, I include Islam in this, you know, you got jihad, you got the wars of religion in Europe and Protestant reformation.
00:53:31.160 Uh, and you can go to America and you say like, well, America has been quite free of religious war.
00:53:36.620 And it's like, well, you know, the Civil War was basically like a religious war.
00:53:41.060 Um, uh, the, the Indian wars, basically religious conflict.
00:53:47.060 Um, and all that craziness with sort of the Mormons getting chased out into the, the Utah desert, um, lots of religious conflict.
00:53:57.240 Uh, and when conflict erupts in the West and in America, it oftentimes takes on a religious character.
00:54:02.360 Uh, not just cause the Bible thumpers are at it again, but because this is something that's, that's deeply rooted in our civilization.
00:54:09.800 Uh, so if we want to somehow find a way out of that kind of trap, um, I think people are going to have to recognize that a, a strong and ancient church is going to be a help.
00:54:25.300 Uh, just sociologically speaking, if you walk away from a strong and ancient church, you are going to have the kinds of problems that are going to make it too difficult for you to get on top of the technological problem.
00:54:38.860 Uh, if your spirituality is dispersed, if you are, um, organizationally weak, if you don't have that ballast of a strong ancient church, uh, then you're probably going to be too divided, too confused.
00:54:52.860 Um, too, uh, too, uh, focused on haggling over interpretive disputes to, um, muster the spiritual authority, um, that scales to a level where you can really, uh, restore some, some basic controls over technological development.
00:55:13.860 Uh, so that's, you know, that's my view of things.
00:55:17.340 Um, I think, you know, it's, it's, uh, it's for some Americans, that's something that they're, they're going to be reluctant to, uh, accept, but you know, look, uh, sometimes it takes a long time for someone to have a religious conversion.
00:55:33.860 Other times it happens very quickly.
00:55:35.760 Um, just the whipsawing events of the past, you know, four, eight years, uh, there's such dramatic change, not just in, uh, in governance, not just in the economy, not just in technology, but in people's spiritual experience.
00:55:51.280 And it's really been profound, especially coming out of lockdowns.
00:55:54.200 Um, things can change very fast.
00:55:55.760 And so what seems to be outlandish or, or, uh, far-fetched today, uh, might become something approaching conventional wisdom tomorrow.
00:56:07.980 Yeah, I, I hear what you're saying and I think there's some merit to it.
00:56:11.660 I would say that America is probably more spiritually Protestant than it is spiritually anything else.
00:56:16.360 Uh, that's more core of the American identity than, than many other things.
00:56:20.860 And, and a discarding of that would probably be a discarding of American identity in general.
00:56:25.200 But, uh, you know, that, that's a longer discussion to be sure.
00:56:28.920 Well, yeah, let me just say one more word about that though.
00:56:32.120 Uh, lots of Protestantisms out there, lots of different kinds of Protestantism out there.
00:56:35.960 Uh, when you look at high church Anglicanism versus, uh, uh, Pentecostalism, I mean, you got Unitarian Universalists running around saying, oh, we're Protestants too.
00:56:45.160 And it's like, well, I don't believe in the Trinity.
00:56:47.160 You know, I mean, it's there, there's so much, uh, variance here.
00:56:51.300 Um, where's the center of gravity in American Protestantism?
00:56:54.760 Um, I think that's a very interesting question.
00:56:56.280 I think that in, in some cases, um, I mean, it's really almost a jump ball.
00:57:01.380 Like you look at the retention rates of even some very successful churches, uh, that are bucketed together, uh, as, as Protestant.
00:57:11.260 I, I think, you know, Latter-day Saints is about 50% retention.
00:57:15.200 Uh, evangelicals have issues.
00:57:17.580 A lot of these churches are fragmenting.
00:57:19.080 Some of them are dying off.
00:57:20.300 Some of them have been woke-ified.
00:57:22.180 It's very fluid.
00:57:23.360 Um, and people are, you know, uh, looking for a lifeline, something that they can grab onto that isn't going to slip through their fingers.
00:57:31.160 Um, so I think, you know, I think the, the religious future of America in that sense is very wide open, perhaps more wide open than it's been in a long time.
00:57:38.360 Um, and, uh, yeah, I think you're right.
00:57:40.620 You know, there is a kind of, uh, uh, tendency toward, uh, uh, an anarchist sensibility with regard to spiritual matters.
00:57:51.260 Uh, but, you know, how's that working for you?
00:57:54.460 Um, I think that's a question a lot of Americans are going to be thinking through.
00:57:58.220 Fair enough.
00:57:59.300 All right, guys, we're going to go ahead and pivot to the questions of people real quick.
00:58:02.300 We got a few over here, but before we do that, James, is there anything people need to be checking out?
00:58:06.220 Where can people find your work?
00:58:08.540 Uh, well, gosh, uh, blazemedia.com slash tech.
00:58:12.940 Um, the show is Zero Hour on Blaze TV.
00:58:15.380 Uh, I, I will, uh, plug a few other things, uh, The American Mind at the Claremont Institute, americanmind.org.
00:58:22.520 Uh, there's a podcast over there called The Roundtable, as well as a weekly publishers and editors podcast.
00:58:28.300 Uh, it's mostly focused on, uh, straight up and down political stuff, but there's cultural and tech stuff in there as well.
00:58:34.500 Uh, the book is Human Forever, uh, canonic.xyz, um, all flavors of Bitcoin, all the major ones anyway.
00:58:40.620 Um, and, uh, what else?
00:58:42.700 I feel like there's something else, but, uh, but it's slipping my mind.
00:58:49.420 Oh, x.com.
00:58:51.520 Yes.
00:58:52.320 How could I forget?
00:58:53.480 At James Poulos, first name, last name, uh, I'm on Twitter, uh, trying to tread lightly, but DMs are open.
00:59:01.280 And of course, guys, make sure you go ahead and check out the new Blaze site, uh, articles from both James and me are over there and, uh, they've redone everything.
00:59:09.600 They've gotten rid of all those really ugly ads so that you can go ahead and enjoy, uh, all of the different articles that they put up there.
00:59:15.800 Of course, they've gotten rid of the problem of demonetization with big tech because you don't have to worry about the ads anymore.
00:59:20.500 But of course, that means they do need your support to make sure that work like ours keeps showing up on the site.
00:59:25.160 So make sure you go ahead and check out the new Blaze, uh, the new site.
00:59:29.620 I think you're really going to enjoy it.
00:59:31.200 All right.
00:59:31.520 Let's see what we've got here.
00:59:34.620 Uh, Maximilian Cunnings for $2.
00:59:37.200 I'm studying worldviews.
00:59:38.600 What advice do you have?
00:59:39.840 Uh, well, man, that's a, that's kind of a wide question.
00:59:42.500 Of course, if you mean religions, there's many different options there.
00:59:45.200 If you're talking about, uh, you know, political systems, uh, I don't, I don't know, uh, James, where would you suggest someone start if they're looking at different worldviews?
00:59:53.580 Yeah.
00:59:53.960 I mean, I would start with what is a worldview, you know, what, what are you really studying and is that really what you want to be studying?
00:59:59.240 So I think, you know, taking some time to just dig into that question, uh, understand, you know, who, what that concept is, where it came from, uh, and whether it's leading you really in the direction that you're trying to go.
01:00:10.100 I think that's a good place to start.
01:00:12.020 All right.
01:00:12.660 Skeptical Panda here for $5.
01:00:14.020 Thank you for the great discussion, Auron and James.
01:00:15.760 I can't always catch live streams because of work, but I listened to the podcast after.
01:00:19.960 And yeah, guys, of course, it's great to catch you during the live streams.
01:00:22.740 It's great to have the camaraderie, have the audience here, everyone talking, having a discussion.
01:00:27.100 But of course you can catch all of these episodes on places like Blaze TV, and you can catch the podcast.
01:00:33.280 Make sure you subscribe to The Auron MacIntyre Show on your favorite podcast platform.
01:00:37.160 So you don't miss an episode.
01:00:39.340 And then we have, uh, George.
01:00:41.780 Hey, Duke here for $5.
01:00:43.240 The transhuman ideologues and the right-leaning tech optimists are traveling the same road and will find themselves in a common destiny.
01:00:51.780 Yeah.
01:00:51.940 I mean, this is obviously a, uh, kind of a famous argument.
01:00:54.820 There's, there's the old joke that, uh, most of Nick Land's arguments are, are like trans catboys or something.
01:01:00.400 So, uh, there's certainly an overlap there, to be sure. Either way, I guess it is, uh, an abandonment of humanity.
01:01:07.280 And that's why I think that James's, uh, work is interesting because of course he's focused on keeping the humanity, uh, first and foremost, when we're talking about tech, not walking away from tech, not, uh, not avoiding the question, uh, not becoming Luddites,
01:01:20.000 but also ensuring that humanity is the one being served by tech and not the other way around.
01:01:28.220 All right.
01:01:29.340 Well, guys, I think that's everything.
01:01:31.120 We're going to go ahead and wrap it up, but thank you, James, so much for coming on.
01:01:35.040 It's been great, man.
01:01:36.400 A hundred percent.
01:01:37.040 And yes, everything you've heard about the new blaze is true.
01:01:39.880 Uh, website's beautiful.
01:01:41.320 The content is, uh, nutritious and delicious.
01:01:44.120 So, all right, guys, well, thank you everybody for coming by and as always, I will talk to you next time.