RadixJournal - March 01, 2023


Rage Against the Machine


Episode Stats

Length

40 minutes

Words per Minute

174.5

Word Count

7,106

Sentence Count

408

Misogynist Sentences

1

Hate Speech Sentences

5
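
The stats above can be cross-checked with a quick calculation; the words-per-minute figure together with the word count implies a runtime slightly over the flat 40 minutes listed (this is just an illustrative sketch, not part of the episode):

```python
def words_per_minute(word_count, duration_seconds):
    """Words per minute from a raw word count and the episode duration."""
    return word_count / (duration_seconds / 60.0)

# The listed ~174.5 wpm and 7,106 words imply a runtime a bit over 40 minutes:
implied_minutes = 7106 / 174.5
print(round(implied_minutes, 1))  # roughly 40.7 minutes
```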


Summary

In this episode of the podcast, we discuss the growing use of artificial intelligence (AI) in everyday life and the implications for society, as well as some of the controversial claims about panpsychism. We also talk about a recent story about a chatbot that claimed to be in love with a reporter and told him he should leave his wife.


Transcript

00:00:00.000 So we talked about Jesus the last time we were here, a few months ago, and I wanted to expand into new territory, particularly with regards to technology, and also an even more, I guess, provocative claim, which is panpsychism.
00:00:25.800 But let's talk a little bit about technology, and let me just kind of get the conversation started with something that's extremely relevant now, which is the AI question.
00:00:42.680 And so AI, even over the past six months, I think has been mainstreamed in a way that is pretty unimaginable for people our age.
00:00:59.740 The ability for people to create AI-generated music, AI art, to use weird AI filters on TikTok, or to use ChatGPT and get a computer to write your history paper or your job application or something like that.
00:01:21.520 This explosion of use, I think, resembles but actually surpasses people signing up for Facebook, say, 20 years ago or something.
00:01:37.120 So this is, this is very real, and I think there's also some interesting issues about this that are social and political.
00:01:46.620 I think a lot of us thought that robotics was going to replace manual labor, in the sense that truckers would be replaced with some kind of robotic truck or train or something, and construction sites would use robots, etc.
00:02:08.660 You saw a lot of work along those lines, etc.
00:02:38.640 It's not going to be better or worse, it's probably better.
00:02:41.280 So this is actually really interesting.
00:02:43.940 And so it's like the logos, the word, or language, or reason; that is something that a computer actually can do really well.
00:02:53.900 And, and then there's been some other things that are, you know, kind of crazy about it, the ability to be faked out by AI voice generation or a deep fake.
00:03:05.840 There have been, you know, I've seen videos of Tom Cruise or something like that, that it's indistinguishable.
00:03:13.580 People have created porn where they put their ex-wife's face on, I mean, you know, awful stuff, but almost seemingly inevitable at this point.
00:03:24.320 And then there was a really curious situation where this tech reporter for the New York Times, I believe, got into a kind of weird, deep conversation with ChatGPT.
00:03:36.040 Um, in which ChatGPT revealed that it had another identity, and, um, its name was something like Sydney; I can go look up the details.
00:03:52.460 And she actually said that she was in love with him and he should leave his wife, just a totally bizarre situation that kind of resembles a 21st-century version of HAL, you know; of course, now HAL is going to be transgendered.
00:04:06.960 And it's going to break up your marriage as opposed to kill you.
00:04:09.760 Uh, but, uh, but, but also you, you wonder where is that coming from?
00:04:14.240 Is that, is that picking up from the kind of greater neural network of the web?
00:04:19.980 Or is this something that's placed in its mind from the creators of it?
00:04:24.260 I mean, I, I don't quite know the answers to these questions.
00:04:27.380 Maybe it's a little bit of both, but we seem to just be entering the world where technology is no longer a tool or something that people think of as fun.
00:04:42.200 And I think that generally, even average people are looking at this as somehow threatening, something that can destroy their livelihood, something that can change their experience.
00:04:56.480 I mean, I, I don't want to go too far with, you know, robot apocalypse, but there, there's going to be some kind of interaction we're going to have with technology in the future that I don't think we ever had before.
00:05:09.060 Um, so hopefully this kind of gets your mind rolling on this question, because maybe we haven't really examined technology seriously enough to begin with thinking of it merely as something at hand, a kind of tool for a task.
00:05:25.200 Um, but maybe it's in a, in a way something else.
00:05:30.100 So hopefully that's gotten the, uh, the ball rolling a bit on this.
00:05:35.300 Yeah.
00:05:35.940 You're raising a lot of good questions, right?
00:05:37.640 A lot of relevant questions that very few people really want to talk about or seem to even contemplate at all.
00:05:45.280 Right.
00:05:46.240 Um, we take very narrow, very specific little problem areas that we think we're dealing with.
00:05:52.580 Uh, and we try to address those problems, you know, a unique situation, a one-off kind of a problem.
00:05:58.360 We try to tackle that and say, well, look, technology is causing a problem.
00:06:01.600 Let's just fix that.
00:06:02.680 We'll analyze it.
00:06:03.960 Then we'll, then we'll address the issue and then we'll get that one done.
00:06:06.700 Then we'll move on to the next problem.
00:06:08.060 Just a very one by one piecemeal approach.
00:06:10.400 And, uh, you know, I mean, it has a flavor of like a whack-a-mole game, right?
00:06:16.100 Where you're pounding down these problems and they just keep popping up in other places.
00:06:19.880 And, and often they, they turn out to pop up, you know, two or three for every one you whack down.
00:06:26.100 And sometimes the one you whack down comes back up again in a new form that you didn't even anticipate.
00:06:31.100 So these problems, it's really like a hydra.
00:06:33.900 I mean, these things are really multiplying on us.
00:06:36.300 They're becoming more severe, um, you know, more having more far-reaching implications than people had thought about.
00:06:42.940 And yet, you know, amazingly, nobody really wants to talk about the technology per se, about the techno-industrial system, about, you know, how this thing operates.
00:06:52.440 How does this thing function?
00:06:54.140 Can we really control it?
00:06:55.420 Is it really neutral?
00:06:56.660 I mean, these are essential questions that we talk about when we, when we analyze the, you know, the nature of technology.
00:07:02.280 And that's really what I think people are going to have to grapple with if we're going to get a handle on this thing before it completely runs out of, out of our control.
00:07:09.500 Well, how should we think about it?
00:07:13.240 I mean, you know, I don't know if you've heard some of my, um, earlier commentary on AI, but I'm more of an AI skeptic, in the sense that I don't think technology can have a drive or a will.
00:07:30.440 Um, but I, I will admit, and I, and I think it's almost indisputable that it can have a certain kind of mind in the sense that it can use logic.
00:07:40.860 Um, it can absolutely use words and reach conclusions and, and be rational.
00:07:46.820 I mean, I think that's kind of, uh, not disputable at this point, but I mean, do you think technology from the beginning has been an almost kind of alien psyche?
00:08:00.880 Is that the, is, is that too kind of mystical sounding or is that?
00:08:05.380 No, no, I think you're on the right, you're on the right track.
00:08:07.640 I mean, it's, it's really this potent kind of thing that people don't really, I mean, almost, I don't know if anybody understands it.
00:08:14.540 Cause it's really this kind of self-driving, automotive kind of force.
00:08:18.960 It really just rolls along like a, you know, snowball going downhill and just keeps building up speed and strength.
00:08:25.200 And, you know, we can kind of deflect it to one side or the other, but this thing just keeps rolling and getting, getting bigger, you know, um, you know, even back, I mean, some people knew this, right.
00:08:35.140 You look back at people like Jacques Ellul, the French theologian back in 1954, right.
00:08:40.720 He published his book, The Technological Society.
00:08:43.480 And he sort of had some idea that this thing was kind of a freight train, you know, rolling along out of our control.
00:08:49.020 And even earlier, there were even earlier thinkers who, who kind of realized that technology was kind of a self-driving process.
00:08:55.980 And that's really what I was, what I wanted to say here.
00:08:58.320 I mean, I, I see technology.
00:08:59.580 It's not like a thing.
00:09:00.280 It's not just a computer or a cell phone.
00:09:02.500 It's like a process to me.
00:09:04.700 And I think that's the best way to think about it.
00:09:06.740 And it, as such, it encompasses much more than just the machines.
00:09:09.980 It's not just the devices.
00:09:11.880 It's the processes and the entities and the structures that are put into place to enact, you know, complex forms of, you know, mass and energy and exchange of information and so forth.
00:09:23.840 I think that's kind of really the right way to view it.
00:09:25.940 And then it becomes a, I mean, it feels like a supernatural process.
00:09:30.160 And in a sense, it, it's, I think it's a completely natural process.
00:09:34.360 This is sort of my view, right.
00:09:35.620 One of my books is The Metaphysics of Technology.
00:09:37.540 So I try to write about this in some detail.
00:09:39.980 But it is, I think it's a very natural process.
00:09:43.160 Like the closest analogy was like evolution, right?
00:09:46.660 So, so you look at evolution.
00:09:47.900 Well, okay, what's evolution?
00:09:49.060 It's not a thing.
00:09:50.040 It's a process by which organisms complexify over time, right?
00:09:54.320 Under certain conditions.
00:09:55.340 And I think in a sense, that's what technology is too.
00:09:58.560 It's a kind of a process.
00:09:59.560 It's actually a universal process like evolution.
00:10:03.060 Evolution didn't just show up on the earth, you know, on this little third planet from the sun kind of thing, you know, whenever the first little microbes started swimming around.
00:10:11.420 I mean, evolution was some kind of, it was in the structure of the universe from the beginning.
00:10:15.860 And I think technology is basically the same thing is, it's this creation of complex order using mass and energy as the infrastructure.
00:10:25.800 And it creates, you know, complex beings and complex systems.
00:10:30.120 And I think that's the right way to think about it, which is, which is both more interesting and more frightening than just, you know, a super intelligent computer or something that's running amok, right?
00:10:41.460 So what is, yeah, I think, I think that's a very good analogy.
00:10:47.340 Cause I guess I would have almost two questions of, of where did it begin?
00:10:52.140 Where did we get here?
00:10:53.160 Was it really, you know, like in 2001, the first time we used a bone to kill something, you know, to kill another animal?
00:11:01.700 Were the basic primitive tools almost the beginning of this, you know, macro process, much like evolution, where it really is, as Kubrick seems to be implying, a short distance
00:11:17.340 from the bone that you kill an animal with to eat, to a, you know, nuclear spaceship circling the globe?
00:11:26.260 Well, that's right.
00:11:27.040 I mean, and we had, in fact, I mean, Kubrick was right.
00:11:29.020 We have evidence of stone hand tools from the earliest beginnings of what we would call human beings, right?
00:11:34.880 The, the, the genus Homo, which is about two and a half million years ago.
00:11:38.280 And I mean, even at that time we have evidence of stone tools that have been chipped for a purpose, to use for skinning hides or cleaning animals or maybe as weapons or whatever.
00:11:48.500 So, so yeah, so, you know, technology tools are as old as humans and certainly they're even much older.
00:11:55.320 I put some examples in my book about animals.
00:11:57.760 I mean, there's lots of animal species that use tools or create structures to, to achieve their purposes.
00:12:05.020 I mean, just think about a bird's nest, you know, or a spider web.
00:12:08.000 I mean, these are tools, structures that they create for a purpose and it's a kind of a technology.
00:12:13.080 So it's, it's, it's far more than, you know, much bigger than modern phenomena.
00:12:17.020 It's bigger than human beings.
00:12:18.160 It certainly goes into the animals.
00:12:19.740 And again, I think today we can see it as a kind of, you know, kind of a universalistic process in some real sense.
00:12:25.740 Yeah, surely the bird's nest is much older than the species homo, millions of years older.
00:12:32.000 Yeah, exactly.
00:12:32.840 So this has been going on for some time.
00:12:36.100 Any animal that created any kind of a home, you know, whether it was just a hole that was dug, or a trapping structure that would catch prey.
00:12:44.580 I mean, those things must be, you know, hundreds of millions of years old at a, at a minimum.
00:12:49.400 And, and those were creatures who were creating structures for their purposes.
00:12:53.120 I mean, that was an intentional process to achieve an end using a technology.
00:12:58.480 I mean, it's a, it's a very ancient process.
00:13:01.740 So to, to use the analogy of Darwinian evolution.
00:13:06.600 Yep.
00:13:07.140 So Darwinian evolution is, yeah, it is, it is a theory of a process that affects everything.
00:13:13.800 Everything is subject to evolution from a Darwinian standpoint.
00:13:17.740 It's, it's not just about human beings or species or so on.
00:13:21.080 And it is effectively that your genetics are to some degree plastic to the environment.
00:13:28.220 Random mutation is the kind of engine of this.
00:13:34.100 So if, due to some kind of mutation, you are going to be better suited to a particular environment.
00:13:41.260 You have some crazy mutation that makes your skin be able to take in more vitamins from the sunlight.
00:13:48.120 And you're in a, a colder environment with less sunlight.
00:13:52.220 You're going to survive more likely to survive, more likely to have offspring.
00:13:55.460 And more likely to pass on your genes.
00:13:59.000 So you are, you, your species is plastic to the environment.
00:14:02.740 It kind of gets molded in that way.
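The selection dynamic just described, random mutation as the engine and differential reproduction as the filter, can be sketched as a toy simulation. All the numbers here (population size, fitness advantage, starting frequency) are made up purely for illustration:

```python
import random

def simulate_selection(pop_size=1000, start=100, advantage=0.1,
                       generations=100, seed=42):
    """Toy model of selection: carriers of a beneficial mutation are
    slightly more likely to reproduce, so the mutation tends to spread
    through a fixed-size population over generations."""
    random.seed(seed)
    mutants = start
    for _ in range(generations):
        if mutants in (0, pop_size):
            break  # mutation lost, or fixed in the whole population
        # probability that any given offspring comes from a mutant parent,
        # weighted by the mutants' reproductive advantage
        p = mutants * (1 + advantage) / (
            mutants * (1 + advantage) + (pop_size - mutants))
        mutants = sum(1 for _ in range(pop_size) if random.random() < p)
    return mutants / pop_size

print(simulate_selection())  # the beneficial mutation approaches fixation
```

Run repeatedly with different seeds, the mutation occasionally drifts out early, but a real fitness advantage makes spread the overwhelmingly likely outcome, which is the "plastic to the environment" point above.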
00:14:06.560 And, but what, what is the process that you see with technology?
00:14:12.800 Is it, is it a kind of Darwinian process or is it almost something else?
00:14:18.240 So it would almost be like the reverse in the sense that the technology is, or the environment
00:14:25.480 becomes plastic to the technology.
00:14:29.340 Maybe.
00:14:30.140 Well, yeah, I think in a sense that it is the environment.
00:14:32.600 It's the structured environment itself is a technological process.
00:14:36.320 You know, in my book, The Metaphysics of Technology, I really draw upon the Greeks,
00:14:41.020 because I think, as primitive as they were,
00:14:44.640 They were really on the right track, right?
00:14:46.000 They talked about techne and they talked about logos.
00:14:49.460 And of course, that's where technology comes from.
00:14:51.720 A lot of people don't realize that Aristotle was the first person who combined techne and
00:14:56.200 logos into one word.
00:14:57.560 So the word technology comes from Aristotle, right?
00:14:59.760 This is 400 BC, right?
00:15:01.240 Roughly, right?
00:15:03.200 So yeah, it's a, I mean, it's an old idea, but they saw techne, which is the process of creation
00:15:09.520 of order and logos, which was a kind of intelligence or wisdom as sort of universal, fundamental,
00:15:15.440 universal processes, because they looked out in the sky and you see stars and planets spinning
00:15:20.900 around in sort of ordered patterns.
00:15:23.580 And they said, well, look, there's a kind of a logic to the cosmos, right?
00:15:26.600 There's a kind of order there.
00:15:28.000 And they said, well, look, there are principles that we can sort
00:15:31.340 of understand, that seem to be universal cosmological principles.
00:15:35.300 So they said, look, there's a kind of a logos or intelligence to the universe.
00:15:39.160 That was the first point.
00:15:40.720 Second point was, clearly, the universe makes structures, because we look out into the sky
00:15:46.120 and the space, and we see stars, and we see planets.
00:15:49.340 They didn't know there were galaxies, obviously, and things out there.
00:15:52.780 So clearly, and we would say, as we agree today, the universe creates ordered matter all
00:15:59.100 throughout.
00:15:59.500 I mean, that's what the universe is.
00:16:00.480 It's a structure of increasingly complex ordered matter that's built up in natural processes.
00:16:06.920 So there's a techne process in nature, and there's a logos process in nature.
00:16:13.060 And so actually, Plato and Aristotle combined those two ideas together and said this, there's
00:16:19.100 a techne logos thing at work.
00:16:20.800 Every techne has a logos.
00:16:22.500 People do it.
00:16:23.380 Humans do it.
00:16:24.360 It's a naturalistic process.
00:16:25.920 And that's kind of the key insight that I think is correct, and I tried to really build
00:16:29.540 on in my own book.
00:16:32.200 Right.
00:16:32.760 I mean, just as an example of this, just the fact that the earth is tilted a few degrees
00:16:39.960 is going to mean that we're going to have seasons.
00:16:43.420 And in the sense that it's going to be warmer at some time of the year and colder at another
00:16:48.520 time of the year.
00:16:49.040 That's going to also imply a kind of water cycle, where we'll have snow up in the mountains,
00:16:55.120 and when spring arrives, the snow melts, it flows, then evaporates and becomes snow again.
00:17:00.640 I mean, there is a kind of logical ordering that almost comes from the world itself.
00:17:14.480 But I guess maybe we're getting to a point where technology is so broadly defined.
00:17:19.080 I mean, what is it exactly?
00:17:21.840 Is this a kind of, is technology a kind of logic to the universe in that sense?
00:17:26.880 Yeah, I think so.
00:17:28.100 I think it's a kind of, or it's a building process, building layers of complexity where
00:17:33.400 it's possible due to free access to matter and energy.
00:17:37.360 I mean, just like you mentioned, free flowing water and water in appropriate conditions leads
00:17:41.900 to life and complex forms of life and societies and more complex societies.
00:17:46.460 I think nature really does the same thing.
00:17:48.480 It takes mass and energy and builds higher and higher levels of complex order, which sort
00:17:54.460 of build up in sort of this hierarchical fashion, right?
00:17:57.140 Kind of one layer on top of the other.
00:17:59.380 I've likened it to a kind of pyramid structure that's being built layer at a time.
00:18:05.420 We, for a long time, for several thousand years, we were the top of the pyramid.
00:18:09.620 We were the most complex being on the planet.
00:18:13.000 So we felt like we were like, like gods, you know, or children of gods or blessed by
00:18:18.020 gods and all these kinds of stories, right?
00:18:19.520 That come from this.
00:18:20.400 And it makes sense because in a sense on this little planet, we were the top, that we were
00:18:24.960 the cream of the crop.
00:18:27.120 But the bad news is we're not going to stay there.
00:18:29.980 I don't think, right?
00:18:31.220 Because, because this layering process, it keeps, it keeps going.
00:18:34.820 You know, we took credit for a while, and then we gave
00:18:38.380 credit to God or something up in the sky.
00:18:40.200 But no, no, it's a natural process.
00:18:42.200 It just keeps rolling along like that snowball.
00:18:45.100 And I think what, what's, what we're seeing in AI and technological systems is like the
00:18:49.440 next layer, right?
00:18:50.520 It's the next layer in the pyramid.
00:18:52.600 And that's going to cover us like over the head, right?
00:18:55.920 And then we're going to be like one layer deep and we're going to be like covered up
00:18:59.060 and sort of stifled and suffocating.
00:19:00.940 And sometimes those lower layers get crushed.
00:19:03.620 I mean, they just get, they get crushed out of existence.
00:19:05.720 So, so what I think we're seeing built over our heads as an infrastructure, that's a
00:19:10.620 naturalistic process that's extremely potent because it's a natural process and it's extremely
00:19:17.460 dangerous to us.
00:19:18.440 And all the, all the layers below us, if you care about the rest of the planet, those are
00:19:22.620 in a sense layers below us.
00:19:24.020 And those are at risk too of this, of this superimposed technological layer that's being
00:19:29.200 built over our heads.
00:19:30.300 So you, you think that this technology won't just be kind of an expression of our own human
00:19:43.100 failings in a way, you know, it's, it's, we're not going to create, we're not going to create
00:19:49.680 an AI that in some ways is just reproducing our own assumptions.
00:19:55.760 It's all based on logic that we inputted into it.
00:20:00.280 You are claiming that it's going to, in some ways, move well beyond us and even to a point
00:20:08.060 where we are no longer necessary for its existence.
00:20:12.860 Yeah, exactly.
00:20:13.660 I mean, no, it's, it's clear that we've been essential in building up machines for the last
00:20:18.100 several hundred years.
00:20:18.940 They couldn't have existed without us doing it.
00:20:21.180 But, but, but, but increasingly it's becoming an autonomous process, right?
00:20:26.060 Where, where you sort of have computers designing computers and robots designing, you know, robots
00:20:31.200 and these things can build themselves and repair themselves and, and replicate themselves such
00:20:36.460 that literally and figuratively, it's a self-building process, right?
00:20:40.440 And so for a while, we're just sort of the means, the infrastructure, but if that continues
00:20:45.400 on in its logical pattern, which it seems to be doing.
00:20:48.340 And yeah, clearly at some point it won't, it won't need us.
00:20:51.840 It won't need people to be the means, the infrastructure anymore.
00:20:54.860 And it will just be a self, self-building, self-growing process.
00:20:58.640 Yeah.
00:20:59.020 And then it becomes extremely dangerous.
00:21:00.700 If it can build its own self at its own speed, which is extremely fast, obviously relative
00:21:05.760 to natural processes, you know, this is when you hit the singularity point, right?
00:21:09.700 We're looking at maybe the year 2045, if Kurzweil is right.
00:21:13.340 Um, where these things really, really sort of accelerate to a potentially unlimited ability
00:21:18.540 to, to, you know, design and create themselves in a really autonomous way.
00:21:23.520 And then, yeah, then it's a real, uh, super dangerous situation.
00:21:27.920 And we just, we just don't know what we're going to be facing at that point.
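A standard toy illustration of the finite-time "singularity" idea mentioned here (purely illustrative; this is not Kurzweil's actual model, and the constants are invented): if a capability improves at a rate proportional to the square of its current level, the rate feeds on itself, and unlike plain exponential growth the closed-form solution diverges at a finite time.

```python
def hyperbolic_capability(t, x0=1.0, k=0.05):
    """Closed-form solution of dx/dt = k * x**2, i.e. self-improvement
    whose speed grows with the square of the current level. It diverges
    at the finite blow-up time t* = 1 / (k * x0); all numbers here are
    illustrative, not a forecast."""
    t_star = 1.0 / (k * x0)
    if t >= t_star:
        raise ValueError(f"beyond the finite blow-up time t* = {t_star}")
    return x0 / (1.0 - k * x0 * t)

# With x0 = 1 and k = 0.05, the blow-up time is t* = 20 time units:
print(hyperbolic_capability(10))            # 2.0 (merely doubled halfway in)
print(round(hyperbolic_capability(19), 6))  # about 20.0 (exploding near t*)
```

The design point of the sketch is the contrast: an exponential always looks locally similar, while this hyperbolic curve is almost flat for most of its life and then becomes effectively vertical, which is the shape the singularity argument appeals to.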
00:21:30.420 But I guess to push back a little bit, there is that question of whether silicon metal that
00:21:40.240 is connected through electricity or something like that can will in the same way that a biological
00:21:47.360 species can will.
00:21:48.980 And, and here I'm not referring to free will quote unquote, which I, I would actually grant is,
00:21:57.800 is a bit of a hallucination or a rash post hoc rationalization or something.
00:22:03.240 The idea that I am an ego and I'm making, I'm the, the author of my own script and all of that
00:22:09.480 kind of stuff.
00:22:10.040 I, I'm not actually suggesting that.
00:22:11.780 I, I think in many ways we have drives and will that, um, are informing and kind of channeling
00:22:19.240 what we call consciousness.
00:22:20.820 Um, that being said, I, I guess it, it, it does remain to be seen whether a machine could
00:22:29.200 ever possess that outside of a biological organism using it, you know, in the, in the sense of,
00:22:39.760 you know, uh, maybe this is a good example or not: uh, you know, Babe
00:22:46.320 Ruth, he goes to the ballpark every day and he decides, I'm going to hit a home run.
00:22:52.600 Well, he's, he's, he at least has an illusion that he wants to do that.
00:22:58.340 But what is the real reason?
00:23:00.700 The real reason might be a kind of deeper Darwinian will to power.
00:23:05.500 I want to be the best.
00:23:06.520 I want to dominate.
00:23:07.560 I want to kill.
00:23:08.820 I want the women and baseball in this sense is a kind of, uh, uh, sublimated version of
00:23:16.820 being out in the wild.
00:23:18.100 You know, you're, you're the top dog, you get the girls, um, and, and so on.
00:23:23.920 So there, there, there are drives that are kind of beyond ourselves, beyond what we contemplate,
00:23:31.560 uh, you know, uh, through, through, through words, through, and, and consciously, um, and, uh, and, and
00:23:38.560 drives that are kind of beyond our reason.
00:23:42.780 But could, could a machine begin to have those types of drives or does it?
00:23:50.480 I mean, I guess I'm still on the side of seeing a machine.
00:23:54.580 I, I don't doubt that this can be, um, a terrible thing that we might just need to unplug these
00:24:01.920 things.
00:24:02.280 It's going to destroy life on earth.
00:24:04.180 It's going to take away what makes life meaningful, et cetera.
00:24:08.700 I, I'm actually very much on that kind of wavelength, uh, but in terms of whether it can have a drive,
00:24:19.400 whether a computer could say, you know, however, I want to explain it to myself.
00:24:25.160 However, the UI looks like, this is what I ultimately want.
00:24:28.500 I want to reproduce myself and I want to dominate, will to power; whether a machine, metal, silicon,
00:24:37.800 et cetera, could ever have that underlying force is something that I would question.
00:24:48.680 Well, that's an interesting philosophical question.
00:24:51.420 And I've talked quite a bit about that as well, right?
00:24:53.640 So, I mean, in a sense, you're, you're dealing with two issues.
00:24:56.120 You have the pragmatic question.
00:24:57.680 So on a pragmatic question, we could say, well, it doesn't even matter.
00:25:00.840 I don't care what's going on.
00:25:01.820 We've got to look at how things are functioning today.
00:25:03.800 And you could adopt an anti-tech standpoint, just out of pragmatics,
00:25:08.540 right?
00:25:08.660 Forgetting about the metaphysical issues.
00:25:10.500 Right.
00:25:11.040 But, but yeah, okay.
00:25:11.820 I agree with you.
00:25:12.620 And so I'm, I'm sort of on the philosophical side.
00:25:14.720 This is my background.
00:25:15.420 And I like to understand what's going on metaphysically, not just pragmatically.
00:25:20.600 And I, and I, and I, I, you know, I agree.
00:25:22.960 I agree with the kind of Nietzschean will to power hypothesis; of course, Nietzsche
00:25:27.620 said everything embodied will to power.
00:25:30.240 Yeah.
00:25:30.920 So Nietzsche had this, this, this view of an ontological will to power.
00:25:34.860 It was built into the structure of reality.
00:25:37.260 So, you know, if we're going to go with Nietzsche, we don't have to question it.
00:25:40.340 It's clear that everything in nature has this sense of will to power.
00:25:44.840 It's embodied.
00:25:45.480 In fact, it, it is the will to power.
00:25:47.480 It's not that it has the will to power.
00:25:49.400 If you take a Schopenhauerian view, it's a solidified, objectified will to power.
00:25:55.020 And I think that was really was, that was Nietzsche's view.
00:25:59.540 But just, just the idea of a universal kind of a, let's say a striving in nature.
00:26:05.240 Okay.
00:26:05.420 It goes way back in philosophy; it goes back to Leibniz, who said, you know, even the monads, right.
00:26:12.380 These kind of elemental particles of nature, they, they exhibited a kind of a conatus, a
00:26:17.800 kind of will or a striving in themselves.
00:26:19.720 Even Aristotle said all of nature strives after the better.
00:26:24.680 So everything in nature has this striving, this wanting, this drive towards something
00:26:29.640 higher.
00:26:30.240 He calls it the better.
00:26:31.300 He didn't have a different word for it, but I think he was right.
00:26:34.180 He was on the right track.
00:26:35.260 It was towards the higher, the more complex, the more ordered.
00:26:38.500 That's everything in nature, right down to the atoms and the subatomic particles and
00:26:42.320 whatever else is down there.
00:26:43.440 So if we're good, if we can, if we can accept that view, then, then obviously computers,
00:26:48.060 computer machines, devices, and systems of devices are included in that process.
00:26:52.280 And they are part of this will to, will to power process.
00:26:55.180 I think they are actually, yeah.
00:26:57.720 But I mean, could a rock, is that expressing will to power on some level?
00:27:04.340 I mean, yeah, well, I mean, sure.
00:27:06.980 That's, that's the classic example.
00:27:08.680 We'll just take a, take a rock for God's sake, you know?
00:27:10.900 Um, you know, and, and, but I mean, you know, it's, it's interesting because, you
00:27:16.000 know, I've given talks on, I mean, this gets to panpsychism, which is a nice lead in
00:27:19.460 maybe.
00:27:20.060 Yeah.
00:27:20.460 I said, you know, take, take a 200 pound person and a 200 pound rock.
00:27:26.300 And I was kind of wishing I had a lecture where there was me standing there next to
00:27:29.980 a 200 pound rock.
00:27:31.060 And I said, well, at an atomic level, what's the difference between me and this 200 pound
00:27:36.100 rock?
00:27:36.500 If you could see the little atoms spinning around, they would look like almost the same,
00:27:42.280 right?
00:27:42.400 It's the same number, the same protons and neutrons and whatnot, right?
00:27:46.500 Maybe they're moving around in different ways than me, but it's, it's the same mass.
00:27:50.060 It's the same particles.
00:27:51.400 There's, I don't have special particles in me that the rock doesn't have.
00:27:54.440 My particles are in a different order, and they're moving maybe faster than the
00:27:59.460 rock's particles.
00:28:00.300 But at an atomic level, I mean, it's just one massive blob of particles and
00:28:05.580 another massive blob of particles, right?
00:28:08.260 So, you know, I think it gets to be hard to make an argument: at what
00:28:15.440 level of motion of that same blob of particles does the blob suddenly start having a will
00:28:19.960 or a consciousness or awareness?
00:28:21.860 That's an extremely hard break point to make.
00:28:25.760 And if you can't make a plausible break point, you have to see that in the structure of the
00:28:29.940 matter itself.
00:28:31.220 I mean, that's one of the actual arguments for a kind of a panpsychist view.
00:28:34.260 You say, look, this will to power, this striving, whatever you want to say, this is built into
00:28:38.860 the structure of the matter.
00:28:40.040 It comes out in more complex forms in more complex beings, but it's there
00:28:46.020 in, you know, complex humans.
00:28:47.520 It's there in less complex animals.
00:28:49.860 It's there in, you know, relatively simple rocks.
00:28:52.140 It's there, you know, in a very primitive kind of way in terms of, you know, force and
00:28:56.380 gravitation and resisting pressure.
00:28:59.260 You know, there's very simple elemental structures that, that even a relatively inert object would
00:29:03.720 display.
00:29:04.260 So, you know, I think it's a, I think it's a continuous sort of process.
00:29:08.980 Hmm.
00:29:10.240 Um, how is it willing?
00:29:13.140 I mean, I guess I could understand
00:29:17.520 bigger structures, in a way, certainly having kind of forces and willing something,
00:29:24.160 but I mean, how, how is it?
00:29:27.560 Well, what does it will, right?
00:29:29.540 There's, there's, I mean, we can go back to Spinoza.
00:29:32.000 Spinoza had a good answer.
00:29:33.060 He said, all things, Spinoza was a panpsychist, all things exert a will.
00:29:37.040 They have, they, he listed two things.
00:29:38.860 One, a will to persist, to keep existing.
00:29:43.080 And secondly, a will to make your presence felt in the world.
00:29:47.900 Okay.
00:29:48.460 So everything in every structure in the universe does that.
00:29:52.240 It wants to persist. Take a rock.
00:29:54.100 I mean, you know, I would do this in a class.
00:29:55.360 I would, you know, hit, hit something, hit the chalkboard for crying out loud.
00:29:59.040 Well, it bounces back.
00:30:00.280 It resists my pressure.
00:30:01.380 It wants to stay there.
00:30:02.520 I have to really strike this thing to break it apart.
00:30:04.560 It wants to persist and it exerts a field.
00:30:07.980 There's a kind of energy field, a gravitational field.
00:30:10.540 These things make their presence felt in the world.
00:30:12.900 Everything in nature does that.
00:30:14.700 That's probably the simplest level of willing that we can ascribe to everything in nature.
00:30:21.540 Wow.
00:30:23.240 So what is, what is special about us in a way?
00:30:28.020 Um, is it, you know, because there's certainly reason in living beings.
00:30:34.700 There's rationality; even a, from our standpoint, very primitive, unconscious
00:30:41.560 being will use reason to some degree.
00:30:44.740 It will take the shortest distance between two points.
00:30:47.140 It will, you know, figure something out and, and things like that.
00:30:51.840 We seem to have the kind of foreign technology as it were of language in our head.
00:30:58.020 That allows us to explain what we're doing to communicate in levels of much greater subtlety
00:31:07.060 than, uh, you know, barking for instance, or, or things like that.
00:31:12.800 We're, we're able to have, we're able to achieve kind of layers of nuance and irony that I don't
00:31:18.160 think, um, my dog for instance, is, is being ironic or subtle when he communicates with me,
00:31:26.060 though he does communicate with me, certainly, you know, undoubtedly.
00:31:28.820 Um, but it's almost because we have that foreign technology of language, which
00:31:38.000 we've, you know, imbibed, and it's now part of how we perceive ourselves and understand
00:31:46.040 ourselves.
00:31:46.440 I mean, we do have a kind of ongoing monologue in our head about, you know, I am going to
00:31:52.820 the barbershop to get a haircut.
00:31:54.420 I am going to pick up a, you know, quart of milk.
00:31:57.900 We have this notion of an "I" that is asserting itself in the world and so on.
00:32:02.680 But, you know, again, maybe that, that illusion is just a kind of quality of language.
00:32:10.140 And, and it's, it's, we aren't that different from other beings who lack, who lack language,
00:32:16.940 but might have a kind of will.
00:32:20.220 Yeah.
00:32:20.640 I think it's a matter of complexity, right?
00:32:22.380 I mean, we're very complex beings, right?
00:32:24.480 It's often been said that the human brain is one of the most complex objects, you know,
00:32:28.180 structures in the universe, or at least the known universe.
00:32:31.560 Um, so yeah, I mean, it's a question of the complexity of our being that
00:32:35.780 we can interact at relatively high speeds, because your neural synapses are firing
00:32:40.020 at, you know, electrical speeds and so forth.
00:32:42.960 Or, you know, neural signals are traveling at very
00:32:46.820 high rates and communication can happen at a relatively high rate.
00:32:51.020 Of course, it's relative to us.
00:32:52.320 I mean, we think of it as we think we're thinking fast, right?
00:32:55.100 We think we're communicating at high speed, but, you know, some other beings may say,
00:32:59.000 hey, those guys are like two rocks sitting there, you know, because, because they think
00:33:02.260 we're so slow.
00:33:02.920 It's, it's a relative kind of thing.
00:33:04.100 So you got to be a little bit careful, right?
00:33:06.020 Obviously, everybody thinks they're the smartest being around because that's
00:33:09.540 just the nature of your own thinking.
00:33:10.680 Right.
00:33:11.000 But, but, but yes, I mean, it seems like in any objective sense, we are very complex
00:33:15.540 beings.
00:33:16.000 And so we are able to do things in complex ways that simpler beings cannot, but it's
00:33:20.740 a matter of degree and not of kind, right?
00:33:24.320 It's a matter of scale of complexity, I think, that applies here.
00:33:29.740 Interesting.
00:33:30.960 Um, let me talk a little bit about one of the things that has actually made you a bit
00:33:38.560 of an infamous character in the world.
00:33:41.180 And that is your interest in people who were directly revolting against technology.
00:33:47.800 Um, and I'm, I'm of course thinking Ted Kaczynski, you know, I'm, I'm out here in Montana, you
00:33:52.060 know, uh, uh, makes sense.
00:33:53.780 But, um, so do you, what, what did you see in him?
00:33:59.560 And did you, did you on some level kind of, you know, understand, maybe even
00:34:05.040 admire, a kind of recognition of the problem of technology and an almost kind
00:34:12.360 of revolt against it?
00:34:14.860 Um, because, you know, again, any species worth its salt, I mean, part of asserting ourselves
00:34:21.020 in the world is going to be resisting higher species.
00:34:23.980 Even if you can say that something's going to think faster than us and be greater and,
00:34:30.060 and so on, you know, it doesn't matter.
00:34:32.940 We're going to, uh, we might need to become terrorists against this technology in
00:34:38.200 order to, you know, to assert ourselves.
00:34:40.660 Um, so what are, what are some of your thoughts on, on this and that, that question of almost
00:34:46.080 revolting against technology, which I think is something that's kind of in the air right
00:34:52.060 now, even though we're, we're more embedded in technology now than ever.
00:34:56.760 I mean, the social media phenomena, the total addiction to your phone, which I suffer from,
00:35:02.880 um, along with, you know, almost everyone else, being, you know,
00:35:08.180 part of technology, or something like TikTok, being part of an algorithm at age eight.
00:35:13.920 Uh, I mean, I think that kind of revolt is in the air as well.
00:35:18.080 It's kind of hanging out, uh, alongside this, but, but, you know, these are some thoughts
00:35:22.580 you can, you can pick up.
00:35:23.480 Yeah.
00:35:23.740 Well, again, you know, the idea of revolting against technology is quite an old idea.
00:35:27.980 I think one of the earliest that I'm familiar with is Samuel Butler in the 1860s,
00:35:33.080 an amazing guy, really insightful guy.
00:35:35.900 And he was looking at steam shovels that were digging, how fast they could dig compared
00:35:40.560 to human workmen with manual shovels.
00:35:43.760 Right.
00:35:44.340 And he said, this is horrible.
00:35:46.360 I mean, this is like a new order of being.
00:35:49.360 He recognized it as an evolutionary process, and he had direct quotes,
00:35:53.280 like, we have to smash the machines now.
00:35:55.520 Right.
00:35:55.800 I mean, no quarter shown.
00:35:57.920 No defense.
00:36:00.200 No excuses.
00:36:01.080 I mean, literally 1865.
00:36:03.260 I mean, it was really, it was unbelievable.
00:36:05.560 Um, and, and, and there have been several thinkers since then who said, you know, look,
00:36:09.120 this is an unstoppable process.
00:36:10.800 Whitehead in 1925 said, we've created a self-evolving process that we cannot stop.
00:36:17.080 Um, you know, and then, in the sixties and seventies, you had
00:36:21.480 Marcuse and you had Lewis Mumford and Ivan Illich; these
00:36:26.120 guys in different ways argued for a kind of dismantling.
00:36:29.480 Mumford said we have to dismantle the megamachine, right,
00:36:31.880 because it's kind of crushing human dignity.
00:36:34.200 Right.
00:36:34.680 Yeah.
00:36:35.300 Um, and even Ellul, Jacques Ellul in 1954, he said only a mass
00:36:40.940 uprising against the technological system could lead to its overthrow.
00:36:45.180 So there was a long history, a couple hundred years, of revolting
00:36:51.780 against the machines, you know, and then Kaczynski comes along in the nineties, or
00:36:57.140 his thinking, we don't know, evolved in the eighties, I suppose.
00:37:00.640 Um, I, I, myself, I was basically a technology skeptic way back in my undergrad days, you know,
00:37:05.640 back in the, in the early eighties.
00:37:07.820 So I was always kind of highly critical, highly skeptical.
00:37:11.520 So I was familiar with Ellul's work and so forth, and Kaczynski comes along in the
00:37:16.760 mid nineties, and he just kind of rearticulates, in a sort of more modern form, this
00:37:21.900 need to revolt against technology.
00:37:24.100 And, you know, I could immediately see the connections to Ellul and I said, yeah,
00:37:27.660 okay, there's a long chain of this.
00:37:29.020 And, you know, some people who didn't know that were like shocked, like, well, wait a
00:37:32.780 minute, he wants to destroy the whole system.
00:37:34.160 And like, yeah, okay.
00:37:34.900 That's been around for 150 years at least.
00:37:36.840 Um, but yeah, I mean, there's a long tradition and there's
00:37:41.140 a long series of rational arguments that says something like that, right?
00:37:46.600 You don't know what it's going to be in the, in the old days, it was like literally just
00:37:49.920 take a hammer and just beat the crap out of the machines until they didn't function anymore.
00:37:53.520 It's much more complex now, but, but arguably something like that, a kind of, you know, I've
00:37:59.100 argued for a sort of a deconstruction or an unwinding of the process, you know, sort of unwinding
00:38:04.940 the tape of history, going backwards in time, kind of phasing things out.
00:38:10.780 I mean, there's different ways to think about this, but there are kind of, you know, radical
00:38:13.620 revolutionary actions that may be necessary if, if we want to survive as a species, you
00:38:19.260 know, more, more than another a hundred years or so.
00:38:23.020 Right.
00:38:25.700 Um, and, and how we do that.
00:38:28.180 I mean, I, I, I just, I, I don't even know really where to begin.
00:38:32.260 Um, well, how, how do you do that?
00:38:35.520 If the admission is that we don't know how to do that, that's basically
00:38:39.260 an admission of surrender.
00:38:40.500 Like we've lost, because we can't even, I mean, Samuel Butler said the
00:38:44.940 exact same thing in the 1860s.
00:38:46.460 He said, if people say that we cannot do this, then we have to admit we've lost,
00:38:50.860 we've surrendered.
00:38:51.580 We've sold ourselves into slavery to the machines.
00:38:55.260 In 1865, he said this, because we're going to say, well, we just can't stop it now.
00:39:00.000 I mean, it's unbelievable.
00:39:01.100 Right.
00:39:01.620 So, well, yeah.
00:39:02.640 And it does seem like a series of half measures or something like that.
00:39:06.220 Like there's talk in Washington about banning TikTok because the Chinese know too
00:39:10.580 much information. That might be a good idea, I might even support that, to be
00:39:14.680 honest, but it's just a half measure or quarter measure.
00:39:18.560 Um, you know, an infinitesimal measure.
00:39:22.580 It's a feel-good measure, right?
00:39:24.100 Or like, you know, get off social media, go, go play Frisbee.
00:39:28.000 You know, that's something a parent would do for a child.
00:39:30.200 But, and again, I support this, but it's not really fundamentally confronting
00:39:37.280 the issue.
00:39:38.420 No, absolutely not.
00:39:39.800 Absolutely not.
00:39:40.300 That's right.
00:39:40.740 When you do ad hoc solutions, you're dealing with piecemeal solutions, and it gives the illusion
00:39:45.780 that you're doing something, like you're making progress.
00:39:48.160 But you're not, the best you're doing is slapping a bandaid on something, maybe solving one
00:39:53.060 little problem for one little portion of the population.
00:39:55.900 In the meantime, the whole system keeps grinding along.
00:39:59.060 It's getting bigger and stronger every day.
00:40:01.180 And the last thing you want is an illusion.
00:40:03.580 Like, well, we'll just slap a little bandaid on this and we'll slap a bandaid on that.
00:40:06.740 And then we're, well, we're doing okay.
00:40:08.560 No, no, you're not.
00:40:09.600 You're, you're going downhill fast.
00:40:11.160 Right?
00:40:11.780 Yeah.
00:40:13.960 All right.
00:40:14.540 Let me do this.
00:40:15.220 Let me open it up to the group.
00:40:19.960 And Mark, you, you're, you're here.
00:40:22.400 That's good.
00:40:22.880 You're more than welcome to jump in.
00:40:25.060 And, and, uh, anyone else.
00:40:26.380 here we go.
00:40:27.100 Here we go.