Confronting Technology - David Skrbina
Episode Stats
Length
1 hour and 2 minutes
Words per Minute
126
Summary
In this episode, we discuss the impact of technology on our world and its implications for the future. We talk about the role technology plays, how it affects us, and why we should be worried about it.
Transcript
00:23:35.960
so it's really very very uh i would say very clever in how it works it it entices us with
00:23:41.160
all these little benefits and advantages but the collective effect the the net result of the whole
00:23:49.160
system of technological uh developments leads to kind of a um yeah kind of a a system where
00:23:57.960
you're trapped into it you're you're you're you're drawn in in and you're you're compelled
00:24:03.960
to use this system you become part of the system you have to promote the system you have to sustain
00:24:08.680
the system uh almost you know beyond your attention and all you say is well look i just wanted one
00:24:13.960
little one little advantage i just wanted to use my cell phone i just want to be able to drive down
00:24:17.880
to the store i you know i would just want to be able to use a tool to make you know some food or
00:24:22.520
i want to cook some bread you know i mean it's just like well i just have simple little things
00:24:26.600
but these are part of a huge system which has to be sustained for your little benefits of your
00:24:32.200
you know your car or your cell phone or your email things just seem like very trivial what's
00:24:36.280
wrong with that kind of stuff right in an individual case there's really nothing wrong with it but
00:24:41.320
collectively the the net largest scale effect is really really a disaster so that's really this
00:24:47.400
interesting sort of um uh you know dichotomy that's going on that really has a hard hard time
00:24:53.640
for people to really grasp because it looks like individual things seem handy and convenient and
00:24:58.200
and fun and life-saving but the whole system is kind of just driving us into the ground it's
00:25:03.400
driving the planet into the ground it's driving humanity into the ground it's really really uh
00:25:08.440
yeah really a disastrous scenario here and then the more complex it gets the weaker each link in that
00:25:15.000
chain is much more dependent right or how do you how do you put it if if one thing breaks now the
00:25:20.840
whole thing could fall apart essentially like a very advanced machine in and of itself like you could
00:25:25.880
look at the well the collapse during the bronze age or something like that right you have an intricate
00:25:30.600
system of trade and dependency on different uh you know kind of empires or old civilizations at that
00:25:36.600
time and then one falls which leads to the fall of all of them essentially i kind of liken it to
00:25:41.560
us crawling longer and longer out on a branch that's getting thinner and thinner of sorts
00:25:46.440
and if that breaks does anyone know how to actually return back to it to go back to like
00:25:51.400
does anyone even know how to start a fire anymore you know what's yeah that's a good point yeah you
00:25:58.040
know i think it's actually a combination of things there are there are some dependencies and there are
00:26:01.960
also redundancies built into the system so in many ways the system is very robust i mean even you know
00:26:07.720
kaczynski talked about this when he said look we need to revolt against the system we need to bring
00:26:11.960
it down before it destroys us basically and and one of his points was that he's right this system is
00:26:17.400
very robust to certain kinds of attacks right i mean there's backup systems and there's you know
00:26:24.680
multiple modes to do things and i mean just like the internet how the internet works it's massively
00:26:29.400
distributed so if any one link or one portion gets chopped or cut out the the system just routes
00:26:34.600
around it and it finds a different way to happen right so so much of the technological system is
00:26:40.440
massively distributed so that makes it very robust in one sense but it's also an integrated system and
00:26:47.320
it relies on certain things like energy sources and at least until now human inputs and you know certain
00:26:54.440
thermodynamic requirements to keep the thing working so it's this weird interesting combination of of
00:27:00.920
really robust uh backup systems and duplicates and triplicates of of modes of operation versus a kind
00:27:08.440
of integrated possibly uh interconnected dependency on you know a few essential elements like energy or
00:27:17.000
information or you know other sort of yeah thermodynamic criterion so it's it's a really a quite quite
00:27:24.200
interesting complex situation oh definitely um i want to bring up this tweet here from elon musk because he's
00:27:29.560
working on the optimus bots he's working on the tesla cars self-driving cars he's obviously on the on
00:27:35.160
the cutting edge of of this type of technology that we're talking about uh with many other people many
00:27:40.120
other companies from palantir to google and you know the energy thing that you mentioned is very
00:27:44.600
interesting just the consumption of energy to power all of this now even the fresh water i read about
00:27:50.120
in terms of keeping these you know data centers running with all their ai stuff now but you know very very
00:27:56.120
very it's just crazy when you think about it we're we're asked to conserve you got to change out your
00:28:00.600
light bulb kind of thing but these companies are just like you know we can roam free essentially but
00:28:05.480
he said here as i mentioned several years ago it increasingly appears that humanity is a biological
00:28:11.960
bootloader for digital super intelligence kind of again giving us this idea that we kind of touched on
00:28:17.800
earlier that we're just kind of uh we're here for a part of the process but but what happens next and
00:28:23.480
again i mean i i kind of keep up with a lot of this stuff and where ai technology is and i know
00:28:28.440
there's probably r d development programs of stuff we haven't even heard about yet that they're involved
00:28:33.320
in or doing or things like this it's kind of like an arms race when you think about it right ai is
00:28:38.840
potentially the the greatest weapon ever created it's bigger than the nuclear bomb it's it's i mean these
00:28:45.240
are you know manhattan project style development programs times 10 times 100 we don't even know yet
00:28:50.840
actually depending on how expansive it gets and how dependent we get on it and civilization itself
00:28:56.840
essentially um but what what do you view what's your view on this that is like are we do you agree
00:29:02.680
with this like he's almost suggesting we're we're just here to kind of hand over to digital super
00:29:07.400
intelligence and at that point we become obsolete essentially that's what i'm reading here yeah yeah
00:29:12.760
exactly that's that's actually a very old idea i'm sure musk has almost really probably no idea what's
00:29:18.360
going on but the idea that the machines are here to surpass us in evolution goes back at least to
00:29:23.320
samuel butler around 1865 and he was observing how just steam powered machines like steam
00:29:32.200
diggers you know digging machines and steam engines uh seem to be evolving faster than biological life and
00:29:39.240
in in 1865 he speculated he said that these machines seem to be the next form of life beyond biological
00:29:45.880
life it's kind of a mechanical life and he also said if we don't like that fact we better destroy
00:29:51.000
those machines now before they take over and this was in 1865 i mean it's really a shocking statement
00:29:57.480
right and that this is one of the entries in my confronting technology book you can read the essay
00:30:01.400
there um so so it's really a 150 year old idea at least uh that the machines are here
00:30:08.520
they're going to surpass us and we're building the next level of evolution so musk you know that's just
00:30:13.800
the latest manifestation of an old idea um and it's it's consistent with what i was arguing about
00:30:20.200
technology being kind of a universal process right it's sort of moving through complexity
00:30:25.160
it got it got to you know simple life it got to complex life it got to human life and now it's moving
00:30:29.960
on to machine life or computerized life if you will ai sort of life ai intelligence and yeah maybe this
00:30:37.880
is just an evolutionary process and and and we don't know where that's going that's what that's
00:30:42.760
what musk is suggesting that ai is really part of a whole universal process and we are just a phase
00:30:49.880
uh in that process now whether that's whether we are only a phase if that's our only purpose
00:30:56.440
right uh or whether that's a catastrophic purpose i guess arguably could be a good purpose we don't really
00:31:02.680
know yet it's just it's just too it's too early in the process at least or or at least we're not
00:31:08.040
really able to understand where it's going i think probably we're our minds are probably a little bit
00:31:12.440
too simple too limited to really understand this whole process of where it's going generally speaking
00:31:18.120
in evolution when you get more complex forms of life they tend to coexist with the simpler forms of
00:31:24.280
life right so we we didn't eliminate all the creatures that are simpler than us we eliminated some of
00:31:30.520
them unfortunately uh but a lot of similar simpler creatures are still there we're sort of coexisting
00:31:36.040
with them you know and some people if they want to be optimistic to say well look we're just going
00:31:39.960
to coexist with these ai intelligences um and you know maybe we'll use them or maybe they'll provide
00:31:46.680
us with some benefits or they'll do their own sort of thing or you know i guess i suppose you can
00:31:54.280
sort of be uh you know you can be naively optimistic about this and say well look look you know all these
00:31:58.680
kind of great things that'll be coming and just i mean just read ray kurzweil he's he's the prototype
00:32:03.960
of the naive optimist of all the wonderful things that are going to come from ai um but but of course
00:32:10.280
it's so powerful and the potential is so far beyond what we can understand that that any prudent person
00:32:16.200
is going to say look the risks you know and the dangers far outweigh any potential benefits or you know
00:32:23.080
the entertainment or the health benefits or whatever things um i mean if we lose control of
00:32:28.120
the system uh and these ais are sort of really beyond our control we you have no reason to think
00:32:34.200
that they're going to be working in our interest they will if we're if we're lucky they will care less
00:32:40.120
about us if we're not lucky they'll decide we're competing with them for resources and then they're
00:32:44.920
going to go squish you know and then there goes the human race because it's just too much competition for
00:32:49.000
for an ai system so there are there are just you know so many scenarios of disasters that no thinking
00:32:55.320
person with a conscience would willingly go ahead with this um you know certainly not at the
00:33:02.280
high speed rapid rate that we're doing today right so you have to ask yourself what is driving the
00:33:07.320
process why are we actually going ahead in such an acknowledged dangerous way with such dangerous new
00:33:14.280
technologies and we're just barreling ahead with it full speed like like we can't help ourselves
00:33:20.440
like we can't stop it i know that's really that's really the frightening sort of thing to me yeah
00:33:25.000
has there ever been any technology we we did not proceed with because they're like ah this could
00:33:29.400
be i don't think there is i don't think there's a single example i can think of unless it's some
00:33:33.640
technology that like oh tesla was onto something in terms of you know wireless
00:33:38.360
electricity or free something whatever but they're like you kind of you know put that to the side
00:33:42.360
but that was just you know because they couldn't monetize it essentially in the same way but yeah
00:33:46.040
right i mean you know there's certain things that we've sort of you know put a lid on like chemical
00:33:49.880
warfare you know arguably or maybe nuclear weapons yeah it's a little bit arguably we've sort of tried
00:33:56.040
to put a lid on this kind of stuff so that it doesn't get out of hand it doesn't really get used
00:34:00.600
yeah you know some people say well look we'll just treat ai like you know chemical weapons or
00:34:05.000
something we'll try to you know just sort of you know have it but we won't let it get away from us but
00:34:10.840
i mean it's such a different sort of thing it's such a different kind of creature i mean almost
00:34:15.080
almost literally uh that you know whatever the ways that in the past that we would constrain or monitor
00:34:23.080
the the technologies it just seems like those are just absolutely destined to fail with with ai
00:34:29.400
so you know to me the only the only rational option is just to stop doing it just just to stop it to ban it
00:34:36.280
to block it if that's even possible if it's not possible then we've already lost the argument if
00:34:43.960
it's not possible to stop it then we've already surrendered to what we would call technological
00:34:48.520
determinism which says technology drives itself it's already beyond our control we already cannot
00:34:54.680
stop it so let's at least be honest right i mean people are like oh no we control technology we
00:35:01.000
design technology it does what we want it's here for for our purposes that's baloney if you can't
00:35:06.120
stop ai if you can't not do ai then you've already lost control you've already surrendered the point
00:35:12.440
it's already beyond your ability to do anything about it and and then only the most radical actions
00:35:18.040
will be called for or or you know run for the hills because it's going to get real bad real fast yes
00:35:23.480
i mean there's just there's really you know not many not many positive scenarios that we're looking at
00:35:27.720
here i know um there was a paper out here uh ai 2027 which was interesting and they talked a little
00:35:33.720
bit about this in terms of the two i mean again there's so many variables so even a study like
00:35:38.200
this is kind of like yeah okay sure it might turn out that way but it might be a totally
00:35:42.760
different you know avenue that this goes down but regardless just that what drives it right you you
00:35:47.160
talked about that and that's an interesting aspect because well it's kind of like this if we don't
00:35:51.800
do it the other bad guys will be that you know china or you know i don't know russia or some other
00:35:56.280
country or something like some other superpower so therefore the guard rails we put in place in
00:36:01.560
our system now we're hampering it right now we just have we have to proceed because if we don't get
00:36:07.080
this weapon they will and the other guys think the same way essentially and that the paper kind of you
00:36:12.120
know talks about this a little bit of basically the acceleration of this and there's other elements
00:36:16.600
we've into this where basically well now also the ai is undermining their human programmers and coders
00:36:22.440
and it's evolving and improving and rewriting itself so fast we can't even human humanly possibly
00:36:27.480
keep up with it and then they create other ai systems to monitor that ai and it's just this
00:36:31.960
cluster it's it's a gordian knot of just like it's impossible right just it's just right it's insanity
00:36:37.560
right yeah but but you're right i mean when you look at the even even the advocates of of advanced
00:36:43.480
tech and ai you know and i and i mentioned this in the book and i would always talk about this when i
00:36:47.800
was teaching my course it's like you you see this common little phrase that shows up a lot in these
00:36:52.600
guys and the phrase is we have no choice yes yeah so even guys like kurzweil will say you know
00:36:59.160
we might not like it but you know we have no choice yeah because if we don't do it the bad guys will or
00:37:04.280
the other guys or the competitors will do it or some some rogue guys you know some you know lab in borneo
00:37:09.560
will do it you know oh yeah so you know i mean it's they go over and over again even if we don't
00:37:15.000
like it even if it's dangerous even if it might kill us you know what we have no choice and they'll
00:37:20.040
say that over and over again and that and that that to me that's a total surrender that means
00:37:24.840
you've completely lost the argument it's over technological determinism is proven the system is
00:37:32.120
beyond your control because you have no choice you can't do anything about it and and we either better
00:37:37.880
you know do the ted kaczynski thing you know take the dynamite and go and just blow it up you know
00:37:42.520
real quick or or it's gonna like i say it's gonna get catastrophically bad sooner than we realize so
00:37:49.160
i know yeah there there's so many of these companies now working on this it's maybe partially a bubble
00:37:55.160
and inflated it's a little kind of maybe there's some aspect of you know kind of how they view nuclear
00:38:00.040
you know power in the you know if you go back to the 50s or something like your car will be powered by
00:38:04.760
you know nuclear fusion or something you know or you know nuclear power uh but but but this
00:38:10.760
this feels different in a way to me because it is also just it's actively rolled out we can watch the
00:38:16.200
progress be that generative ai being the tools that are now becoming widely available to the public
00:38:21.880
your personal assistant it does this and that for you it writes your emails for you and weaved into
00:38:26.760
all of this you have that issue of dependency as well that almost like we're i don't know if that's
00:38:31.480
the right term it's sleep walking into the post-human transhuman world essentially where we
00:38:36.920
maybe not directly right away embedded the technology in us or a computer brain you know
00:38:41.720
interface or something but just the due to the fact that we're becoming so dependent on it
00:38:47.080
makes us a slave to it right well exactly right you know you you sort of we right almost literally
00:38:53.720
cannot live without it right i mean that's that's part of this you have no choice sort of story right
00:38:58.120
we can't live without these things uh you know one one of the little exercises i used to ask my
00:39:03.320
students to to do uh was to take one day and not use a computer not use their cell phone not use email
00:39:11.880
for one day you know just just to see if they could do it and and you know they had a really hard time
00:39:19.960
just getting through one day of not using a cell phone or email or texting i mean as they were like
00:39:26.520
so i mean even the ones who didn't like it they're like yeah i hate it i have to use my phone all the
00:39:30.120
time and they couldn't they couldn't do it they couldn't get through a whole day
00:39:33.720
right with without just these most advanced technologies and using these things so um so yeah
00:39:39.720
you're you're really in this i like like your video clips i mean it's like unbelievable where this
00:39:45.960
is heading right i know you've got you've got these robots you've got you know i mean warfare is
00:39:50.920
horrible right because you're developing killer technology yes right they're working on what the
00:39:56.360
the the ukrainians is that the ukrainians or is it really us who are working on these autonomous killer
00:40:01.320
drones where you just release masses of drones and you give them vague directions of what they're
00:40:07.000
supposed to do and they go flying around and they're thinking and they're
00:40:09.880
finding targets and they're killing killing people just like autonomously so so you're you're
00:40:15.080
specifically developing killer deadly technologies to operate on their own because it's really neat during
00:40:20.680
warfare to send those you know swarms of these drones out just random and looking for dangerous people
00:40:25.560
and just start killing them uh but that's you know okay now you're giving a you know a major weapon
00:40:30.920
putting in the hands of an ai system that that uh you know god knows how that could be used in in a
00:40:36.360
million wrong ways so it's well and even even the robot that's there to improve your life or help you
00:40:43.160
like the dishwasher was at one point right and and and i want to pick up on that but let me just mention
00:40:48.040
one thing i was thinking about too right because it was like this will save you time right this will now you
00:40:53.400
can do other things well the thing is everyone else can also do that now so how much do you
00:40:57.560
because it's all about competition as well right like if you outsource more of your things to robots
00:41:01.480
and machines that can do it for you now you can either then do leisure things or you can study
00:41:05.560
or you can learn something else or work on something else and produce other things but if
00:41:09.480
the adoption of it is the same kind of across the spectrum then now everyone is just matching
00:41:14.840
your ability to do the same thing and then it doesn't really matter right and then instead
00:41:18.920
now the machines break down right yeah yeah it's you know this this goes back 100 years about how
00:41:24.280
technology was supposed to be time saving labor saving you know it's gonna it was gonna free up the
00:41:29.400
workforce and we're gonna have free time we'll be able to do hobbies and fun stuff and sports and spend
00:41:34.200
time with the kids and instead the complete opposite has happened it's completely sucked up everybody's
00:41:39.880
lives you're working faster harder more than you used to you're working two three jobs you know
00:41:45.720
you're scattered jobs because you're doing you know remote work and you got you know two or three
00:41:49.960
you know uh you know online uh you know employers that are paying you a little bit of money for gig
00:41:56.520
jobs and so forth i mean it's it's done and i've really emphasized this in my books it really does the
00:42:03.640
opposite of what it promises it promises to save time to make you healthier and happier and it's doing
00:42:09.000
the opposite it's sucking up your time it's making you sicker it's making you poor it's got it's doing all
00:42:14.360
the negative the opposite of what it promised us and that's that in itself is a very strong sign
00:42:19.960
that this this is a deceptive system that's at work it's doing the opposite of what it's promising
00:42:25.160
it's counter to human interest it's counter to natural interest because it's destroying the
00:42:29.560
environment and it's getting stronger all the time so where do you think that's going to go
00:42:33.720
is it going to suddenly switch gears and suddenly be good for humans and suddenly be good for nature
00:42:38.040
that's baloney that's absolutely impossible it's only going to get worse for people
00:42:41.800
worse for nature until something catastrophic happens that's the only way it can go there's
00:42:45.960
no other option yeah so back to the point there with the uh let's say you're basically a glorified
00:42:51.720
dishwasher right you have a robot at home now musk talked about this they want to scale production to
00:42:57.000
such an extent that these optimus bots are like they churn out what a few hundred million tens of
00:43:02.520
millions a month or something crazy like that and you wanted to set you know you can buy one for like
00:43:07.080
maybe ten to twenty thousand dollars if we get the scaling right kind of thing right assume you have
00:43:11.480
them in your home or something uh and sure it's just a dishwasher but the reality is every one of
00:43:16.120
these tools is a potential weapon in and of itself even if it wasn't designed to be a weapon it's going
00:43:21.960
to be stronger than you it can find ways to because again right it's centrally connected to one global
00:43:28.200
computer or one global brain as they call it right to one brain and what one of them learns all of them
00:43:34.040
learn right so they learn from exactly it's not it's not an independent thing exactly it's tied
00:43:39.160
into a very large system it fosters dependencies right because it's not just
00:43:44.520
going to do your laundry or your dirty dishes it's going to take care of you it's going to support
00:43:49.000
you it's going to take care of old people it's going to take care of sick people and then we'll
00:43:52.200
say well look we have to have this because otherwise look people will die if we don't have
00:43:56.360
this and then it becomes a necessity at first it was just a kind of a cute little luxury to have all
00:44:01.000
this thing's doing a little bit of household chores for me now suddenly i gotta have one i can't do
00:44:05.160
without one and look it's interconnected to other systems and i don't know what it's what it's doing
00:44:09.320
what it's you know what's driving this thing and it can it can it can take alternate action you know
00:44:14.280
at a moment's notice so yeah again just a myriad of problems that we're facing with those kind of
00:44:19.640
kind of devices yes exactly i mean tesla cars they argued is essentially a tesla optimus bot
00:44:25.960
but it's on wheels right it's learned and studied the environment it's got sensors and cameras
00:44:31.560
even when it drives along the road it's actually studying humans and how they walk and every every
00:44:36.680
data point right you go back to this idea of like why was you know facebook and these big data big tech
00:44:41.560
companies always you know came for free it was not just to sell you ads or like you know tailored
00:44:47.080
ads or something it's just they're studying us right and and each one of us now have a representative
00:44:53.480
in some of their digital supercomputers they're running simulations and all that stuff because
00:44:57.480
if you're creating this kind of omniscient omnipresent being essentially uh that knows
00:45:03.640
everything that sees everything there's cameras and sensors and haptic devices everywhere you're
00:45:08.920
basically creating this god essentially right that's what that's what this is right well that's right
00:45:15.240
i mean you know you're reading a lot these days about what palantir is doing in conjunction with
00:45:20.200
the u.s government right it's some kind of comprehensive mass data collection monitoring
00:45:26.200
processing system and they pitch it as well we're going to use it to look out for the criminals or
00:45:31.400
the illegal immigrants or whatever right they'll they'll throw some nominal beneficial uh purpose out
00:45:36.680
there um but but in the end i mean it's collecting data and everybody it's monitoring everybody it's
00:45:41.800
watching everybody it's making uh drawing conclusions about everyone and and who knows what's going to
00:45:46.440
happen with all that data so you really have to worry about you know what's happening with all
00:45:50.520
this what how the system is working who's running the system who are these people who actually are
00:45:54.600
doing all this stuff i mean you know what's what's their motive so you know what uh that that worries
00:45:59.320
me as well right oh for sure and i always said that too like they officially even that ai 2027 report
00:46:05.240
i showed they're like kind of oh basically us versus china and the ai development you know arms race or
00:46:10.680
whatever uh for for ai to get control of it or do the best one i guess and it's like well no there's
00:46:16.200
multiple variables behind that who's programming it who has the back doors who can hack it who can
00:46:22.360
uh alter it or change it even unbeknownst to the people that might be programming it on different
00:46:27.640
um set of pretenses than those who actually have paid for the project even or something i mean there's
00:46:32.840
because you still have human nature of um not being removed from this where you do have ethnic
00:46:38.040
interest you have other you know interest groups you have certain uh you know the global globalist
00:46:43.720
interest industrialist interest you have okay religious interests right all of these things are
00:46:47.960
still true as this is being developed and in fact so far and i want to ask you more about this in in
00:46:52.200
part two when we continue uh later here but all of a sudden you have a completely different variable
00:46:58.600
that i don't even think studies like this go into or take into account at all that there are they have
00:47:03.560
this like good faith approach to it of like no this is just u.s. interests you know kind
00:47:09.160
of thing it's like well is it right i don't i don't think it is right no i think i think there's a
00:47:16.040
lot a lot of dual loyalties out there and these guys you know we've got people working multiple
00:47:20.440
interests and multiple multiple forces going on and yeah there's a lot of a lot of room for
00:47:25.320
nefarious action out there there's no doubt about it some of the largest corporations right open ai
00:47:29.960
sam altman we have google love care one of them i mean they don't it was eric schmidt more recently
00:47:34.520
but still i mean it was larry page sergey brin you have oracle larry ellison they had their
00:47:38.520
project stargate thing that they're talking about doing that trump announced with them palantir alex
00:47:43.640
carp joe lonsdale i mean are they are they aligned with our values these people it doesn't look like it
00:47:49.080
to me yeah they have their own interests yes it's remarkable how many you know israelis i mean jews are
00:47:55.640
running these institutions i mean you know what those guys have a lot of connections to israeli
00:48:01.400
intelligence and the mossad and you know whose interest are they looking out for i mean you get a
00:48:07.320
lot of information that's pumped israel about how you know how the us operates and how the world
00:48:13.720
operates and that's coming from from these guys who are running our major tech companies i mean that's
00:48:18.360
not that's that should not be a surprise to anyone no definitely not yes we'll get into this more here
00:48:22.680
i did uh i guess kind of points questions here for you uh archie says breaking the spell of jewish
00:48:28.120
power seems to fulfill jewish eschatologies we are uh damned if the spell is broken or not
00:48:35.560
can you describe a metaphysical relationship between technology and the jewish spell he's asking you
00:48:43.800
yeah so it's quite a bit to chew on that's a good question um yeah i mean so if if you go back
00:48:56.280
yeah one of my areas of study is the is the bible so you go back to the old testament sort of the roots
00:49:00.680
of judaism and you see very very clear trends in there related to um kind of a hatred and distrust of
00:49:11.400
non-jews and you see a quest to dominating control control i mean if not to control the whole world
00:49:19.080
i mean this is what what you read in the old testament god gave the world to the jews and it's
00:49:23.080
theirs to own and to have dominion over it's not it's not to everybody else it's not to humans in
00:49:28.120
general it's just to the jews so so from the very beginnings this jewish mindset is about control
00:49:34.200
dominate and suppress the the non-jewish populations whatever those may be so uh it's it's kind of
00:49:42.360
striking how advanced tech and ai in particular is is a means to to do that so it so it fits i think
00:49:49.320
very well into this jewish eschatology this kind of you know ends oriented outlook where they're sort of
00:49:55.880
using it to control and suppress and to dominate other people arguably the whole world in the end right
00:50:01.880
so um yeah that's another very troubling instance of just the human maliciousness
00:50:09.000
that's that's you know at least one component of this problem that's completely apart from all the
00:50:14.600
technological problems that the technology itself manifests yeah so you have the human aspect which
00:50:20.440
is troubling and then you have the technological aspect which is troubling so it's again it's a multiple
00:50:25.320
multiple problems that are going on here in parallel yeah and i feel there's very few that even
00:50:29.640
acknowledge this as an issue where it's a problem you know i mean because they're just totally
00:50:34.040
oblivious to it or they just like well what does that matter you know because well it does
00:50:38.280
every all of these variables matter especially when you're talking about this being used as weapons
00:50:43.320
uh autonomous warfare and you know who programs them what's their intention or whatnot but yeah there's
00:50:48.600
no sign that there's any kind of guardrails i think was it trump that tried to put in a
00:50:53.400
rule that no state could ban any type of ai or put any guardrails i forget the wording of this but it was
00:51:00.440
something to that effect that no state can step in and say you can't do this when it comes to ai
00:51:05.400
development that was initially part of the one big beautiful bill thing uh but apparently
00:51:10.360
was scrapped unless someone you know was misinformed about that uh which i guess is good but um
00:51:15.640
um but that doesn't matter still you could still have people doing this on their own as far as i
00:51:22.040
know uh or maybe not on their own but like that are running the ai developments that most people are
00:51:28.440
not even aware of they're kind of off the books you know i mean like yeah exactly hidden sites or black
00:51:34.040
sites exactly so yeah there's a lot of things that can go on that you really can't can't monitor
00:51:39.320
and can't know about so uh yeah i'm not surprised that trump who you know tends to work on behalf of
00:51:45.320
jewish and israeli interests i mean he wants to keep this this ball rolling because they think
00:51:49.320
they're going to use it to their advantage and it may well destroy them and the rest of us in the
00:51:53.080
process so it's another sort of a nightmare scenario that uh you know this this this lust for power
00:52:00.440
and control which ai seems to offer could be sort of the ultimate frankenstein story which you know
00:52:06.520
turns on the uh on the maker and and destroys us destroys us all potentially yeah like a golem
00:52:12.280
in jewish mythology exactly uh rt says uh is mind an emergent property of matter or is matter an
00:52:20.760
emergent property of mind what do you think of isaac asimov's uh story the last question
00:52:30.040
yeah that's a good question i don't i don't know the asimov story so i i can't comment on that but but
00:52:34.920
uh yeah i spend a i spend a fair amount of time part of my work is in philosophy of mind
00:52:40.840
and uh there's a lot of problems the idea that mind is an emergent thing uh a lot of people just
00:52:46.680
assume that mind emerges at a level of biological complexity but that seems to be very hard to defend
00:52:51.480
philosophically um and this again is consistent with my pantechnikon story which says there's a
00:52:57.720
kind of logic or intelligence uh or consciousness maybe that's built into reality into sort of the
00:53:04.200
universe itself um and that that's consistent with this idea that i've also researched on panpsychism
00:53:11.080
which says you know this kind of psyche or mind is is part of part of nature it's built into the
00:53:16.280
structure of nature it's one of the one of the essential qualities of nature and that would so so
00:53:21.720
there's the revised edition of the book um so yeah it's it's that's a very old idea this is this book
00:53:27.480
is basically a historical study showing how this idea goes back to the the even the the pre-socratic
00:53:33.160
days uh in philosophy but um but yeah mind intelligence i think is kind of a it's an essential
00:53:39.960
given to reality so it's there it's it's the logos of the universe is a kind of a mental thing it's a kind
00:53:46.360
of an intelligence in the universe and it's there in the universe as a whole and an individual
00:53:51.640
things individual components whether it's matter simple forms of life complex forms of life or
00:53:56.920
systems um so so yeah mind is not an emergent quality because that's almost impossible to defend
00:54:04.520
philosophically it's built into the structure of reality and that's part of what we're seeing with
00:54:09.400
the evolution of ai it's it's really this kind of intelligence is really sort of coming to the fore
00:54:14.680
that's built into the way matter and energy works and you're really kind of seeing it manifest
00:54:20.120
itself here in a concrete form yeah i'll ask you more about it's like is it inevitable kind of
00:54:24.920
thing we'll talk about that later but uh one more question here for you and then we're going to uh
00:54:28.520
plug some of your websites some of your books we'll mention all of them and then uh take a short
00:54:32.200
break there but uh thoughts uh azulem eight is that yeah azulem eight says thoughts on rupert
00:54:37.800
sheldrake he debunks a ton of dogmatism in science do you know of rupert sheldrake's work
00:54:44.040
a little a little bit yeah he he's a he's a kind of a panpsychist so i mean i think he is sort of on
00:54:50.440
the right trail he's kind of a renegade uh you know fringe thinker in some ways uh he talks about
00:54:58.440
some kind of spooky no not the spooky effect what was the the uh what was it again here i had rupert
00:55:03.480
sheldrake on the show many many years ago actually uh yeah i forget that title he had but he had all
00:55:09.880
kinds of things over like well how do you how does your dog know like when you're coming home like
00:55:13.640
before you're actually home your dog knows that you're coming home like it's this extended mind
00:55:19.080
concept holistic uh you know kind of a vision of reality so um i think it's interesting stuff i i don't
00:55:26.200
personally find a lot that i find compelling in in sheldrake but i think he i think his intuition is
00:55:31.640
right that there are some of these uh these connections are sort of built into the structure of
00:55:35.400
reality i think that's that's probably a correct statement to make i want to find the the actual
00:55:42.120
term he used and i'm beating myself up for not remembering it morphic resonance that's right morphic resonance
00:55:48.120
that's the theory right kind of thing we talked about that yeah it's interesting i mean you could
00:55:52.600
one of the arguments for that i don't want to get too detailed on this but it was this idea of
00:55:55.960
when we actually uh genetically modify something again another type of technology we didn't talk about
00:56:00.760
yet our food supply you know improving food and things like this um that you have to kind of keep
00:56:06.360
doing it or it will just revert back to the kind of basic form that it was uh apparently
00:56:12.200
now they are breaking that by applying different methods apparently where coming generations now uh
00:56:18.040
preserve the genetic changes that you actually end up making so i'm not sure if that disproves
00:56:22.600
morphic resonance or not but regardless maybe you've altered the morphic resonance then or something but
00:56:26.280
anyway different topic yeah it's a little bit of a platonic form thing right there's
00:56:30.680
these forms out there in the in some metaphysical realm and i mean it's it's a very platonic sort
00:56:35.640
of thing i mean it's kind of interesting idea but yeah i'm not sure it gives a lot of insight
00:56:40.120
into what's going on today at least not to my knowledge yep good to see adam and uh chat as
00:56:44.760
well adam green know more news good to see i think you were on with him a while ago right know more news
00:56:48.840
adam yeah and i'll be talking to adam again pretty soon okay cool cool excellent all right so yes let's
00:56:54.600
plug some of your stuff here obviously your website davidskrbina.com and that's s-k-r-b-i-n-a
00:57:00.760
davidskrbina.com people can find links uh are the links to books on the website maybe they're not
00:57:06.600
actually i clicked in there if you click in on each one i think they are maybe yeah i don't know they
00:57:11.560
should they should be there somewhere yeah yeah they're there okay they are there so yeah we got
00:57:15.800
confronting technology obviously the kind of main main thing we're talking about now the ecotheology is
00:57:20.360
very interesting too something i want to ask you more about maybe later if we do have time for that
00:57:24.120
um the participatory mind panpsychism um metaphysics of technology you mentioned that you wrote a book
00:57:31.960
on the jesus hoax uh the calling son of god son of the sun a lot of titles here for people to dig into
00:57:38.920
where should they begin if they're new to your work do you think i mean i guess speaking from the
00:57:41.640
technology angle then where should they start yeah just yeah um confronting technology is is a reader it's
00:57:47.640
a it's a it's an anthology of critiques of technology kind of historically speaking so it
00:57:53.000
goes back to to the greeks and then it kind of moves through history as a critical reading um
00:57:59.960
yeah critical reading anthology for for views on technology so that's that's actually the book that
00:58:04.600
i use for the textbook when i taught the course so it doesn't have any analysis it's not my own words
00:58:10.680
it's just a collection of readings but it's interesting readings and it's relatively straightforward so
00:58:14.760
that's i would say that's a great book to start with uh metaphysics of technology is a little bit
00:58:19.480
of a higher level uh discussion but i try to walk people through step by step the arguments so uh
00:58:25.720
either of those two works i think most readers would find quite interesting
00:58:31.080
all right very good david stay with us here then we're gonna take a short break here in just a
00:58:34.760
minute or two i'm gonna do a couple of plugs here uh where people can go if you guys want to go
00:58:39.000
sign up and uh watch part two it's a great way to support us as well remember we are ad free we don't
00:58:44.040
interrupt our broadcast with annoying vpn commercials and gold and crypto tokens and
00:58:48.200
everything else you can head over to redicemembers.com or you can go to our locals page
00:58:51.880
redicetv.locals.com or subscribestar.com slash red ice helps us tremendously so thank you guys we
00:58:57.880
appreciate that also we want to give a shout out to our special supporters here before we wrap up part
00:59:02.440
one we got our executive producers arctic wolf albert thank you so much we appreciate your support
00:59:08.440
also thank you to william foxx from america first books we appreciate you we got angry white
00:59:13.400
soccer mom thank you as well for your support we got purple haze as one of our executive producers
00:59:18.920
we appreciate you we can't do this without you guys so you're vital thank you glenn as well thank
00:59:23.560
you for your support much appreciated red red pill rundown check out his odyssey channel down below
00:59:27.960
there if you want thank you for your support as well we got president ubunga keep pounding that chest
00:59:32.680
man executive producer teutonic werebear thank you we appreciate you as well okay we got two more
00:59:39.000
here good luck lap thank you for your support as well and last out the gates for executive producers
00:59:43.720
no one jeebs thank you man we appreciate it then we have our producers charles turner jr you once on
00:59:49.320
leroy demand eyes open single action army lord hp lovecraft trevor der schwabe shane b alcion the boo
00:59:56.280
man aurelian and perfect brute thank you ladies and gentlemen we appreciate your support very very much if
01:00:01.640
you want to get your hands on one of those you can do it at redicemembers.com or
01:00:05.960
subscribestar.com slash red ice so uh we'll take a quick little break here david thank you so much again
01:00:11.480
for joining us here for part one and we'll be uh right back after a short break give us five or so
01:00:16.280
and we'll boot up the second part thank you david see you on this on the other side ladies and gentlemen
01:00:21.560
thank you for watching go to redicemembers.com and sign up for our exclusive members content don't
01:00:33.880
miss our latest shows interviews and other videos only for subscribers you can also become a member by
01:00:39.880
signing up at subscribestar.com forward slash red ice get full access and help support our work see you on
01:01:05.640
new red ice merch available now both first t-shirts for adults and for toddlers fatigues for men
01:01:13.480
our favorite the red ice camper mug or ceramic with black print high quality leather keychain with
01:01:20.520
solar boat imprint our red ice hat one of our best sellers pick one up today or why not gray oslander
01:01:30.840
rouse t-shirts for both women and men we also have fridge magnets bulk burst one disease and our black men's
01:01:40.120
red ice t-shirt with the classic red solar boat
01:01:46.680
get your red ice merch from lanasllama.com get an item today lana's llama proud sponsor of red ice