In this episode of the Red Ice TV podcast, Henrik and I have a chat about artificial intelligence (AI) and what it means for the future of the world, and the impact it can have on our daily lives.
00:04:22.200which i think um we haven't we haven't seen the likes of this a technology that's basically primed
00:04:29.000to replace every aspect of human labor and uh well it's up to this point uh purpose uh that's right
00:04:35.560uh so far you know how you doing eric thank you for coming on i'm doing great thank you very much
00:04:39.720for having me and i i completely agree yeah for the record i am not an ai expert i don't work with
00:04:44.600one of the big ai companies that's pushing all this tech forward but i i do work in tech and uh
00:04:50.680that's really positioned me well to see how these technologies are changing what's you know
00:04:55.320interesting to clients in the market how are people using these things what's got traction
00:04:59.400in the business world so i'm more of a more of a close-up spectator i would say yeah exactly uh
00:05:04.360i mean again that's kind of like you know my job part of doing this is as i've said many times
00:05:08.840before but threat assessment looking at the whole map what are we what's the dangers that we face
00:05:13.960both in terms of our people our folk but also in terms of my my children you know everything right
00:05:18.040I'm just like always been a very astute observer.
00:05:21.220I've always appreciated and kind of liked technology to a certain extent.
00:05:25.800I think I understand the dependency, obviously, that we're developing on it,
00:05:30.980but at the same time, the potential that it has.
00:05:33.140But I do have to say off the outset here, this is a different technology.
00:05:36.300This is not like we can understand where like maybe where technology makes us weaker, right?
00:05:42.280Like let's take a couple of examples, right?
00:05:44.340You could go back to, let's say you go back to the, what is it, the Paleolithic times or something, or even before that, like before we have fire, right?
00:05:51.480It's like, okay, fire is a technology, but did it weaken us?
00:05:58.080We understood how to harness fire, how to make sure that we have it in our procession, how to start it eventually, things like this, right?
00:06:56.260And now I think we're coming into that,
00:06:58.420the type of technology now is where it's replacing our cognitive ability.
00:07:02.440And the question is, what happens when we stop using our brain?
00:07:06.600I think that's a completely different technology altogether, wouldn't you agree?
00:07:11.120Absolutely, I agree. And the thing is, to me, it's kind of about surface area. So those previous changes in the way that the economy worked, the industrial revolution, machines coming online, what that allowed for was the retreat from those who used to have to engage in physical labor to either other physical labor, or more importantly, especially in the last 100 years, to our last, let's say 75 years, to the realm of what we often just call knowledge work, right?
00:07:39.120So it allowed for more and more brains to be put to work in a way that was meaningful.
00:07:44.680But now AI is coming in and is kind of threatening that final frontier.
00:07:49.140And at present, we don't have another one to retreat to.
00:07:51.800If we lose knowledge work, it's a massive portion of the world's population who's currently
00:07:56.800providing for their families and having a way of engaging with the economy meaningfully,
00:08:02.640as well as, unfortunately, we've also got robotics spinning up at the same time and
00:08:06.880rushing to meet this not to mention the fact that ai itself will help in the development of those
00:08:12.320robotics to the point of practicality so it's it's not it's we're kind of rushing to terminator all
00:08:16.880at once but more like terminators in the fields or in the office or you know swabbing the decks
00:08:21.000i think the chances of like n ai or the ai whatever you want to call it there's many many
00:08:28.180different ones competing ones which one will be the thing maybe maybe it will be a multitude
00:08:32.220obviously but still yeah the chances of it outright just you know building like death bots
00:08:38.180or something to come get us it's very it's very small obviously i i if it's if it's sufficiently
00:08:44.180intelligent it would fight us if it sees us as a nuisance or a problem or like maybe kind of similar
00:08:50.240how we view like ants when we're building a structure or something like we're not going to
00:08:54.520take the time to even move the ant call like why even bother right but even if it did that even if
00:08:59.820it saw it as a nuisance or yeah a resource competitor essentially um it would fight us
00:09:05.960in a type of war where we don't even understand that it is a war i would assume right yeah i mean
00:09:11.780we could easily get to that point it's kind of humorous but i end up thinking about kind of the
00:09:16.860late doom possibilities far less only because i feel like you know you titled this about like the
00:09:21.680black box of ai right i feel like those late stages are the most black box like it's very
00:09:27.560hard to penetrate that far into the future and understand, like, is it one bot? Is it many bots?
00:09:32.500Sure, we could be like ants to it. But I think that the counterpoint to that could be,
00:09:36.080but it might also have the empathy of a god, right? So it might be like us,
00:09:40.060our relationship with ants, but with a much greater consciousness to where it can both,
00:09:43.820you know, rub its tummy, chew gum, and take care of every ant in the way. We don't know.
00:09:47.280So I often find myself more consumed with thinking about the next 5, 10 to 20 years,
00:09:53.000that interim period before we get to whatever sort of super intelligence that is, there's a
00:09:58.140whole lot of runway in which we have to deal with very powerful humans at the helm of an increasingly
00:10:03.620powerful AI that is taking up all of the space upon which we've been able to perform labor to
00:10:08.420retrieve resources and to engage with each other. That's where most of my attention lies.
00:10:13.020Yeah, because we don't know what direction this will go, right? Will it benefit
00:10:15.640the people who are coding it? Some people said they're actually, you're growing AI,
00:16:07.800You get specialization, industrialization.
00:16:10.320All of a sudden, while I sit at an office, but the point is kind of the same around that,
00:16:14.820that is like you need to basically do something to carry your own weight, right?
00:16:19.080You need to, as you said, kind of contribute value to the overall society that we live in.
00:16:23.460But what's interesting about the system then is that humans are basically, within the system, educated and trained to be like machines, right, in a way, you could say, where the system is a machine-like system.
00:16:37.580But the reality of this then is that machines will obviously be better at being machines than humans will be, right?
00:16:45.380Much better. So as the technology itself progresses, to make machines, ironically, also more like us, because we're making them in our likeness, here we go with the kind of spiritual god-like analogies here, but still, but it's just a matter of time before we then basically are replaced by the very technology that we are now building.
00:17:06.860I kind of liken it to, you know, those things that they present with like the natives, I guess, in South America or something in this case.0.84
00:17:14.020The natives are standing on the beach and all of a sudden here's this ship approaching and obviously, you know, contained within it is superior technology and they don't stand a chance.0.99
00:17:23.440And it's very kind of similar here, right?
00:17:25.100We're kind of the natives, but what's strange here is that we are the builders of the very technology that probably will replace us.
00:17:33.000I don't think there's any civilization that fare very, very well when a superior technology showed up.
00:18:29.940So it's not that these technologies don't have that capability, but unfortunately, what we're seeing is not modern technology and AI-generated content creating better people, more connected people, more engaged people, better attention spans, right?
00:18:46.000Like we were already pretty disillusioned with what was happening just with social media
00:23:59.960No, in fact, as I said, even if the people, this is third variable that obviously people that warn about AI or talk about the threats of it or whatever, they don't even take that into account.
00:24:09.240They're talking about like alignment issues and values and all that kind of stuff.
00:24:12.940But what even, but what about the people that are actually building it?
00:24:16.120that we can't even ensure that they will be have our values and that they're hoping it's aligned
00:24:21.340with them. There's plenty of us who aren't super happy with the alignment of the general population
00:24:26.560and the way things are now. So at best, they'll get it aligned to the thing that we are already
00:24:30.780pretty skeptical of. So there's a headline here on the screen, mind captioning AI decodes brain
00:24:36.380activity to turn thoughts into text. Now, of course, it's using, what was it? It was a couple
00:24:42.340of different technologies that they're using different type but the point is this is already
00:24:47.460kind of in the works and i would assume that this will be uh far less uh advanced going forward and
00:24:53.600it will be far less intrusive it won't be like well you have to put this kind of helmet on or
00:24:57.900something or put a you know a brain chip in your skull like no that's uh that's old school i think
00:25:02.980the watches and some of the wearables now that they are talking about at least um are kind of
00:25:10.560sensitive enough to basically pick up on your nervous system. And basically, at some point,
00:25:16.800it's going to be able to decode brain activity. And I don't even know where we go at that point,
00:25:22.080to be honest, because as you said, even now, most people are receptacles, essentially, of
00:25:28.840inputs of other people's wills and commercials and music or movies, whatever. They're not even
00:25:35.280themselves already can you imagine this as a layer in terms of manipulation or kind of knowing what
00:25:41.800it it will know what you want before you even know it yourself that type of predictive ability
00:25:47.560you know that's right and again that's the other side of a feedback loop right it is it is shaping
00:25:51.880your desires it is shaping your chemical reality constantly with the with the content you intake
00:25:57.500and then you're going to have all of these i mean they're i'm sure folks have seen it but you know
00:26:02.120your wi-fi can be used to make a map of your house and see the router yep yeah it's a it's
00:26:07.680a lidar essentially now they're building into the new uh chips yeah precisely and that was like they
00:26:12.240could do that it was like kind of fuzzy like 20 years ago it's really good now right now i mean
00:26:16.740i'm sure you saw recently that now there's going to be kill switches in every vehicle and ford and
00:26:20.760other companies are going to have they're now going to allow law enforcement to tap into the
00:26:24.620the cameras and different things that are inside of the the interior of your car that are looking
00:26:29.140for are you tired are you drunk are your pupils dilated right so like just what it can read on
00:26:33.740the surface i mean there's a lot that can be said about your mindset and your chemical state just
00:26:38.080from your skin temperature pupil dilation micro expressions now you add on to what you've got on
00:26:43.000the screen right now with actually being able to get into our brain i think that a lot of that will
00:26:46.960be happening before we even realize what the technology is that's doing it unfortunately
00:26:50.320yeah exactly um okay so there's a lot to break down there i talked about the financial system
00:26:56.700And I was kind of, I want to convey that picture, I guess, or paint that picture for people of us, basically.
00:28:01.920Because the tokens of the words that it processes are just what we call
00:28:08.900numerical symbolic values based on actual words, right?
00:28:13.960So the words are really kind of at the root of it.
00:28:16.460It takes a word and it breaks it up into different tokens, right, which is numbers.
00:28:21.220It crunches these numbers in these gradient layers, essentially, like in these big data centers.
00:28:26.480And exactly, again, how that worked, the gradient descent.
00:28:29.360And they're talking about dimensions, even like relatability between words, like even where words are, you know, they're spelled the same way, but within the context, they mean different things.
00:32:54.220and replace them with AI, we'll make more money.
00:32:57.120Okay, great. Let's do it. Can it work? And in some cases, you know, it might have varying degree of success and it might not be, I've heard of some people as like, oh, well, some companies did that and it didn't work and now they're rehiring or whatever. Oh, well, trust me, they'll be back. Okay. This is not like over it. They're not going to, you know, they're going to keep refining this and eventually at some point it will click and they got it. Right. But so the point with that is, you know, the UBI talk, universal basic income, it will just produce things for us.
00:33:26.580you know like what do we how does this work right but the point is i've seen some estimates and
00:33:32.740stuff like that right and they say ubi even if it even temporarily would work long term how does
00:33:39.080that even work right like ai itself is then starting to kind of like generate income or
00:33:45.160money or because it's producing things and selling those things well presumably it needs to sell it
00:33:50.520to humans i would assume or you're going to sell it to other ais maybe for other reasons that's
00:33:55.660possible they might have their own economy or ecosystem or whatever but at some point if we
00:34:01.180don't get salaries if we don't have money we're not going to be able to buy all these products
00:34:05.120that these great machines produce for us all the automation and all the ai and stuff like that
00:34:09.080right we will literally stand on the outside looking at this thing that we've built and we've
00:34:15.620designed our way out of it essentially and it's almost like but but you'll have this huge system
00:34:21.440of an economy running it will be running your civilization it will be your judges your police
00:34:26.360enforcement i know i'm going to kind of ahead of myself here but i'm just thinking of the all
00:34:30.260encompassing like the efficiency of society runs much better now on ai right more higher trust
00:34:36.360they will probably try to pitch that to us uh it will be objective when it comes to law enforcement
00:34:41.100it will run your government much more efficiently so i don't see us like kind of turning away from
00:34:46.560it i think we'll just adopt more and more and more of it but who's going to buy all the products if
00:34:52.240there's no salaries to go around who's going to buy the products we will have to be we'll be down
00:34:57.280to like subsistence farming again on the outside of the system where ai itself is just like kind
00:35:03.600of running civilization for its own i i don't know i mean that's just one variable one possible
00:35:08.320outcome but as i'm looking at this like we won't even have a part in this we won't even fit into
00:35:12.880the system i think that's right and i mean without without jumping the shark let me let me kind of
00:35:18.880walk through a couple of the things you said so my my concern is i think the one thing that we can
00:35:24.240all see right is is this ubi piece and i think we've we've been pitched ubi for a very long time
00:35:29.360so we can comprehend this concept of being given a paycheck to simply exist or to to subsist my
00:35:36.560concern with that up front is leverage. So the way that the populace has always been able to have
00:35:43.720leverage against government, leverage against centralized powers is one of basically two
00:35:48.500things, either the use of violence as in like a revolt, or our labor, the ability to withhold it,
00:35:53.900right? That was why the unions were so powerful over 100 years ago, was the ability to withhold
00:35:58.020your labor changed, changed the calculation in the system, right? So those who would have used
00:36:03.100their power against you in the government or otherwise, you had leverage against them.
00:36:07.980As we let go of our labor, we are changing that calculus and we're changing one of the only two
00:36:13.920things that we have by which to negotiate as citizens with government. And that leaves us
00:36:19.320with only violence. And again, unfortunately, we're coming into a time period where there's
00:36:23.100already robot dogs on the streets of Atlanta. They don't have guns strapped to them yet,
00:36:27.400but they are walking around and videotaping things and keeping an eye on things. So there's
00:36:31.360this pincer movement going on where citizens are going to be, they're going to find it much harder
00:36:36.200to use either of the two methods we've always had to maintain balance. Now, as you carry that
00:36:41.420forward, the question does become like, okay, well, if everyone's on UBI, a lot of people go
00:36:46.300like, oh, that's really cool. Because they kind of look at what they look at the benefits some
00:36:49.740folks get out and say, well, I'd probably be fine with that. It's like, yeah, maybe I think you'd
00:36:53.680probably be a lot more bored than you think. But more importantly, there's no more social mobility
00:36:57.600at that point so if you get your ubi and more labor's gone more labor's gone more labor's gone
00:37:02.840and whatever labor is left everyone is trying to pile into that drives the wages for those jobs
00:37:07.660down right because it's raised to the bottom okay well you can be a car mechanic okay but now there's
00:37:11.780like an extra million people who need to be car mechanics or need to be electricians or need to0.99
00:37:16.560be construction workers great so like just like when you let in a whole bunch of immigrants right
00:37:20.600you're letting in digital immigration you're letting in something that replaces that way
00:37:24.000I've always called this replacement 2.0.
00:37:25.980That's what we're looking at here, you know?
00:40:57.720It is about getting to having as much stuff as you can for as cheaply as possible.
00:41:01.060So it's only going to accelerate all of those lines of progression in a way that could easily lead towards the world that I envision is one possibility that these folks have in mind for this technology.
00:41:17.740What do we need all the humans for?1.00
00:41:19.140The World Economic Forum, the little gay Jew, right, who was talking about this.1.00
00:41:22.260And he said, like, yeah, we just give them drugs and computer games.1.00
00:41:27.000Are these people really going to be driven to breed?
00:41:29.060And then you have the issue of compiling or compounding toxicity, unfortunately, partially because of the population explosion, but also plastics, microplastics, food toxicity, glyphosate.
00:42:26.600You know, it's harder to have, you know, five kids in an apartment in a big city somewhere where you're squeezed together, understandably, right?
00:42:37.180But there's actually a depopulation, you know, campaign essentially.
00:42:40.420They can easily, like the depopulation is, because I agree with everything you're saying in terms of the effect of these horrors that we see around us.
00:42:47.640I mean, ironically, they could we've needed the glyphosate, we've needed the chemicals in that we believe that that was it was the way to make food cheap.
00:42:57.480It was the way to bring food to places that couldn't grow their own.
00:43:00.140Right. Nigeria is not the largest growing country on the planet because Nigerians are smart.
00:43:05.400It's the fastest growing country on the planet because we shoved cheap wheat down their throats.1.00
00:43:08.960Right. So it's very easy for if I'm right or even close to right.
00:43:14.040right it's it's actually ironic that the powers that be could easily say cool yeah we'll stop
00:43:19.580using glyphosate we'll stop making cheap energy we'll stop doing all these things that actually
00:43:24.820did allow for the population boom and then again we can either just i mean i i don't think they're
00:43:29.580going to kill us i think that's too messy but we'll just slowly let the population dwindle it
00:43:33.240turns out if we give you guys you know video games and drugs you'll just slowly go away0.82
00:43:37.620and then we shall inherit the earth yeah yeah exactly um i wonder if that is there some silver
00:43:44.180lining with the straight of humors closure i mean i i still can't think you know that they're you
00:43:49.960know this is i mean they could be inept there's ineptitude on a certain level here obviously i'm
00:43:54.540not trying to say that either but it is an interesting kind of concept because because
00:43:58.260at the same time it's also a squeeze right in terms of like then it's dependency we're back in
00:44:02.820that the ball is back in that court again we're all of a sudden like haha now you can't just take
00:44:06.840your car where you want to the push for evs will now increase which takes us into kind of the
00:44:11.480digital prison which is tied into the whole ai thing over like self-driving cars and uh you know
00:44:17.220rationing essentially right here like you your very existence is dependent on all these systems
00:44:22.300and um with a flip of a switch you could have violated the terms of service of whatever service
00:44:28.440you are using for example or you didn't comply as you said or something and basically it'll be
00:44:33.300switched off or limited or chokehold or strangled or something like that so much of this is as much
00:44:38.560as i see it and i agree with you in terms of like just the money and and the wealth accumulation by
00:44:43.300people are building these things that's just a means to an end and i think ultimately it's it's
00:44:48.580it's control that's that stands at the at the pinnacle here they do they want to own this place
00:44:55.000they want to run it it's theirs right correct and and again like if you if you could do that right
00:45:00.660if you were this giant ruler of the world and you were in so inclined to gather up that much
00:45:05.240control for yourself you don't want to like it you don't you want to be able to go wherever you
00:45:09.400want on the planet with your catamaran or your yachts and have it be the best version of it0.81
00:45:13.900right so like you don't want 100 million people in nigeria you want like a million people in
00:45:19.200nigeria who are all like the best chefs you've ever had so you can roll up in your yacht and
00:45:22.860and like experience that part of the world right you you want to have this giant playground i mean
00:45:26.720I think most people have never just deeply thought about this because most of us work
00:45:30.960inside of our normal little everyday lives.
00:45:32.600But if you're the type of people who build central banks, if you're the type of people
00:45:36.680who think like, where should I place this trillion dollars so as to get the best outcome,
00:45:56.100you're going you know you're flying from ivory tower to ivory tower to global meetings or
00:46:01.740whatever and then of course you you probably diddle kids and epstein island on the way too but anyway
00:46:06.300whatever reason it goes with the territory yeah uh but anyway so the the point is like yeah you
00:46:10.640you'll be that they're living they're already living on a completely different world than we
00:46:15.460are these people right um so at some point you begin to absolutely toy you with those types of
00:46:20.480ideas they can just they're out on these like nuclear powered yachts or whatever the hell they
00:46:24.240have now um and and they don't even have to uh step on land you know i mean they don't even have0.61
00:46:29.920to step among the plebeians among the unwashed masses um i think that they're they're going for
00:46:36.020the jugular with this technology that they they have they have something here that in a way solves0.79
00:46:42.880all their problems um it it's it's gonna i think it from their point of view take care of business
00:46:48.100for them i think the deeper interesting kind of philosophical addition to that is we but but is it
00:46:55.460though or do they also know what they are building will they actually even be in control of of the
00:47:00.820thing that they think is their salvation i think i mean they're the ones rolling the dice right
00:47:05.460like we're all involved in the gamble but we're not the gamblers they are the gamblers and they
00:47:09.620are it apparently seems like they are willing to roll this particular set of dice yeah yeah i wonder
00:47:16.260wonder why it's the same thing with like some of the toxicity that seems to be i mean maybe they're0.59
00:47:21.620dumb to a certain extent certain level but i showed the glyphosate issue before frankly like
00:47:25.560how they are they not affected by this either you know kind of thing of like just the amount0.95
00:47:30.000of toxicity but anyway that's it that's a different question um let me go back to the
00:47:34.420point here there's a lot of things we can go into um it's musk mentioned the bootloader for
00:47:41.260super intelligence i think that was kind of interesting too right that humanity again it's
00:47:45.380And this idea that I thought about earlier today, it's almost like this, I don't mean to get spiritual philosophical, but I can see the way that they see these transhumanist or technocrats, whatever you want to call it.
00:48:00.460Like, I think many of them are genuinely excited about, like, well, we're going to live for, you know, 400 years.
00:48:06.860We're going to just print organs, and we're going to be integrated with machines and be super smart, and they'll be, like, the gods of old, right?
00:48:15.320They're going to be the, you know, Mount Olympus gods, and they're going to be able to do whatever they want to do.
00:48:20.400So I think they're genuinely kind of excited about that.
00:48:22.220But it's almost like the earth is like, it has all these rare minerals and things were placed here, like a big puzzle for us to just kind of put together somehow and extract and refine and, you know, put together.
00:48:38.100And all of a sudden, poof, here's this new being, essentially, literally like kind of an alien intelligence, something that, sure, comes kind of out of our, out of us to a certain extent, right?
00:48:53.100It's not some alien creating this thing, but at the same time, it will be, it's a golem, it's a monster, it's a Frankenstein's monster to a certain extent as well.
00:49:03.120but we don't even know what it's going to do
00:49:31.040And I think that it's hard to I think this should not be extracted from a number of the world's religions that are focused on salvation or running the world.
00:49:41.080Right. I mean, with Judaism, the goal, you know, in the next couple hundred years, by the year 6000, the Jewish calendar is to have, you know, Jerusalem as the center of the planet and ruling over it in a way that, you know, it shapes it into what they want.0.85
00:49:53.680I don't think that those viewpoints are not the ones that are shaping this idea of ruling over0.60
00:49:59.760the planet and what these technologies can do. I think Elon is kind of a unique person in the
00:50:05.720billionaire class in that I don't think he's as tied into all those things. I think he's usually
00:50:09.540kind of speaking from his mind, to be honest with you. And so I think when he says that humans are
00:50:14.440the bootloader for AI, I think he means it from a more metaphysical evolutionary standpoint.
00:50:19.740like perhaps humankind are you know we are the thing that's helping to evolve this next being
00:50:25.100whereas again i think on the other hand it's quite possible that some elon has described himself
00:50:29.740thinking on 300 500 year 1000 year timelines some of these families though have been at that for a
00:50:35.020long time and are already in control of a lot of resources and more importantly leverage and power
00:50:39.740and i think that they probably have a different view that this is not so much like we're the
00:50:43.740accidental bootloader in the evolutionary line but rather like no we are the bootloader but that
00:50:48.940was the plan and like we're designing right we designed the bootloader to boot up the thing
00:50:53.180that's going to give us what we want without all the messiness of having to get it from human labor
00:50:57.180and it's it gets kind of it's hard to have these conversations without getting kind of metaphysical
00:51:02.300and it gets a cult and we it's impossible not to i mean it's yeah when you get into something this
00:51:07.580big it's it's challenging us at the level of values and when you're talking about bringing
00:51:11.580in something that can crunch this much data and have this much understanding you're quickly moving
00:51:16.220into the realms of considering things like gods or super beings and so you can't help but think
00:51:20.540in these terms that's true um thank you guys for the super chats there too appreciate that i'll
00:51:26.060well i can read a couple of them now as we so i don't lose them here in the flow uh occidentum
00:51:31.100lux earlier thank you very generous over he says oh you need pollination that's only 9.99 per month
00:51:36.940yeah i mean yeah exactly everything as a service is kind of an interesting thing too we i mean
00:51:41.020I mean, that's kind of an idea that surfaced during the, I think, the COVID times, right?
00:51:45.620Of like, just your total dependency on everything.
00:51:49.880And of course, it came from the slogan there of like, you'll own nothing and you'll be happy type of thing, right?
00:51:54.740But like, ownership is kind of a way thing of the past.
00:51:59.000It's really a, with technology, it's a merger of a new system altogether.
00:52:04.060Again, long term, I don't think that's true.
00:52:06.000But that's how they're selling it short term of like, you're not going to have to worry about money anymore.
00:52:10.300It will be, what was it, Aaron Bastani called it fully automated luxury communism, I think he called it, right?
01:10:02.500And then there is this idea that you can have AI running those smart contracts or voting on what they should be and determining or whatever.
01:10:08.960But what I'm saying is just that as an aspect in and of itself, where like an AI, if it has the legal rights to a personhood, like a corporation does, for example, right?
01:10:21.740It can now contact a spokesperson or hire a CEO or whatever it is over, you know, LinkedIn and hire someone.
01:10:37.180And AI can, you know, bet on the market.
01:17:20.320And maybe some people that believe in God will say that, yeah, that will happen.
01:17:23.560It will be divine intervention or whatever.
01:17:26.500From my point of view, I'm not so certain of that.
01:17:30.380This is up to us to guide the process and take our responsibility
01:17:34.320and make sure that this is done right as opposed to.
01:17:37.500And I'm not saying that that's how people who are religious just see it.
01:17:40.500Oh, it's fine. Don't worry. God will take care of it.
01:17:43.200I think that's kind of a dishonest view too.
01:17:44.560But I'm saying we have to do our utmost to try to warn.
01:17:48.160And I mean, there are some people that warn about that, right?
01:17:50.140I've covered that a couple of times, but the If Anyone Builds It, Everyone Dies book, Eliezer Yudkowsky, I think he's Jewish, and then Nate Soares.
01:18:01.880And this is an interesting take, and even if it's like, well, this is exaggerated, or that's not going to happen or whatever, and it's like, well, shouldn't we make sure first before we move ahead with this kind of thing?
01:18:16.940And, no, I don't think there's any technology anywhere, at any time, that we just didn't develop because we thought it could pose a problem later.
01:18:31.320Now we're, I think we're at a crossroads, Eric, that we haven't been at before.
01:18:36.660this is a new completely this is completely new territory and we're talking about potentially
01:18:44.040within the next couple of years i think from anywhere from a year to 10 years forward is
01:18:49.420highly critical in terms of where these people decide to basically move forward when it comes
01:18:57.320to ai what do you think agreed and sometimes people get caught up on the timelines it's like
01:19:02.640look, my grandmother's in her 90s. She's sitting there playing with her iPad. This woman was born
01:19:09.140before World War II, took a flight in a modified bomber plane to get back to her parents' home
01:19:17.840country in Europe. She's seen so much change in her time. We're on a much steeper parabola.
01:19:24.380We're going to see a lot more in our lifetimes and our kids even more so. Whether it's 10 years,
01:19:29.740whether it's 30 years whether it's 50 this is all a blink of an eye in human history right and the
01:19:34.700people that we love it matters for them it matters for for our future so i think we have to take it
01:19:39.600seriously regardless of what the timeline is um and be adaptable i agree like we'll make it through
01:19:46.920and i think that my take on things is to be adaptable to fight hard to maintain an optimism
01:19:53.260like i believe in the human spirit and i believe that we can get through anything but at the same
01:19:57.320time that doesn't mean, in my personal philosophy, that we should be passive
01:20:01.760because we in a time of even even with all our gripes and complaints like we're
01:20:06.020we're largely comfortable we're largely safe I'm very thankful for that but if
01:20:09.720you look at human history, our ancestors had to get through some
01:20:12.920very hard times to get us here and there could be very very dark times in the
01:20:17.480future it's when I look at some of these trajectories that we're on I see that
01:20:23.180there could be those dark times and dark times that are different from
01:20:25.980times of plague and deprivation like again when things are more centralized and we're only able
01:20:31.960to do what we're allowed to do that's a different sort of darkness and how long does that take to
01:20:36.340break you know is it a hundred years is it a thousand years i'm not sure and i do think humans
01:20:40.460will get through it i'm still intent on preparing ourselves for it and being with people that we
01:20:45.440care about so that we can again be maximally adaptable and set people up for success yep
01:20:50.660you're going robo voice a little bit here we're almost losing you hopefully the
01:20:53.760connection will be improving uh but i think i caught most of that um yeah we don't know
01:21:00.400if it's exactly no i i mean regardless we have to have that spirit otherwise what's the point right
01:21:05.440uh to a certain extent um but this is i don't know it's just a fascinating time
01:21:12.680and it's freaky and it's terrifying especially when you think of the
01:21:18.400people that are you know kind of behind this and those who are funding it um that's the
01:21:25.340other thing of just the dependency on it alone right like that that is a manipulative aspect
01:21:29.660right ai itself is kind of hardwired already uh hardwired is the wrong word but like built
01:21:34.780as a product at least what we're interfacing with now right it's a product they
01:21:40.580want you to subscribe to these things get the pro plan right they gotta pay for this also somehow
01:21:45.580all the data centers and all the compute.
01:24:27.980Damn it. Tech issues. That's just what you need. Very interesting conversation here.
01:24:31.880well let's see if he'll join back in maybe if you can hear me you can try to just exit out of
01:24:38.520the call altogether you might have done that earlier and just try to enter uh re-enter right
01:24:43.320back in there um while we're waiting for some of the connection issues let's take this wonderful
01:24:50.140fantastic incredible the man himself albert a super chat coming in look at this guy hi henrik
01:24:56.760looking forward to tonight's show hope all is well take care good to see you albert king albert
01:25:01.360guys give king albert a shout out in the chat holy smokes he sets the bar high thank you man
01:25:07.640i appreciate it so much you're so kind we appreciate you much love to you and the folks
01:25:11.500hope everything is good um eric you're back in there we'll find out can you hear me this time
01:25:16.580sounds better cool we're trying a different connection we're uh we said earlier that
01:25:20.620something's in retrograde so continuing it's been one of those days yeah it's all right no it sounds
01:25:25.460good um yeah no i was talking about the revival of like dead people or not revival
01:25:31.460that's the wrong word but they're selling it like that right your grandfather can live on in this digital
01:25:37.120avatar or whatever and i was talking about this thing of like you know we might have a specific
01:25:43.140relationship to it because we didn't grow up with that but imagine what
01:25:51.520it's like for kids that grow up two or three generations from when those things are just
01:25:56.920seen as normal i guess you know i mean we can't even imagine what the perception
01:26:02.620of this technology will be for them and it's impossible to predict right yeah i mean
01:26:09.900if anything it seems like they're more and more adapted to things that are not real in the way
01:26:15.400that we think of them and i think there are things that actually are real um i won't be bringing back
01:26:20.220any of my loved ones as a simulacrum like that personally but we'll see we'll see i mean yeah
01:26:25.460i'm not super bullish on the uh majority's capability to parse between what is real and
01:26:31.580what is meaningful and what is not and i do think this is going to require us to continually
01:26:34.940sharpen our own philosophical metaphysical senses like hold on to anything metaphysical
01:26:39.800that we do have to keep us focused on real human beings real meaning real value right really taking
01:26:45.240care of of real people and and acceptance of things that we've always had to accept like death
01:26:49.480like it's so simple and yet uh i think that the pushing away of those things the rejection of
01:26:55.900the simple cycle of life is causing people to want to reach out and grasp on to what
01:27:01.720they think is some sort of way of continuing on but really is likely just some sort of sick fake
01:27:06.420golem yeah it's interesting um not only does it make us you know question ourselves or
01:27:13.940question where we find meaning and all those kinds of things right these are deeply
01:27:18.200important questions but also i guess as a way out of it
01:27:26.880i see everything as selection pressure to a certain extent right
01:27:31.700like how how well will you fare through these kinds of new conditions and again we're not in
01:27:39.340charge we can sit here and whine about it all day long and complain about it and long for the
01:27:43.580past to a certain extent but of course we can make choices individually uh and you
01:27:48.320know for our families and hopefully you know as communities we can also do that moving forward
01:27:52.420what's our you know position on these things but the metaphysics is kind of interesting i think it
01:27:57.880will weed out a lot of people and maybe it's just a return mechanism it makes me think
01:28:03.460of like at least mythological stories or prior high civilizations and all kinds of things that
01:28:08.220enters into my head when i think about these things right like they got prideful like in
01:28:12.000dune right they built machines and the machines uh you know took control of them i hope we make
01:28:17.840the dune decision man i hope we make that decision to say no none of this not allowed yeah exactly
01:28:23.180right um i don't know it's very i've noticed as much as i try to stay rational about
01:28:30.480these types of topics and and stuff it quickly takes me into kind of a spiritual thing you know
01:28:35.700like well what's the meaning then you know kind of thing and and we are being confronted uh again
01:28:41.280And for the most part right now, no, not most people.
01:28:43.840But at some point, I think most people will have some type of realization in terms of like meaning, the search for purpose inside of these things, right?
01:28:55.080So maybe, in a weird way, it's a healthy process to go through as we're finally confronted with some of those things again, potentially.
01:34:24.700And I'm just a huge believer in both freedom and, again, also a space in which you have to go out and find where you're needed, how you can provide value and let other humans push back in a way that shows you what actually matters.
01:34:38.940And in a more contrived world where everything is taken care of and it's just, here's your UBI and free time, most humans are not going to thrive under that if current patterns and historical human patterns hold true.
01:34:52.060Was it Zardoz that had like a decadent future, right?
01:34:57.020And it's been so many years since I saw it now,
01:35:00.040but it was like it kind of reminded me of that, right?
01:35:01.600Like the game, I forget what they call them, the Hunters,
01:35:05.500but Sean Connery's role, if you ever saw Zardoz, right?
01:50:00.220Maybe those things are intentional as well to test you or to see if you're paying attention.
01:50:04.660What are some of the drives that are going on there?
01:50:06.140But there is an interesting thing in terms of consciousness that, like, if it processes language in the right way, is there a kind of form of magic that occurs there where that produces something metaphysical, if that makes sense, right?
01:50:20.460It produces something that is far deeper than we think it is.
01:50:23.800It's not just the numbers because the numbers represent something.
02:18:37.040I think you can actually glean and learn some interesting historical things about some of the methods, let's say, that were used in the past that you can see reiterated and echoed today.
02:18:50.020Methodology and how they manipulate us with Scripture, basically, and how it sweeps so many people along with it.