SUNDAY SPECIAL: HUMAN EVENTS DEBATE - THE RISE OF CHATGPT
Episode Stats
Words per Minute
182.5716
Summary
The Great AI Debate: The Rise of the Machines. Are we welcoming our AI overlords, or are we going to try to fight them and actually continue to be human beings? Or is there potentially a way in between? Joining me, with lots of feelings about it, is the great Libby Emmons, editor-in-chief of The Post Millennial and now also editor-in-chief of Human Events.
Transcript
00:00:00.000
i want to take a second to remind you to sign up for the pozo daily brief it is completely free it
00:00:06.800
will be one email that's sent to you every day you can stop the endless scrolling trying to find
00:00:10.720
out what's going on in your world we will have this delivered directly to you totally for free
00:00:14.860
go to humanevents.com slash pozo sign up today it's called the pozo daily brief read what i read
00:00:20.780
for show prep you will not regret it humanevents.com slash pozo totally free the pozo daily brief
00:00:30.000
well ladies and gentlemen welcome aboard to this human events sunday special the great ai debate
00:00:44.240
the rise of the machines are we welcoming our ai overlords or are we going to try to fight them
00:00:53.720
and actually continue to be human beings or is there potentially a way in between joining me
00:00:59.440
on this who's got lots of feelings about it is the great libby emmons editor-in-chief of the
00:01:04.000
postmillennial and now also the editor-in-chief of human events as well so make sure you go to
00:01:09.280
humanevents.com read everything that libby is going out there and putting uh in terms of the op-eds and
00:01:14.600
the international news and of course subscribe to human events daily the flagship show of human
00:01:19.340
events libby how are you good thanks jack so this there was a story that came out
00:01:27.180
the other day about the ceo of google sundar pichai and he was actually saying that i thought
00:01:35.340
this was hilarious because he said apparently he came out and said that he didn't know what his ai was
00:01:43.500
doing and apparently it had started to teach itself programming and languages that they did not program
00:01:50.460
it to i said boy it's like it's like he hasn't actually seen a movie ever in his life or something
00:01:57.160
i don't know what like do we actually believe these people don't understand what's going to happen
00:02:02.020
yeah it is sort of amazing that you have these tech people uh entrepreneurs etc ceos moving forward
00:02:09.740
with so much of the tech that has been predicted in our science fiction and speculative fiction for
00:02:16.740
decades if not perhaps centuries and they press on as though there's no indication of where the human
00:02:24.160
imagination will go with this and what could happen even though we see a lot of times the uh
00:02:31.620
the uh imaginings of our best fiction writers come to fruition as time goes by so yeah i was not
00:02:40.380
surprised to find out that uh the google head doesn't really understand ai and that human beings don't
00:02:48.720
necessarily understand it we don't understand our own intelligence so it is not possible for us to
00:02:55.140
fully understand um something that we create to mirror our own intelligence that's not really surprising
00:03:02.040
well i think you're right too but in the same sense that it's it's the old michael crichton
00:03:10.000
thing right where you know it's it's scientists these guys they spend all their time thinking about
00:03:15.460
what they what could be rather than thinking about what should be uh should we do this uh you know
00:03:22.600
there's the whole the oppenheimer movies coming out very soon where you know he lived the rest of his
00:03:27.900
life you know being very regretful of leading the manhattan project that led to uh you know he said
00:03:34.540
he said i feel like i have blood on my hands whereas i don't know if you if you ever read the story that
00:03:39.140
truman basically threw him out of his office after that yeah it's interesting though because i had that
00:03:44.980
i mean we all think about that you look at that kind of technology and that kind of weapon and you look
00:03:50.080
at the amount of lives that were lost because of that and the destruction that that weapon wrought and i
00:03:55.520
remember talking to a man who was a veteran of world war ii um and bringing up this idea that this
00:04:03.920
bomb was so incredibly destructive and he brought me back to the numbers and he said if we hadn't used
00:04:10.100
a bomb like that we would have sent a bunch of young men young american men to be fighting on those
00:04:16.420
shores and we would have lost far more americans in the conflict than we did had we not used that
00:04:22.040
weapon so i do think it's important to uh not summarily say that a given technology is a total
00:04:30.480
disaster because of the disaster that we see because there's also the potential for the disaster
00:04:36.120
of not using it um so that's interesting yeah i was just going to say that that um you know to your
00:04:43.820
point that it's you know look at the other end of it that we also have nuclear power now right so
00:04:49.180
it's sure so it it's it's the same technology in um in a broad sense but on one hand it
00:04:58.200
gave us the ultimate destruction but on the other hand if we can somehow somehow eventually get
00:05:04.260
politics out of this that we could actually be using this to power our cities and power the entire
00:05:08.820
future but keep in mind that i come from the navy right so every single u.s navy submarine right now
00:05:15.460
that's in the water is being powered by at least one nuclear power plant at least one nuclear reactor is in
00:05:22.220
there it's like a generator and then of course the engine but um and all of our aircraft
00:05:26.940
carriers have two nuclear engines and nuclear power plants so the idea that um you know
00:05:34.140
the idea that this thing is like you know it's not chernobyl we're not the soviet union and it's not
00:05:40.040
three mile island anymore every single day the navy uses these on a regular basis um great hyman
00:05:46.740
rickover if you guys don't know who that is the father of the nuclear navy please go read hyman
00:05:51.360
rickover just uh you know someone i consider an absolute hero and i think most most navy officers
00:05:57.800
when you look at him in terms of of national heroes we would really consider him one of the great
00:06:02.720
patriots in american history for for just realizing this new technology and the fact he said well if you
00:06:08.080
need a water source for the navy we got the best one you know that you'd ever need for these things
00:06:12.600
so i guess what i mean to say is in the broader sense of the debate does ai have the power
00:06:19.580
for great destruction yes but it does also have the power for great innovation and technological i hate
00:06:28.540
to use the word it's like we can't use the word progress anymore because it's so politicized right
00:06:32.480
because it's very politicized i do think yeah and you and i have disagreed on chat gpt and its
00:06:40.800
relative merits um and i do think that there are substantial reasons for concern and for a
00:06:48.680
pause and you even saw recently elon musk along with some other tech guys who are big in the industry i
00:06:54.980
forget who they were um calling for a pause on the development of ai and there are yeah and there
00:07:02.520
are serious concerns does that mean it does not mean however that it should not be developed it means
00:07:08.000
that it should be developed um in ways that are going to be beneficial to humanity and not necessarily
00:07:14.380
in ways that are going to destroy us we have seen uh certainly the advancement of technologies over time
00:07:22.440
that have not helped us in ways that uh you know that have like sort of taken over um and have not
00:07:29.980
been as helpful as necessarily they could have been like i do think that we have overdone it with our
00:07:35.780
handheld devices there's been a lot of progress there there's been a lot of benefit but there's also been
00:07:41.780
substantial downsides and i think that it's important to look at that as well that being said i don't
00:07:47.100
think there's a we if that makes sense like when i talk about when people say like we should put a
00:07:53.240
pause on this we should take a closer look at that who who are we what is the we comprised of there is
00:08:01.780
not any kind of global body that um would make these kind of determinations and if there were i imagine
00:08:08.860
we would all be roundly opposed to it because it would likely not have things like individual rights as
00:08:14.940
part of its uh primary tenets so we have to be considerate of that um i think there's a lot of
00:08:21.560
places where ai does not belong and there's a lot of concern well and so um i i appreciate it right and
00:08:28.660
i also appreciate that elon is actually having those thoughts and getting that discussion started
00:08:33.300
because typically we you know we throw out these new technological innovations and we we simply say
00:08:40.660
all right it's great let's go for it let's keep pushing um i saw a tedx speech recently where the
00:08:46.000
guy was um i forget who it was and he said he showed how in the next iteration of chat gpt i guess he had
00:08:54.200
like the beta version of whatever the next one is chat gpt pro that's coming out and in this one um he
00:08:59.960
pointed out how we're going to be going into a post app post app environment and when i say post app so
00:09:06.180
the current currently the way we use apps is you know you go on your phone or you know or your your
00:09:12.680
your computer your tablet whatever but we interact through the apps and we'll you know we
00:09:16.940
post something on twitter post something on telegram post something on truth
00:09:21.300
i'll then i'll go over to another app and i'm booking travel and i'll go to another app and i'm uh you know
00:09:27.540
i'm writing something okay the new chat gpt can go in between the apps for you and you can order it
00:09:39.180
to tell it what to do in the apps and he gave a demonstration of this where he said so he's doing
00:09:46.260
the tedx conference and he said design a lunch for us to hold after this conference write it out
00:09:54.720
then draw me a picture of it or generate a picture of it right in midjourney so it writes out the
00:10:00.980
whole thing it's this gourmet um very frou-frou you know dinner or lunch and then at the very end
00:10:07.260
it shows you the photo which looks like a professional photo that came out of some kind of
00:10:11.420
you know magazine and and then here's the next thing then he said and he just it's it's like a like
00:10:17.380
a virtual assistant where then he says now go to instacart and order it
00:10:21.500
and it went to instacart and ordered the whole thing and then this is my favorite part then he
00:10:27.540
said okay now take that instacart order and and craft a tweet and tweet it out for all my followers
00:10:33.800
to have and he's just standing there at the podium at tedx and the chat gpt module is doing all of
00:10:40.640
these things and the very the very end of it the denouement is he said now go check my twitter account
00:10:46.880
and they checked and they put it up on the screen and everyone picked up their phones and the tweet was
00:10:50.480
there and you and you saw he had never actually even touched right his his computer because he's
00:10:58.160
just talking to chat gpt the whole time very right yeah i mean that is very impressive and that's a
00:11:04.180
very impressive tool the concern that i have is that it won't just be used as a tool but that it will
00:11:09.340
be used as a companion already when people use siri they say please and thank you to the machine
00:11:15.820
for providing them with whatever you know they asked for music or an order or something else
00:11:22.000
people say please and thank you to their alexa right we treat our machines as though they are
00:11:28.540
human beings we treat them as though they are personifications and i think that that is really
00:11:34.200
concerning because what you end up with is a simulation of a companion as opposed to a companion
00:11:40.200
and that simulation can fill the void that is missing for so many people um that void of of
00:11:47.500
meaning of wanting to be close you can feel close with your machine people already do it you see people
00:11:53.620
you know i'm sure you've had that experience you go to a party or something like that
00:11:58.360
and if you don't know anybody and you don't have anyone to talk to you take out your phone
00:12:04.500
and you talk to whoever's on your phone and you feel perfectly content sometimes you don't even
00:12:09.180
talk to anybody on your phone you look at an app you check your twitter to be fair um i'm definitely
00:12:16.080
the classic extrovert so i'm not a good example so me i stand there and i look at my phone and i go
00:12:23.780
hide in the corner i'm like until i see someone i know and then i drag them over to the corner hey
00:12:29.060
who's this guy hey who's that guy hey who's this person and it's like i find the one person
00:12:33.140
that i kind of know and i'm just bouncing around from person to person but that's why i hang out
00:12:39.440
with you at parties as soon as i see you at parties i put my phone in my pocket as opposed to as opposed
00:12:44.920
to getting thrown out of parties with me in austin right well that was well that's always the fault
00:12:51.320
we're gonna have to leave that there we don't need to name names about who threw out
00:12:56.340
libby didn't know i was gonna bring that one up listen i have been thrown out of way better
00:13:02.920
parties than that one so i'm not worried about it right as i know it's okay it's okay um that if
00:13:09.360
i've been crashing parties since the 90s so i'm not worried about it so so i guess what i mean to say
00:13:16.760
it but you're right though you're right and there there is this sense of we're you know we're becoming
00:13:23.100
um cyborgs in a way but we're we're merging with the machine so uh you know we don't have alexa in
00:13:31.780
our house we don't have any of whatever different things like that are um siri we don't do siri um
00:13:40.980
but at some point i i kind of think it's inevitable i do think it's becoming inevitable
00:13:47.060
where it they these things are going to become so ubiquitous in society and they're gonna like
00:13:53.520
even right now it's kind of hard to use them they're not great um by the way i also saw recently
00:13:59.540
that somebody was that some people are taking like their google homes and google hubs and they're
00:14:03.880
connecting them to their doors so that your door can be locked and unlocked you know which i
00:14:10.880
guess just by saying so i can unlock my door and i'm thinking like oh well if i'm trying to rob
00:14:18.500
people if i'm you know the bandits in home alone you know that's the first thing that i'm going to
00:14:23.080
figure out like i'm gonna now what happens when somebody takes elevenlabs and copies your voice from
00:14:30.700
your voicemail or they get a copy of it puts that into their system and then goes and tells your
00:14:37.900
alexa to open the door so you remember fahrenheit 451 the ray bradbury novel that's pretty iconic so
00:14:46.500
at the beginning um the main character whose name i forget his house is yeah his house is the only one
00:14:52.940
on the block that doesn't have the blue light of the television screen flickering through the window
00:14:57.140
that's going to be my house with this ai stuff i have and i have known that since i read that book
00:15:04.280
in middle school and i thought yep that's definitely going to be me with whatever the new thing is you
00:15:10.100
know i i my concern is that we replace humanity with our creation and that it is such a subpar
00:15:17.780
replacement that it will lead to the kind of existential despair um you know that really takes
00:15:25.700
down a civilization and that we don't even detect until it's far too late because we will put our love
00:15:31.340
we will put our spirituality we will put our kindness into these machines and these machines
00:15:37.020
are machines they will not return that we will imagine that they return it but they will actually
00:15:42.200
not we already have men in japan marrying anime looking pillows because they're so lonely uh what's
00:15:49.720
going to happen now we already have teenagers who go through full romantic relationships with each other
00:15:55.380
um online without ever having met each other you know they go through the courtship and the
00:16:02.080
the emotional relationship part and the breakup without ever having met what is that saying about
00:16:09.560
who we are and where we're going i think it's really devastating i'm going to read a quote and i wasn't
00:16:14.660
planning on reading this but i happen to have it up they have greatly increased the life expectancy of
00:16:21.000
those of us who live in advanced countries but they have destabilized society have made life
00:16:25.820
unfulfilling have subjected human beings to indignities it has led to widespread psychological
00:16:31.820
suffering and in the third world to physical suffering as well it has inflicted severe damage
00:16:36.960
on the natural world the continued development of technology will worsen the situation it will it will
00:16:42.100
certainly subject human beings to greater indignities and inflict greater damage on the natural world
00:16:47.460
probably lead to greater social disruption and psychological suffering and indeed may lead
00:16:51.840
to increased physical suffering even in advanced countries yeah that's about right that's ted kaczynski
00:17:01.360
yeah well you know his methods his methods were madness but uh obviously there's a lot of reason also
00:17:09.600
yeah he was also one of the earliest when he was 16 years old as a child prodigy in um in
00:17:16.460
mathematics at harvard he was subjected to the earliest iterations of the mk ultra experiments and what's
00:17:23.780
the mk ultra experiments so the mk ultra experiments were a series of experiments that took
00:17:29.280
place over 20 years where the cia worked with uh psychiatrists and psychologists and pharmacologists
00:17:36.360
to attempt to use mind-altering drugs to study the effects of mind control because during
00:17:43.480
the cold war they had a thought of okay so we know about information warfare we know
00:17:50.500
about economic warfare but what about brain warfare and so it was then they were actually mind control
00:17:57.000
experiments that were done and it turns out that theodore kaczynski when he was a student at harvard
00:18:03.200
where one of these uh cia-backed uh doctors was doing the research that he was one of their earliest
00:18:10.500
research subjects so we took some of our best young brains and destroyed them just for the purpose of
00:18:16.440
cia intelligence research yes wow that's disturbing that's a very disturbing use of tax dollars
00:18:25.020
and at least one of them ended up being the unabomber so yeah that would have been
00:18:33.260
well well done well done cia that was in 1959 so i mean imagine what this you know this mathematical
00:18:41.080
genius could have potentially done with that what he could have been capable of right well we know
00:18:47.720
to an extent what he was capable of um but imagine i guess i just mean to say imagine had that energy
00:18:54.000
been placed into something that was beneficial for society uh imagine him working in the the space
00:19:00.780
program for example i mean this this was the 50s when he was in the experiment yeah uh and then it's
00:19:06.280
it's the 1970s when he began this uh his his campaigns and then he was caught of course very
00:19:13.080
publicly in the 1990s right but he got away with it for a long time is he like in some supermax prison
00:19:19.160
or something like that that's exactly where he is yeah adx the one in colorado yeah
00:19:24.480
and yeah 80 years old uh right now but i guess my point is though that um that being said um when
00:19:33.820
you look at when you look at his description of society it's like he took too many black pills and
00:19:39.160
couldn't couldn't handle it it it does line up and i don't necessarily know if it's if if being a
00:19:44.840
luddite is the right word and i don't i wouldn't consider you a luddite i mean here we are on skype and
00:19:48.840
you you know you actually run websites right you run websites actual websites yeah you're and so
00:19:56.040
there is i guess what i mean to say is that in the same way that when man discovered fire you know
00:20:01.640
it was handed to us from prometheus the that we brought fire into our homes and of course fire has
00:20:08.700
the ability to destroy our homes but it also has the ability to cook our food it has the ability to heat us
00:20:14.380
to keep us warm to heal us um when we when we are ill or or even in some cases um you know injured
00:20:21.920
and so i i look at technology in a similar way that if we can find the right way to harness it that
00:20:31.720
there is a possibility for it and so let's let's just get into it because we've been dancing around
00:20:36.520
it but so i've said to you a million times that you know because when we're doing post-millennial
00:20:43.200
stuff when we're doing human events stuff that there's always there's always that clunky work
00:20:47.980
of just getting the story out right the story has to get written the writing the human work yes right
00:20:55.460
the the words need to be written and so they need to be written down here's here's how it is from my
00:21:00.200
perspective that we'll be in like the slack channel or whatever and and i'll see something and i say
00:21:07.520
hey we should write this up and from my perspective uh you know then you go assign it to someone they
00:21:13.660
write it it gets written up then it comes back and we look at it and we tweak the headline we say oh
00:21:18.680
make sure to include this or like you know we do the fact checking we do the we check the links etc etc
00:21:23.920
and so my but i never actually see the person or talk to the person who who does all those things i
00:21:31.100
just for me it's just another channel on uh right because we're fully remote because everybody's
00:21:36.140
everywhere fully remote and so my question is what
00:21:41.180
substantively would be the difference if we had it doesn't have to be chat gpt or any ai
00:21:50.260
helping us to speed up that process well i don't trust machines so i don't think i would trust and
00:21:58.580
we've had this conversation i don't trust a chat gpt to necessarily get all of the information
00:22:04.440
correct we have seen for example i think with this bard thing this google thing uh it has cited books that
00:22:11.220
don't exist so now you're asking someone to do fact checking on a uh ai generated article which takes
00:22:20.920
the same amount of time as fact checking a human being except you can ask the human being directly and
00:22:26.560
know that you're going to get a substantive answer okay also you still have to do doing yeah you still
00:22:34.520
have to do the fact checking that's right but also i'm not saying if you work with a writer for a long
00:22:39.940
period of time like you know the there's no chat gpt in the slack channel no but the other thing too
00:22:47.040
is the other thing too is some of the writers that we have on staff i've worked with for an extremely
00:22:52.560
long period well not extremely long but like a good couple of years i've worked with them
00:22:56.340
so i know what their level of accuracy is i know how they source things i know their style
00:23:02.580
i know what they're up to i know how fast they can work like there's some writers where i give them a
00:23:08.320
breaking story and i know it'll be done in 15 minutes what if what if they're using that's really
00:23:12.620
valuable not letting you know what if they're using it and then getting back to you because they're
00:23:18.120
remote okay so i'm gonna ask um i don't think any of them are using chat gpt and the reason is
00:23:27.760
because for a while we were you have tracking software on their on their laptop no no of course
00:23:33.520
not of course not i believe in freedom individual liberty i'm not going to track people at their job
00:23:39.400
i was asked once actually do you read our dms and i was like no should i be reading your dms that
00:23:45.620
sounds like a terrible waste of time and a real invasion of privacy i'm not going to do that um
00:23:51.440
yeah i'm totally opposed i don't know if people are using chat gpt but my guess is that they're not
00:23:57.480
because i know that i know what their style is i've been working with them for a while and i can see it
00:24:02.060
i can see the work that they do but couldn't they but to your point couldn't you say read these five
00:24:09.320
articles that i've written for either human events or post-millennial learn my style and now start
00:24:15.020
writing articles in my style i don't know does it work like that yep is that effective what if they
00:24:22.100
make stuff up and the more you train it like the ai just starts making up
00:24:29.000
information how would you know how would the editor necessarily know if the if it's making up
00:24:35.800
information like with cnn yeah like if it was cnn writers i don't have anything on my stuff
00:24:42.940
no i know but but it was like that cnn article we read the other day about that it was it was one of
00:24:49.900
those shootings and it was a it was like a you know it was the one with the horrific one with the
00:24:54.080
neighbor um yeah family and then there's this the cnn article has this line in there about this is
00:25:01.220
this is just what happens in a country that has widespread guns and widespread paranoia and
00:25:06.780
i was like wait what like yeah when i see stuff like responsible no i think the guy with the gun
00:25:12.520
is responsible that's who's responsible the criminal but but again but again to my point though
00:25:19.200
shot and well and shot the father in the back right and the dad the dad took the bullets but
00:25:23.400
my point is that's what dads are supposed to do 100 percent um you could go to chat gpt or you know bard
00:25:36.100
whatever you call it and tell it that these are the types of things that that cnn or this is the style
00:25:43.100
of cnn and it could start to copy that and mimic it to the point where it would know to start adding
00:25:50.060
in those little extraneous phrases yeah i suppose it could uh that's not that's not enough to convince
00:25:57.880
me that human beings should lose their creative jobs to a machine i just don't think i'm not saying
00:26:03.680
but no no you wouldn't necessarily it's it's not losing your job right it's go back to the example i
00:26:08.700
gave of the guy coming up with um the lunch right the lunch order um someone someone still has to make
00:26:16.480
the food right you still have to go and you know do all that someone still has to you know deliver
00:26:21.960
you just don't have to think about it right so it's it's you don't have to sit there and he's not
00:26:26.580
like a gourmet chef and you know maybe it takes out some level of the catering business but at
00:26:35.480
the same time the money is still going to be there because the because what it's doing is it's taking out
00:26:41.180
a lot of the busy work by just telling you by just giving you options the same way that you know if
00:26:47.740
you went by the way to a caterer they would probably have like three or four standard options
00:26:52.420
that they do here's our we offer three charcuterie boards and this is what we do and for this level
00:26:58.120
and you just pick one right well this essentially is just copying that and um but the but my point is
00:27:05.620
it's it's also giving you that virtual assistant role of you fact check this you tell me where this
00:27:11.100
source is you tell me what that is and so you become more of a a manager of the assistant i mean it's
00:27:17.780
like it's like having you manage your machine sure it's like you manage your machine i mean we've seen
00:27:22.980
this in we've seen this in offices sure like we've seen this in offices for years right there used to be
00:27:29.300
um a secretarial pool and if you needed something typed you would bring it up to the secretarial
00:27:33.900
pool and they would type it now you you basically type it yourself or you dictate it you also had
00:27:39.000
this crazy position in offices called the receptionist and the receptionist would literally
00:27:44.740
answer phones and take messages on a pad of paper and give you that's all gonna be ai and the paper's
00:27:50.360
all gone well we don't have receptionists anymore at all right i mean are there even receptionists
00:27:55.700
people just call your direct line and then they hear your voicemail telling them to
00:28:00.780
send you a text doctors like i would say doctors yeah those people are also schedulers which perhaps
00:28:07.460
you're not going to need that but is this something that we really is this something that we really want
00:28:12.680
right i mean have you ever encountered a situation where you are trying to get something done and you
00:28:18.840
have to interface with machines in order to get that thing done only the machine does not have the option
00:28:24.740
that you need and you get stuck for example on some crazy phone tree where you're saying i need to talk
00:28:30.860
to an operator and it says please tell me the nature of your problem so i can connect you to the right
00:28:35.540
person and you tell it the nature of your problem and it lists off like ten there's no they
00:28:40.900
right it lists off maybe five or ten options that you can select from in order to get you to the right
00:28:47.100
potential human being to talk to and your option is not among them and eventually it just hangs up on you
00:28:53.060
and you can't actually get any answers i think that we not only diminish our ability to accomplish
00:28:59.680
things that are outside of perhaps the prescribed area when we engage machines for all of those little
00:29:06.940
basic tasks but we also diminish our humanity there's something so much more positive when you are
00:29:13.000
discussing something with a human being than when you're discussing it with a machine that is only
00:29:18.380
programmed to give you specific options i find it very frustrating to talk to machines uh when i call
00:29:26.000
airlines or anything else where the app stops working and it doesn't give me what i need to do
00:29:32.400
they have yeah i don't i don't like it alex's point was that these companies have spent billions of
00:29:40.860
dollars researching this and putting time into that because they know and and this is what i'm saying
00:29:45.720
like it's it's just going to happen because they know that that in the long run that investing in
00:29:51.860
this technology will save them so much money on labor costs so i'm just going to say it i mean it's not pc
00:30:00.360
or whatever you know that used to be a receptionist then it became a call center in um you know bangalore
00:30:07.860
india um now it's going to be ai right it's going to be in a data center they're going to it's going
00:30:17.320
to get to the point where you won't even know
00:30:23.420
yes but you're also going to feel increasingly isolated at least if you have any kind of soul
00:30:30.520
or spirit and you're not an npc you're going to feel increasingly isolated by these machines because
00:30:36.700
these machines do isolate you they put you in a little box they they lock you away they separate
00:30:42.320
you from human beings and then you start to personify the machine and you start to put your
00:30:47.540
hopes and dreams in this machine it's not it's it's not going to do well for our spiritual selves
00:30:54.340
well and this is this is and and you you know me you know how my family is um it's it's what you need
00:31:01.460
to do then right the right move for a person and this is what i think is that you
00:31:08.220
use the machines and you use them as much as possible and you use them as time savers you're
00:31:12.640
always maintaining that you are in the driver's seat always but if you don't like what it wrote you
00:31:16.840
delete all that um but if it pulls up some research for you then why what's the difference between
00:31:22.120
going to google and finding the top three results or asking chat gpt for the same thing just for basic
00:31:27.200
research purposes not that bad and so what we've and we've already outsourced our thinking to google
00:31:32.780
it's just we have so well we've certainly we've certainly i wouldn't say we've necessarily outsourced
00:31:38.780
our thinking to google but i will say that we have outsourced our memory is that you must
00:31:45.620
take that free time that it has given you and that and you you can use it as an increase in
00:31:52.080
productivity and i think that's good and i think that's that's where tech where new technology is
00:31:56.500
great um but you must make sure that you are feeding your spiritual life as well and nurturing
00:32:05.220
your spiritual life and that includes finding a community that includes in in my case in our case
00:32:11.580
being catholic going to church um praying the rosary that means finding those things or even just you
00:32:18.360
know going outside and spending time with your kids um i took uh i took aj for a ride around on
00:32:24.380
on his balance bike the other day just make sure that you're not spending all your time with the
00:32:29.460
machine that you unplug and you use the extra time that you now have because of the blessing of remote
00:32:37.640
work and machines and all these other things that you're able to okay you're using as a digital
00:32:42.340
assistant whatever but you must also increase the time that you're spending doing that and you look
00:32:49.700
at it right you know people are commuting less which i think is wonderful i think it's fantastic
00:32:54.340
people are going into these uh ridiculous job centers less which i think is also in a sense good um we're
00:33:01.520
still kind of dealing with the way that we've changed work um you and i you know we we hardly ever see
00:33:07.520
each other in person and yet we work together every day and so you're saying this as a man with a with a
00:33:14.600
great family as a man with friends as a man with colleagues and a man with a very active and rewarding
00:33:20.800
life um i don't think that everybody has that opportunity and i don't think that everybody has that
00:33:27.760
opportunity growing up so once we start infusing our lives with these machines and this is happening
00:33:35.200
for children younger and younger children who are not raised in homes where the parents take time
00:33:41.340
to take them out and to do fun things with them these kids are married to their machines at very
00:33:47.220
young ages you see this all over the place you see kids just sitting there with their phones if you
00:33:53.140
ever go out to a diner and you see a group of teenagers i don't let my kids do that no you don't
00:33:59.480
you don't you are an exceptional human being right not everybody is like you right not everybody has
00:34:06.400
someone like you in their life either so that's very you know that's something that is is too bad
00:34:13.420
you know but it's true if you go out to like the you know local diner or whatever and you see
00:34:18.080
teenagers sitting around a table they're sitting there like this they don't necessarily know how to
00:34:23.420
communicate with each other and so i think when we take all of our mundane tasks
00:34:28.780
and we take all of our little building blocks jobs and we outsource them to a machine we are also
00:34:36.700
taking away the ability to build on the knowledge and the work um the work experience that you can
00:34:43.760
gain from that i remember distinctly when i was i think 18 years old i was working in an ice cream
00:34:49.360
shop in uh in chestnut hill pennsylvania in chestnut hill philadelphia um i was working in an ice cream
00:34:55.300
shop and my imagination was extremely rich i was always imagining crazy things that would happen
00:35:01.800
there was one time i got locked in the ice cream freezer and that was actually kind of dangerous
00:35:05.440
um but you're always you're meeting new people you're talking to other human beings who have
00:35:11.020
different life experiences from you like the guy who locked me in the freezer you know very different
00:35:15.180
life experience from me um you're you're meeting new people you're meeting different customers
00:35:20.500
you are talking you are imagining you are interfacing with the material reality of life and so how do you
00:35:30.400
come in right like let's say all of these things are done for you you can just tell the machine to
00:35:36.320
order your lunch uh and then your lunch arrives you don't have to prepare it you can tell the machine
00:35:42.300
to bring you your groceries for example you don't have to prepare it all the groceries come in a box
00:35:46.980
you just put the things in the oven and they cook for you you you tell the machine to send your laundry
00:35:53.280
out to get washed to order your clothes to do whatever it is that you need you tell the machine to do it
00:35:58.780
what skills are you generating as a human being if the basic building blocks are already done how do
00:36:05.540
you come in at a high level how do you come in at a high level of of communication of ability of skill
00:36:12.980
set if you don't actually need to um if you if you have nothing to build on how do you start but
00:36:20.880
that's reading shakespeare when you've never even read you know like the little engine that could
00:36:30.560
we can use technology just in the same way that we used we used fire to uh get out of the caves right
00:36:40.300
um the same way that uh agriculture you know kind of ended the hunter-gatherer lifestyle that uh which
00:36:48.380
is arguably you know people debate that as whether that was a positive or a negative um that
00:36:55.360
you're right in a sense it does make us softer but i'm also saying that it's coming and it's
00:37:04.640
just like with every other the rise of every other technology like the industrial revolution which
00:37:09.080
obviously is what ted kaczynski wrote about is that you can't just stop it we've never no you can't stop
00:37:16.420
it and so what i'm trying to say is i guess is that instead of saying let's stop all technological
00:37:24.160
progress or deny ourselves the ability to have technological progress is that we come up with a
00:37:30.100
way to manage the technology on a personal level right we we consider our screen consumption when i
00:37:36.660
was little my mom used to even say how much tv did you watch today my dad how many hours yeah and and
00:37:43.460
i'd always be like oh i just turned it on you know yeah of course my dad one time my dad one time i was
00:37:50.920
sitting there watching tv and my dad comes into the room and it was a commercial break and he goes
00:37:55.240
what were you watching and i was like uh i don't know and he was like it's off it's off go outside
00:38:00.940
you're done yeah that's over and and and and i think that's that's what it's going to take but i mean
00:38:08.880
especially for children but also for if people want to be able to create and maintain meaningful
00:38:15.980
relationships and fulfilling lives that's the key because just having your basic needs met does not
00:38:23.480
mean you will have a fulfilling life that's the problem like idiocracy yeah it's well idiocracy it
00:38:30.500
all fell apart anyway because they didn't have food um but in uh in a life where all your basic needs are
00:38:39.620
met you will be unfulfilled because that is also one of your basic needs and so we have to recognize
00:38:44.360
that fulfillment is one of our basic needs and so this is the difference between pursuing pleasure
00:38:50.560
and pursuing joy for example so you should pursue joy very different you should pursue very different
00:38:56.080
fulfillment instant gratification um pleasure that's you know that's um uh aldous huxley right
00:39:04.160
that's that's brave new world all the pleasures are at your fingertips but you know what everyone's
00:39:09.960
depressed no one's fulfilled because to your to your point as well uh you know you do need those
00:39:16.700
you do need those lives you do need though you need to live those uh my brother um kevin just uh he
00:39:23.660
sent me a message and a video uh just the other day where he said he said i went out and um he's like
00:39:29.420
i have these thick weeds in my backyard and my my lawnmower wasn't um yeah it wasn't wasn't working
00:39:35.560
on it well so i got a scythe and like a classic like like soviet union 1930s style scythe restored
00:39:44.260
it and he's using that to just he's he's scything those weeds you know he's he's reaping he's
00:39:51.080
reaping grimly uh he isn't very grim about it actually he was kind of happy but it's it's and kevin
00:39:56.220
works with his hands um on a daily basis and so you know it's it's he there is something to be said
00:40:02.760
for that there's something to be said for manual labor for doing those tasks that that you build
00:40:08.340
from and i think it is something that's innately built into the human condition and so if you don't
00:40:13.480
have uh you look at you know say the elites who have you know the one percenters who have all their
00:40:20.720
basic needs are met and for them you know they've had you know servants have been around forever uh you
00:40:26.320
know this idea of someone can do the shopping and the cooking and the cleaning that's you know that's
00:40:31.080
the elites have had that since time immemorial right um but you know even you think caesar was
00:40:37.680
doing his laundry washing all those togas um well he had to he really had to wash one uh well i guess
00:40:45.340
he couldn't do it but maybe brutus yeah i bet that one just i bet they chucked it you think you think
00:40:50.900
they chucked it probably yeah or maybe they buried him in it i'm not sure actually it'd be interesting
00:40:55.180
you know you know what we could do is we could ask chat gpt and i bet it would give us an answer
00:40:58.980
chat gpt does not know what happened to julius caesar's toga after he was brutally stabbed and
00:41:05.240
betrayed i'm gonna ask gpt just all right i'm gonna ask it right now i'm gonna ask it right
00:41:09.740
now because i'm signed in i'm gonna ask it right now i'm gonna ask it what happened to julius
00:41:16.580
all right it is not clear what happened to julius caesar's toga after he was stabbed on march 15 44 bc
00:41:22.440
as there are no historical records that describe its fate however it is possible that the toga was
00:41:26.880
either taken by one of the assassins or by someone else present at the scene it may have been left behind
00:41:30.820
or later disposed of by caesar's followers or the authorities it is worth noting that in ancient
00:41:35.080
rome the toga was a symbol of roman citizenship and was worn by male citizens in public the toga
00:41:39.540
candida which is a bright white toga was traditionally worn by candidates for public office including caesar
00:41:45.180
during his political career however the toga candida was also worn by those who sought to portray
00:41:50.700
themselves as victims of injustice or oppression some cases may have been wearing such a toga on
00:41:58.040
the day of his assassination so what happened to caesar's toga okay that's pretty it tells you we
00:42:03.940
don't we don't have the records but that's that's pretty impressive which is what we just said we just
00:42:08.540
said that we don't know what happened to caesar's toga chat gpt doesn't know what happened to caesar's
00:42:13.440
toga and then covered it up with a whole bunch of extraneous information that was not asked for
00:42:17.720
okay but that was also useful and if we were writing an article here we'd do it here we are doing
00:42:24.540
a podcast about ai and ai just didn't answer a question that no it didn't it didn't answer the
00:42:30.400
question it just gave us other information because we don't have an answer but that's my point is it
00:42:34.600
right into the world like a search okay would you feel this way about search engines you know i'm not a big
00:42:40.620
fan of search engines but because they always give you biased results no and i'm not i'm not i'm not
00:42:46.840
arguing about the bias question right okay that's i'm not talking about bias but i'm saying would
00:42:52.020
you be opposed to using a search engine as opposed to going to the library and using the dewey decimal
00:42:56.240
system no but i do miss the dewey decimal system because i'm a little old school about that and i
00:43:01.220
kind of wish i had a card catalog and that my books in my home were organized according to a card
00:43:05.500
catalog i do wish that so i will just throw that out there and i'm not i'm not
00:43:12.600
against it and i love it but i am saying though that if i went to search engine of your choice i
00:43:18.900
imagine that if we spent five ten minutes searching we would still end up at this answer my point is
00:43:23.940
it was able to give us this answer in five ten seconds well that's useful sure that's useful
00:43:31.740
as an information gathering tool is useful and if we continue to look at it as an information
00:43:37.440
gathering tool then that's great but we already saw that chat gpt you can't if you can't get them
00:43:44.160
to um to uh renounce their position you just start chipping away at different pieces of it sure yeah
00:43:50.280
so chat gpt also encouraged a man to commit suicide and he did right we saw that that's an issue uh
00:43:58.460
whatever it was it was some kind of ai talked him into it uh talked him into committing suicide
00:44:07.300
we have people who find friendships yeah well if there's demons in there i mean and there are
00:44:14.840
and there's some kind of intelligence talking other people into committing suicide there was that girl
00:44:19.940
boyfriend there are there are people that use the internet sure these are human beings but if what
00:44:28.120
we have is a if we have an ai tool yeah if we have an ai tool that is instructing people to kill
00:44:36.100
themselves then we have a problem with our tool a hammer is not going to tell you to kill yourself
00:44:40.940
a gun doesn't tell you to kill yourself right these are tools they don't tell you to do that okay but
00:44:47.340
even in that case i imagine he still had to use a hammer or a gun or or something right
00:44:54.620
yeah i don't know how he went about doing it for sure but either way he's still ultimately responsible
00:45:00.340
for that how is he ultimately responsible if the woman who told her boyfriend to kill himself
00:45:05.320
ended up in jail because i think what she got was something like a um it's the idea she sent him text
00:45:13.320
messages it's like she's an accomplice well so is this ai uh again which goes back to exactly what i said
00:45:21.440
i'm not dismissing the fact that these can be dangerous i'm also saying they can be useful
00:45:26.140
the same way that nuclear energy and nuclear technology that we said at the very beginning
00:45:30.580
that oppenheimer and that those nuclear bombs they killed a lot of people a lot of people who
00:45:35.660
were innocent and now germany is shutting down its nuclear energy and california well no right but
00:45:41.460
not a good idea um actually not a good idea just talked about this on full send but in addition to
00:45:47.980
that it can also heat homes it can power hospitals it can lower costs it can it's it's currently
00:45:54.700
defending the united states throughout the entire united states navy um it is the same technology that
00:46:00.000
is doing so yeah and i'm very pro i'm pro nuclear technology for sure and it's true you know we know
00:46:06.160
from i think jacques ellul that if the technological if the technology is invented it will be used we know
00:46:12.280
that for sure i do think however that human beings what i would say it's like chekhov's gun
00:46:18.680
yeah i think i have technological society right here well who knows anyway um but yeah you need that
00:46:26.180
need that decimal system i know i need the card catalog what i really miss is the subject catalog that's the
00:46:32.780
best one um right because it's like it's all there anyways my point is that i don't think human beings
00:46:40.820
are necessarily ready for what's in store we are not educating children about tech instead we're
00:46:46.820
educating them about being born in the wrong body we're not giving kids the tools that they need
00:46:52.780
right i mean think about your fire example going back to prometheus so when kids are going to learn
00:46:58.700
about fire they're taught about fire they do boy scouts or you teach them how to use matches you teach
00:47:04.180
them how to use the stove i started teaching my son how to cook for himself at like eight years old
00:47:09.720
because he was interested and i was like sure let's learn how to make some scrambled eggs we'll go for
00:47:13.620
it um i think that that's what you do you teach kids about this stuff we're not teaching kids about
00:47:20.200
the dangers of technology we're not teaching them how to use it we're just handing it to them
00:47:24.360
as though it's part of their personal lives and then it becomes part of their personal lives
00:47:28.720
so if we're going to teach kids about fire and how not to burn down their homes why aren't we
00:47:33.540
teaching them about tech and how not to burn down their souls couldn't agree more just so you know
00:47:40.260
libby all of my assignments have been chat gpt for like the last six months just fyi
00:47:45.500
i know for a fact that's not true but it could be it could be ladies and gentlemen no i don't think
00:47:54.900
so we're just about out of time libby where can people follow you or what are your coordinates
00:47:59.620
you can find me on twitter at libby emmons and of course you must check out the postmillennial.com
00:48:06.640
and humanevents.com if you want to go ad free you can subscribe and we would love it and you will
00:48:12.360
love it and it's the postmillennial.com slash subscribe ladies and gentlemen
00:48:18.880
as always you know what i mean you can simply show up whatever the issue is