#276: Utopia is Creepy
Summary
In his new book, Utopia is Creepy, Nicholas Carr argues that the future envisioned by many in Silicon Valley is, well, kind of creepy. And he suggests a middle path toward technology, one that doesn't reject it fully but simultaneously seeks to mitigate its potential downsides.
Transcript
00:00:00.000
Brett McKay here and welcome to another edition of the Art of Manliness podcast. A few weeks
00:00:19.240
ago, I had futurist Kevin Kelly on the podcast to discuss the technological trends that are
00:00:23.660
shaping our future from driverless cars to artificial intelligence that will make new
00:00:27.560
scientific discoveries. Kevin paints a pretty rosy picture of what's to come. My guest today
00:00:32.480
sees a different side of the coin and argues that the future envisioned by many in Silicon Valley
00:00:36.740
is, well, kind of creepy. His name is Nicholas Carr, and he's the author of several books that
00:00:41.340
critique the wide-eyed utopianism of technologists. In his book, The Shallows, he reported on the
00:00:45.440
research that shows how Google is making us dumber. In The Glass Cage, he explored the science and why
00:00:50.220
outsourcing our work and chores to computers and robots might actually make us miserable and
00:00:54.560
unsatisfied in life. And in his latest book, Utopia is Creepy, Carr pulls together all the
00:00:59.860
essays he's written over the years on how the rapid changes in technology we've seen in the past few
00:01:04.060
decades might be robbing us of the very things that make us human. Today on the show, Nicholas and I
00:01:08.980
discuss why he thinks our utopian future is creepy, how the internet is making us dumber, and why doing
00:01:14.060
mundane tasks that we otherwise would outsource to robots or computers is actually a source of
00:01:18.440
satisfaction and human flourishing. We finish our discussion by outlining a middle path approach
00:01:23.140
technology, one that doesn't reject it fully, but simultaneously seeks to mitigate its potential
00:01:27.900
downsides. After the show's over, check out the show notes at aom.is/utopiaiscreepy,
00:01:33.260
where you can find links to resources where you can delve deeper into this topic.
00:01:46.720
So I've long been a fan of your work, The Shallows, The Glass Cage. Your new book is Utopia is Creepy,
00:01:52.820
which is a collection of blog posts and essays you've written over the years about
00:01:57.840
technology's influence on our cognition, how we think, our culture, our autonomy, the gamut.
00:02:05.260
So let's start off with this broad question. One of the criticisms you make of Silicon Valley
00:02:09.140
in particular is that they're not just selling us gadgets and software, and that's what we think
00:02:13.140
they're selling us, but you argue they're also selling an ideology. What is that ideology,
00:02:19.780
you know, and why do you think it's bad for human flourishing?
00:02:24.520
It's an ideology that has deep roots in American culture and American history. There's a strain of
00:02:32.380
technological utopianism that runs through United States thinking going back a couple of
00:02:40.760
centuries. And it assumes that technological progress will solve all our problems and
00:02:54.340
bring us to some kind of paradise on earth. And second, and more insidious, I think it assumes that
00:03:00.900
we should define progress as technological progress, rather than, as I think would be better,
00:03:07.440
looking at technology as a tool that moves us forward to
00:03:14.880
some broader definition of progress: cultural, social, economic, or whatever. With Silicon Valley,
00:03:21.200
I think this long tradition of techno-utopianism gets a new twist, a new ideology that is all
00:03:30.180
about venerating the virtual over the real. So I think, on the one hand, Silicon Valley is very
00:03:39.180
materialistic. It wants to measure everything. And what can't be measured is kind of not even worth
00:03:44.960
keeping track of or giving any value to. But at the same time, it hates the materiality of the world
00:03:52.120
and even the materiality of the human body. It believes that by virtualizing everything,
00:03:57.860
by running the world with software, and perhaps even creating a new virtual world out of software,
00:04:04.560
we will solve the messiness, the emotionalism, and so forth that characterizes
00:04:12.240
human beings, and also the messiness that characterizes the physical world. So I think
00:04:18.760
Silicon Valley has this kind of misanthropic ideal that physicality is the problem that we need to
00:04:27.660
solve. And if we can turn everything into algorithms and even turn ourselves into artificial intelligence,
00:04:33.460
we'll be better off. Right. They're all about making things frictionless.
00:04:38.760
Right. And it turns out, I would argue that friction is what gives interest and fulfillment
00:04:46.440
and satisfaction to our lives. It's coming up against the world and figuring it out, figuring out
00:04:52.300
how to act with agency and autonomy, gaining talents and skills, all these things that emerge from coming
00:04:59.900
up against hard challenges and friction. This, I think, is what gives interest
00:05:04.960
to our lives. And I think the tech industry sees all of this as something to get rid of. The less friction
00:05:15.060
there is, the more, you know, everything will run efficiently and we won't come up against challenges
00:05:21.020
or hard work or things that might make us fail. But it seems to me that that's a recipe for removing
00:05:29.340
satisfaction and fulfillment from our lives. Right. And that's why you think utopia is creepy,
00:05:34.360
or at least how Silicon Valley envisions it? Well, I think beyond Silicon Valley, I would argue that
00:05:41.480
all kinds of visions of utopia tend to be creepy. There's this famous concept of the uncanny valley
00:05:50.320
in robots. So what that says is that the more humanoid a robot becomes, the creepier it becomes,
00:05:59.840
because we humans are social beings, and so we're very good at picking
00:06:07.760
up signals from other living things. And it becomes immediately clear when there's a robotic
00:06:15.720
attempt to mimic a human being that this is not a human being and we get creeped out. And that's one
00:06:21.500
of the big problems that roboticists have as they try to create humanoid robots: these always
00:06:28.180
seem creepy to us. And that's the uncanny valley that it's very hard for roboticists to cross.
00:06:34.180
And I think something very similar happens in portrayals of utopias. Because almost by definition,
00:06:42.140
there's no tension in a utopia, no friction. Everybody behaves very well. Everybody is on their
00:06:50.620
best behavior all the time. And when you see that, all of a sudden you realize that in
00:06:58.000
utopias, people act very robotically. They're very efficient. They have no messy emotions. They don't
00:07:04.740
get angry. And I think that that characterizes utopias in general, which is one of the reasons
00:07:11.460
that in science fiction, we're much more drawn to dystopian portrayals of the future as this horrible
00:07:17.380
mess. Whereas attempts in fiction or in movies or whatever to create a vision of utopia always turn
00:07:24.700
out to be more repellent than the dystopias because we don't see any human qualities there.
00:07:31.900
And I think the ideal that Silicon Valley has, the utopian ideal where everything is very efficient
00:07:37.780
and runs on software, is very much this kind of creepy ideal that in order to achieve it,
00:07:44.500
you have to drain human beings of what makes them human.
00:07:49.060
And why do you think they still have that drive? I mean, why don't they see the creepiness of it?
00:07:52.780
Like, I look at them like, man, that's weird. Like, why don't they see it?
00:07:57.240
Well, you know, I think some of it comes from their personalities. And,
00:08:04.000
you know, this is a generalization, but I think it's to some extent true:
00:08:10.360
a lot of the leaders in Silicon Valley have spent most of their lives interacting with the world
00:08:16.540
through computer screens, and that suits their personality. They're not necessarily open to
00:08:22.160
ambiguity, or to messy emotions or the kind of social conflicts
00:08:30.580
that come whenever you engage with people face to face and with the world. So
00:08:37.460
I think their ideals reflect a personality that is very comfortable
00:08:45.980
with computers and very comfortable with software and programming. But when things aren't programmed
00:08:53.300
and things happen unexpectedly and perhaps inefficiently and ambiguously, they draw
00:09:01.300
away from those things. So I think, in some sense, the world that Silicon Valley
00:09:07.020
wants to create, and to me it's a very robotic world, is the world that these people
00:09:13.440
actually want to live in. That's interesting. It's just strange because, you know,
00:09:18.200
one of the promises of the internet was that it would be this sort of
00:09:22.660
utopia, you know, where you could have different
00:09:26.440
types of viewpoints, different types of ideals all together that anyone can access. But the
00:09:31.260
way it's worked out is we have these people at the top who are actually structuring the internet
00:09:36.200
in a way that suits their personality and the way they like things. And we have to go along with it.
00:09:41.240
Right. And, you know, when I put together Utopia is Creepy,
00:09:48.200
this collection, one of the things I did is read through my blog going back a dozen years. And,
00:09:52.900
I realized that a lot of the dreams about the internet, ones that a lot of us
00:09:59.940
held when we first started going online, not only haven't panned out, but in many cases,
00:10:06.020
the opposite has happened. So we thought that, you know, by going online, we'd bypass
00:10:12.480
centralized hubs of power, whether it's political power, media power, economic power.
00:10:21.280
And we'd have this great democratization where everyone would have their own voice,
00:10:26.420
and we'd listen to lots of different viewpoints. And what's happened really is, we've seen
00:10:31.720
an incredible centralization of power, power over information. So you get a handful of
00:10:38.820
companies like Google and Facebook and Amazon and so forth that now control huge amounts
00:10:45.500
of the time people spend online and huge amounts of the information that they get. And so
00:10:51.760
more and more of our experience is being filtered by these companies. And needless to say,
00:10:58.460
they're motivated not only by their ideology, but by their desire to make money. These
00:11:03.840
are profit-making companies, of course. And so I think a lot of the hopes about
00:11:10.880
democracy, the democratization of information, about people broadening their viewpoints, have not
00:11:17.920
panned out. And I think what we're learning is that when we're bombarded by
00:11:22.980
information the way we are these days, we actually become less open-minded and more polarized
00:11:29.160
and more extreme in our views. And I think we saw that, you know, in the recent election, and I think
00:11:34.140
we see it in political discourse, that our hopes about how society
00:11:41.860
and ourselves would adapt to having this constant bombardment of information just haven't panned out.
00:11:47.480
And now we have to struggle with consequences that we didn't foresee.
00:11:52.060
Right. And I think one of the insidious things about the internet, or at least the way it's
00:11:55.900
structured, is that it gives us the illusion that, um, we have freedom, right? Like we can spout our
00:12:01.600
opinion on Facebook or Twitter, and we think we're participating in democracy and that we're, you know,
00:12:07.360
expanding our viewpoints. But you argue, I mean, I think you just made the point, that it actually is
00:12:12.720
an illusion. It actually reduces our autonomy and our agency.
00:12:17.660
I think that's right. Some of that is simply because we become more reflexive
00:12:24.440
when we have to process so much information, so many notifications and alerts and
00:12:29.640
messages so quickly that we have to deal with it in a superficial fashion. And so we may
00:12:35.860
think we're, you know, being participative if we click a like button or retweet something,
00:12:42.080
but really this is a very superficial way of being involved and participating.
00:12:49.080
And it's on terms determined by the technology companies, by the social media platforms.
00:12:55.700
Um, you know, it's in their interest to get us kind of superficially taking in as much information
00:13:02.660
as possible because that's how they collect behavioral information. That's how they get
00:13:06.680
opportunities to show us ads. And, you know, I would argue that, on the one hand,
00:13:14.560
you know, there's the great benefit of the internet, which is that it does give us access to,
00:13:19.000
to information and to people that used to be hard or impossible to access.
00:13:24.460
On the other hand, what it's stolen from us is kind of the contemplative mind, the calm mind
00:13:32.000
that takes in this information, our experiences, and our conversations, and quietly makes sense of
00:13:38.620
them. And I think that ultimately, you know, you need that space in which to think
00:13:48.120
by yourself and without interruption, without distraction, in order to develop a rich point of
00:13:53.700
view and hence, you know, express yourself and communicate yourself
00:13:59.900
in a rich way, rather than this reflexive way that we've adapted to online, which does give us this
00:14:06.960
illusion that we're constantly participating, constantly part of the conversation, but really
00:14:12.060
kind of ends up narrowing our perspective, makes us more polarized, makes us quicker to reject,
00:14:20.340
you know, information that doesn't fit with our existing worldview. Um, so I do think there's
00:14:25.760
this kind of illusion of thinking and illusion of participation that often isn't the reality of
00:14:34.100
what's going on. Right. And along those lines of participation, you know, one of the benefits
00:14:39.920
that technologists tout about the internet is that it democratizes the ability to create
00:14:45.040
content, right? We're no longer just consumers. We're also creators, but what people forget is
00:14:49.780
that you're creating that for the company, that you're kind of working for the company for
00:14:53.960
free, right? When you create Amazon reviews or create YouTube videos or create content on Facebook
00:14:59.180
or Twitter. That's right. And this is something, I coined the term digital sharecropping
00:15:05.960
to describe it, kind of an ugly term, but I think it captures this: that, you know,
00:15:12.100
whether it's Google's search engine or Amazon reviews or the entirety of
00:15:16.560
Facebook or Twitter, essentially the content that these companies use to make money off of is the
00:15:23.380
content we create. So, similar to a sharecropping system where, you know, a plantation owner would
00:15:28.640
give a poor farmer a little plot of land and some tools, and then would take most of the economic
00:15:33.580
value of any crops that were grown, we're given by these social media platforms these little
00:15:39.100
plots of land to express ourselves and to develop our profiles and to share and so
00:15:47.180
forth. But all of the creativity that goes into that is monetized by the company. So we become
00:15:54.620
these people who create the content without getting any monetary compensation
00:16:02.180
for it, which allows companies like Facebook and Google to become enormously rich. And that's
00:16:08.420
not, you know, to dismiss some of the opportunities that the web does
00:16:14.640
really give us to express ourselves. I mean, as I say, I've written a blog for a long time and I
00:16:19.560
enjoy that. And I feel like I've, you know, been able to clarify my own thoughts as well as speak to
00:16:26.300
an audience that I might not have, but I do think we need to recognize the kind of economic dynamic
00:16:32.480
that underlies a lot of what we do online, and how in a very real sense, even though we don't notice
00:16:39.900
it, we are being exploited and manipulated, kind of outside of our own consciousness. Sometimes
00:16:48.520
we're kind of employed without pay to create these huge, very powerful and very rich companies,
00:16:55.140
as well as very rich owners of them. Right. Another criticism you made that I thought was funny is
00:17:01.180
the idea that, oh, if you democratize that, well, we'll suddenly have all this great stuff, this great content.
00:17:06.160
But most of the content that's, you know, amateur-created is crap. I mean, I know
00:17:11.800
it's mean to say, but like Instagram comedians are the worst. I don't understand why people think that's
00:17:19.240
funny, but apparently it's a thing. Yeah. And sometimes it shows that the audience,
00:17:27.720
you know, when they get free stuff, they'll look for the most superficial
00:17:32.960
kind of buzz, and that'll be enough, because nobody's encouraged to
00:17:38.600
spend time developing taste or thinking too carefully about things. And you know,
00:17:44.260
this kind of dream that everybody, you know, is going to be a great writer or a great
00:17:51.620
filmmaker or a great musician, unfortunately it's just not true. And I
00:18:00.360
think a lot of people understand that. Uh, I certainly understand that, you know, I'm not
00:18:05.040
going to be a great songwriter and so forth. So again, this is another illusion that the web
00:18:10.340
sometimes gives us, that creating is always better. What the web tells us is this kind of myth of
00:18:17.940
participation that, you know, if you're just passively reading something or watching a movie
00:18:22.980
or a TV show or listening to a podcast, there's something wrong with that. I would argue that
00:18:27.600
that's exactly the opposite, that a lot of the greatest satisfactions come from being a member of
00:18:34.320
an audience for good stuff. We shouldn't
00:18:41.060
mistake a rich experience of other people's creative work for a passive experience.
00:18:47.520
It's actually very active, as anybody who's read a great novel or watched a great play or anything
00:18:53.160
knows. And so, you know, the web and a lot of the companies on the web kind of encourage us
00:19:01.060
to think that we have to be active and participative and creative all the time. Well,
00:19:07.140
that's very important, but let's not lose sight of the great pleasures of being an audience for
00:19:13.300
really good stuff. So in The Shallows, you take a look at how the web has changed our
00:19:19.500
brains, and you began the book talking about how you've noticed your brain
00:19:24.300
change over the years. Like you can't focus as much; it's hard to sit down and read a book
00:19:30.000
for a long period of time. And one of the arguments you make in that book is that every
00:19:34.400
information technology, we're talking the alphabet, the book itself, and the internet,
00:19:40.720
carries with it an intellectual ethic. What's the ethic of the internet?
00:19:46.880
Yeah. And what I mean by that is that all of these media technologies
00:19:53.280
encourage us to think in a particular way, and also not to think in other ways that they
00:20:00.680
don't support, and that this is the ethic. I think what the digital technologies in general
00:20:07.140
and the internet specifically value is information gathering as an end in itself.
00:20:16.100
And so what it says is: the more information that we have access to, the faster that we are able
00:20:21.740
to process it, the more intensively it bombards us, the better; that
00:20:28.440
more information is always better. And what's lost in that, I think, is what everyone
00:20:34.680
used to understand, which is that information gathering is very important, but it's only
00:20:39.660
the first stage in developing knowledge, and certainly an early stage
00:20:45.720
in developing wisdom, if we ever get to that, but knowledge isn't about just gathering information.
00:20:51.300
It's about making sense of information: going out, having experiences, learning stuff,
00:20:57.740
reading the news, taking in information, but then backing away from the flow of information
00:21:04.140
in order to weave what you've learned together into personal knowledge. And this is what's
00:21:10.980
lost, I think, in the intellectual ethic of the internet: this sense that there are times
00:21:17.060
when you have to back away from the act of information gathering, if you're going to think
00:21:23.280
deeply and if you're going to develop a rich point of view, if you're going to develop a rich store
00:21:27.780
of knowledge. You can't do it when you're actively distracted and interrupted by incoming
00:21:33.740
information. So I think the internet is very, very good as a tool for information
00:21:41.800
gathering. But what it encourages us to do is to think that we should always be in the process of
00:21:47.180
gathering information. And I think that's the danger that the web presents.
00:21:54.660
Not very well sometimes. I mean, this is also something I talk about in The Shallows. I think as
00:22:01.500
human beings, we have this primitive instinct to want to know everything that's going
00:22:07.300
on around us. And I think this goes back to, you know, caveman and cavewoman days, when
00:22:13.100
you wanted to scan the environment all the time because that's how you survived. And so we
00:22:18.340
bring this deep information gathering compulsion into this new digital world where there's no end
00:22:25.040
to information. And as a result, and I think all of us see this in ourselves, we become very
00:22:30.780
compulsive about wanting to know everything that's going on, you know, on Facebook or in news
00:22:38.420
feeds or through notifications and so forth. And so we kind of constantly pull out our phone or our
00:22:44.260
computer and look at it, even if it's completely trivial information. So I think there's this deep
00:22:49.620
instinct that the net and technology companies tap into, that can become
00:22:57.140
kind of counterproductive, that keeps us gathering information and glued to the screen.
00:23:04.000
And so to me, the only way I've found to combat this is to resist some technology. So for instance,
00:23:14.460
I'm not active on Facebook or on most social media. And it's not because I don't
00:23:21.100
see the benefits of social media. It's because I know that these systems are designed
00:23:27.140
to tap into this instinct I have to want to be distracted and interrupted all the time.
00:23:32.520
And in order to avoid that, I just have to say no. I'm going to lose the benefits of
00:23:38.020
Facebook. I mean, one thing you realize when you're not on Facebook is, for instance, nobody wishes
00:23:43.040
you happy birthday anymore. But nevertheless, it does seem to me
00:23:48.060
that if you value the contemplative mind, the introspective mind, the ability to follow
00:23:55.740
your own train of thought without being interrupted sometimes, then you have to shut down some of
00:24:00.900
these services and make the social sacrifices that are inherent in shutting down services that
00:24:07.640
increasingly are central to people's social lives. So at this point, it's a very
00:24:13.900
difficult challenge to kind of bring more balance into your behavior, into your mind, into your intellect.
00:24:21.320
But to me, at least, my hope is that I can raise awareness that there are sacrifices that
00:24:28.740
are being made here, and that we should be a little more diligent in figuring out which of
00:24:33.960
these technologies are really good for us, making us more intelligent and happier and more fulfilled,
00:24:39.600
and which are simply tapping into this compulsive behavior that we often evidence.
00:24:44.600
So, you know, one assumption that technologists have is that we'll be able to find technology
00:24:50.600
to fix problems, even problems caused by technology. I mean, do you think someone in Silicon Valley
00:24:56.240
will come up with something to fix the problem of the distractibility of the internet?
00:25:02.080
I think there are technologists who are trying to do that. I mean, I think we've seen
00:25:07.900
an increasing awareness among the public and among, you know, people in Silicon Valley or in
00:25:15.780
other technology companies outside of Silicon Valley that this is a problem, that we have
00:25:22.600
created a system that, you know, has huge benefits and huge potential, but increasingly it is keeping
00:25:30.200
people distracted and thinking superficially, and often, you know, polarized and unable to
00:25:38.780
give credence to points of view that don't fit their own. And so I think you
00:25:44.540
see attempts to create apps or other software programs that reduce the intensity of the flow of
00:25:52.740
information, that vary the flow of information, turn off, you know, some feeds
00:25:59.940
at times when people might get more out of thinking without distraction
00:26:09.080
and being alone with their thoughts than looking into a screen, kind of creating a more
00:26:14.720
unitasking environment where there aren't lots of windows and lots of tabs and lots of notifications
00:26:22.140
going. The problem is that these are a hard sell, because we've adapted ourselves very
00:26:31.260
quickly to this constant bombardment of information, and to this sense that we're missing out
00:26:39.120
if we're not on top of everything that's happening all the time. So I do think, and I think
00:26:45.640
we see this historically, that often technology rushes ahead and creates problems that
00:26:52.560
were unforeseen, and you can solve some of those problems with new technology. We certainly
00:26:57.620
see it in driving, for instance, with the creation of seatbelts and all sorts of technologies
00:27:03.740
that kind of make cars safer and so forth. But there's always this kind of tension
00:27:12.660
between the momentum a technology gains as it moves ahead and as we adapt ourselves to it and,
00:27:19.980
and the need to sometimes back up a little bit to redesign things to better fit with
00:27:27.920
what, you know, makes us interesting, well-rounded people. So I think there are ways
00:27:36.320
to deal with some of these problems technologically, through better design of systems, better
00:27:42.560
design of software. The question is, will we, as the public, accept those
00:27:51.940
technological advances, or are we stuck in this pattern of behavior that has been inspired
00:27:57.700
by the technology and the companies that are dominant in the technology?
00:28:02.620
Right. The other issue is there's really no money in that, right?
00:28:05.040
Yeah, that's right. One of the big issues is that we have set up the web and
00:28:13.740
social media and so forth as an advertising-based system. If we were paying for these
00:28:19.680
things... You know, there was a time, in the era of the personal computer, where if
00:28:26.480
you wanted to do something with your PC or your Mac or whatever, you'd go out and you'd buy a
00:28:31.380
piece of software and you'd install it and then you'd use it for whatever you wanted to accomplish.
00:28:37.040
And that was actually a pretty good model. And we've abandoned that model for a model of, you know,
00:28:42.860
give it to me free, but distract me with ads and collect information about me. And getting
00:28:49.000
away from that, you know, would mean actually having to pay for stuff. And we've so
00:28:56.020
adapted ourselves to the idea that, you know, everything is free that, boy, getting
00:29:02.600
people to pay for something that they could get for free is a really, really hard
00:29:07.420
sell. All right. So in The Glass Cage, you take a look at artificial intelligence, and this is the
00:29:13.360
stuff that creeps me out the most. I had Kevin Kelly on the podcast last week,
00:29:20.580
and he's pretty gung ho about this and it's great, but AI gives you pause.
00:29:28.420
Well, for a number of reasons. And again, you know, I don't
00:29:35.540
want to come off as just reactively against progress in computing and progress in AI,
00:29:42.440
because I think there are ways that we can apply artificial intelligence that would
00:29:47.660
be very good, and that would help us out and help us avoid some of the
00:29:52.560
flaws in our own thinking and our own perspectives. But first of all,
00:30:00.220
the definition of AI has gotten really fuzzy. So it's hard to know, you know,
00:30:06.640
these days technology companies call pretty much everything AI. But where I see the
00:30:12.340
problem with artificial intelligence, as it begins to substitute for human intelligence
00:30:19.060
in analyzing situations, making judgments about them, making decisions about them,
00:30:26.260
is that it begins to steal from us our autonomy, our agency, and also steals from us opportunities
00:30:33.520
to build rich talents of our own. And I think we can see this in a simple way with
00:30:41.200
navigational technologies, you know, Google Maps or GPS systems in your car,
00:30:50.860
that on the one hand make it very easy and kind of mindless to get from one place
00:30:57.260
to another. But as a result, we don't develop our own navigational skills. And also we don't
00:31:03.040
pay attention to our surroundings. And so don't develop a sense of place. And it turns out that
00:31:08.620
those types of things, the ability to make sense of, of space and of place and to be attuned to your
00:31:14.400
surroundings is really pretty important to us. I mean, we are physical creatures in a physical world
00:31:20.320
and we have evolved to be part of that world. And so what we don't in, in our drive to, to make
00:31:26.300
everything more convenient and easier, often we sacrifice the things we take for granted,
00:31:33.400
which are all about learning to navigate the world and have agency in the world and have autonomy in
00:31:39.320
the world. We kind of take those for granted. And so we're very quick to lose them in order to gain a
00:31:44.860
little bit more efficiency or a little bit more convenience. And it does strike me that
00:31:55.260
beyond the doomsday scenarios or the utopian scenarios of the singularity and
00:32:01.880
computers overtaking human intelligence, at a practical level the danger is that as computers
00:32:09.760
become more able to sense the environment, to analyze things, to make decisions, we will
00:32:16.380
simply become dependent on them and we'll lose our own skills and our own talents in those regards,
00:32:21.840
our own ability to make sense of the world and to overcome difficult challenges;
00:32:27.260
we'll simply turn on the machine and let the machine do it. And unfortunately that's a
00:32:34.660
scenario that gives us great convenience and great ease, but also, I think, and this goes back to
00:32:41.140
something we talked about earlier, steals from us the opportunity to be fulfilled as human
00:32:41.140
creatures in a physical world. Yeah. Like with self-driving cars, I still don't get it,
00:32:46.960
because I enjoy driving. I don't know why I'd want to give that up. Everyone's like, well,
00:32:51.500
it's safer, you can be more productive. It's like, I actually enjoy driving.
00:32:55.600
I completely agree with you. Even though I
00:33:01.080
realize that there are ways, and this has been a long story with automobiles, there are ways
00:33:05.780
for technology to make driving safer, and I think that's very, very important, the fact is that
00:33:10.780
something like 70 to 80% of people actually enjoy driving. And it's
00:33:19.940
not like they're blind to the dangers and to traffic jams and to road rage and all
00:33:26.640
the horrible things that come with driving, but there's something very pleasant about driving.
00:33:31.440
It's actually one of the rare times that we as individuals are actually
00:33:38.080
in control of a very sophisticated machine. And there's pleasure that comes with that,
00:33:45.060
and there's this sense of autonomy and a sense of agency. And in some ways, this is a
00:33:51.840
kind of a microcosm of the Silicon Valley view. I think Silicon Valley
00:33:58.400
is totally unaware of the pleasure that people get from things like driving. And that
00:34:07.440
leads them to simply see driving as a problem that needs to be solved, because there
00:34:13.620
are accidents, because there are inefficiencies, because there are traffic jams. All of that
00:34:18.960
is what they focus on, and so their desire is to relieve us of what to them is this
00:34:25.880
horrible, horrible chore of driving a car. They don't realize that for a lot
00:34:33.200
of people, driving is really a great pleasure, and owning a car and all of
00:34:39.460
that. And to me, that puts in a nutshell the tension between the
00:34:48.180
Silicon Valley ideal and how people actually live and how they get some satisfaction out of life.
00:34:53.400
Right. So again, it's this idea that they're giving us freedom, but
00:34:58.920
we have to give up freedom to get that freedom, right? They're freeing
00:35:05.100
us from that which makes us feel free, you could say. But then we find
00:35:10.540
out that we actually enjoyed those burdens, and when they're finally taken away, we feel existentially
00:35:15.520
empty, like, oh, I don't do anything. Right. And I do think that there is
00:35:20.540
some evidence, and I think this comes both from psychological studies and from
00:35:25.940
our own experience, that when we're freed of labor and freed of effort, we actually become
00:35:34.060
more anxious and more nervous and more unhappy. And it turns out that
00:35:40.240
it's the chores that software frees us from that are often the things that bring us
00:35:46.340
satisfaction in our life, that the experience of facing a difficult challenge
00:35:52.480
and developing the talents required to overcome that challenge is very deeply satisfying.
00:35:57.760
And yet, if you look at the goal of software programmers these days, it's to find any place
00:36:04.920
where human beings come up against hard challenges and have to spend lots of time overcoming them,
00:36:09.780
and to automate that process. So that's why, in many cases,
00:36:17.600
we think our lives are going to be better when we hand over some chore or some task or
00:36:23.980
some job to a machine, but actually we just become more nervous and anxious and unhappy.
00:36:30.780
Right. What's your take on virtual reality? Because it's crazy. I remember back in
00:36:35.880
the nineties, I'd go to the science museum and they had the VR thing where you could go through
00:36:41.140
the human digestive system. That was the thing. And I thought, this is
00:36:46.080
the future, this is amazing. And then it died, didn't go anywhere. Now we're seeing this
00:36:49.960
resurgence. Does virtual reality give you pause? Do you think it's going to catch on this
00:36:55.560
time? Well, I mean, there's the kind of physical
00:37:00.300
question of how long people can be in a virtual environment without getting nauseous or
00:37:07.300
dizzy or whatever. So let's assume that that will be solved, that we'll figure out
00:37:13.880
how to create systems of virtual reality that are actually
00:37:20.400
pleasant to be in. Well, I think it will have successful applications. I can see
00:37:28.420
it in gaming. I can see it in certain marketing applications. If you're
00:37:34.800
looking to buy a house or rent an apartment, you'll be able to put on your virtual
00:37:38.780
reality goggles and walk through the space. You can certainly see applications in pornography,
00:37:43.900
which will probably be one of the first to come along. But there's this belief,
00:37:48.560
Mark Zuckerberg of Facebook, I think, has stated it,
00:37:54.900
that virtual reality goggles will become kind of the next interface for computing, so we'll
00:38:00.760
spend lots of time with our goggles on, in some kind of virtual
00:38:07.120
reality, in order to socialize, and that's what social media will become
00:38:14.240
and what personal computing will become. I don't think that's going to happen, because,
00:38:17.980
and I think you see signs of this in the failure of Google Glass, there's
00:38:23.100
some deep instinct within us, and probably within all animals, that resists
00:38:31.480
having something else control our senses, something getting in the way of our field
00:38:38.640
of vision. I think we can do it for brief periods. We can do it when we're playing a game or when
00:38:43.460
we want to accomplish a particular task that can be accomplished through virtualization of
00:38:49.400
space or whatever. But I don't think we're going to see people walking around with virtual
00:38:55.700
reality goggles or even with simpler projection devices.
00:39:03.120
I'm very dubious about smartphones being replaced, for instance, by virtual reality systems.
00:39:11.420
Right. Because it looks goofy and looks creepy.
00:39:13.100
It looks goofy and it looks creepy. And you also feel vulnerable. You feel weird
00:39:18.900
when you're cut off from
00:39:26.260
the actual real world and embedded in a world that somebody else is manipulating. I mean,
00:39:33.020
it's disorienting, and I think we're repulsed by it after a while.
00:39:38.480
Well, Nicholas, for people who are listening to this and agree with you, like,
00:39:41.740
yeah, utopia is creepy, I'll take some of this utopia
00:39:46.920
they're offering because there are some benefits to it, but there are parts of it where I just know
00:39:50.600
I don't want to go. Is it possible to opt out? I guess it is possible. You don't do
00:39:55.860
social media. Any other ways to opt out?
00:40:00.020
I mean, I do a little. For instance, I'm on Instagram, but I have a private account with
00:40:05.180
a handful of close friends and family members, and it's really good.
00:40:10.080
If you place certain restrictions on social media, I think it can be
00:40:16.080
very, very valuable and very fun. So I'm not arguing for total opting out,
00:40:23.160
as if that were even possible. I mean, I think one thing we know about the internet and computers
00:40:29.540
and smartphones is that it's actually very, very hard to live these days without those
00:40:36.200
kinds of tools, because society as a whole, our social systems, our work systems, our educational
00:40:42.080
systems, have rebuilt themselves around the assumption that everybody's pretty much always
00:40:48.240
connected, or at least has the ability to connect very frequently. So some
00:40:53.400
people will opt out totally, just as some people opted out of television totally and so forth.
00:40:58.740
But I think those will be people on the margins. For most people,
00:41:04.780
the challenge is really more about developing a sensibility of resistance rather
00:41:14.240
than a sensibility of rejection. Often techies will quote Star Trek and
00:41:21.780
say, resistance is futile. You know, the Borg of the internet is going to take us all over, so just give
00:41:26.580
in to it. I think that's absolutely the wrong approach. I think it's valuable to resist
00:41:33.500
these kinds of powerful technologies. And this is a powerful technology. It's a media technology
00:41:41.060
that wants to become the environment in which we exist. And I think it's important to resist.
00:41:47.160
And by resist, I mean, instead of being attracted to whatever's new, to the latest novelty,
00:41:53.880
the latest gadget, the latest app, always pause and ask: how am I going to use this? How
00:42:00.180
do other people use this? Is this going to make my life better? Am I going to be
00:42:06.540
happier? Am I going to feel more fulfilled and more satisfied if I adopt this technology? Or am I just
00:42:13.520
going to be more distracted, more dependent on technology companies, less able to
00:42:21.560
follow my own train of thought, less well-rounded? And I think if we just start asking these
00:42:27.640
questions, and everybody's going to have different answers to them, but if we start asking
00:42:33.060
these questions, I think we can become more rational and more thoughtful in what technologies
00:42:39.340
we adopt and what technologies we reject. And ultimately I think that's the only way to
00:42:45.760
balance the benefits and the good aspects of the net and all related technologies
00:42:53.120
against the bad effects. And by now I think we all know that there are bad effects, that this isn't
00:43:00.360
just a story of everything getting better. It's a story about costs and
00:43:08.100
benefits, and we have to become better at balancing those. And that really does mean becoming more
00:43:13.980
resistant and more skeptical about the technology and the promises being made about the technology.
00:43:19.500
Well, Nicholas, this has been a great conversation. Where can people learn more about your book and
00:43:23.220
your work? Well, you can go online. My personal site is nicholascarr.com, where
00:43:31.280
you can find information about my books and links to my articles and essays,
00:43:35.840
and my blog, which I still write, though not as intensively as I used to, is called Rough
00:43:41.660
Type. You can find that at roughtype.com. Awesome. Nicholas Carr, thank you so much for your
00:43:47.080
time. It's been a pleasure. The pleasure was all mine. Thank you. My guest today was Nicholas Carr.
00:43:51.840
He's the author of several books, including The Shallows, The Glass Cage, and Utopia Is Creepy. All of
00:43:56.900
them are available on Amazon.com and in bookstores everywhere. You can find more information about
00:44:00.320
Nicholas's work at nicholascarr.com. That's Carr with two Rs, C-A-R-R. Also check out our show notes
00:44:06.380
at aom.is/utopiaiscreepy, where you can find links to resources to delve deeper into this
00:44:11.560
topic. Well, that wraps up another edition of the Art of Manliness podcast. For more manly tips and
00:44:27.660
advice, make sure to check out the Art of Manliness website at artofmanliness.com. The show is
00:44:31.980
recorded on ClearCast.io. If you're a podcaster who does remote interviews, it's a product that I've
00:44:36.380
created to help avoid the skips and static noises that come with doing interviews over Skype. Check it
00:44:41.660
out at clearcast.io. As always, we appreciate your community support. And until next time, this is Brett