#267: The Technological Forces That Are Shaping Our World
Summary
In this episode of the Art of Manliness podcast, my guest has written a book laying out his idea of what the future looks like and the 12 technological forces that are shaping it. His name is Kevin Kelly, and he's the founding Executive Editor of Wired Magazine and the former Editor of the Whole Earth Catalog. He also served as a futurist consultant on Minority Report, and he's spent his career thinking and writing about how technology, particularly the web, intersects with culture, business, and politics. In his latest book, The Inevitable, Kevin takes a look at 12 technological trends that are steering our future and provides a glimpse of what that future might look like.
Transcript
00:00:00.000
brett mckay here and welcome to another edition of the art of manliness podcast where we're living
00:00:19.180
in a time which the landscape is changing quickly thanks to technology steady jobs provided a living
00:00:24.660
for our fathers and grandfathers no longer exist and jobs that didn't exist 10 years ago are now
00:00:29.920
providing paychecks for hundreds of thousands of people and even the way we consume has changed in
00:00:34.460
the past 10 years thanks to streaming digital services and rental services like uber and airbnb
00:00:38.900
but where are these technological trends taking us how will they shape the future in 10 20 and even
00:00:45.280
30 years down the road well my guest today has written a book where he lays out his idea of what
00:00:49.540
the future looks like his name is kevin kelly he's the founding executive editor of wired magazine and
00:00:54.160
the former editor of the whole earth catalog he's also a consultant on minority report as a futurist
00:00:59.400
and he spent his career thinking and writing about how technology particularly the web how it
00:01:04.480
intersects with culture business and politics and in his latest book the inevitable kevin takes a look
00:01:09.740
at 12 technological forces that are shaping our future provides a glimpse of what that future
00:01:14.340
might look like today on the show kevin and i discuss the process he uses in making predictions
00:01:19.100
about the future the misconceptions he thinks people have about artificial intelligence why people are
00:01:25.040
likely going to own less stuff in the future and the business opportunities that will emerge as time
00:01:30.080
marches on we also discuss the technological trends that worry kevin the most so if you're looking for
00:01:36.860
a roadmap to navigate the brave new world we're entering you don't want to miss this podcast after
00:01:41.820
the show's over check out the show notes at aom.is/inevitable kevin kelly welcome to the show hey it's
00:01:57.120
my pleasure thanks for inviting me i'm really excited to have you here because you came out with a new book
00:02:01.480
called the inevitable um and it's all about the 12 technological forces that are shaping our future
00:02:07.820
and before we get into um the specifics of your book let's talk about your career in general because
00:02:14.300
it's fascinating you um worked for whole earth magazine you were the founding executive editor of
00:02:19.180
wired magazine so you've been spending your career thinking and writing about technology particularly the web
00:02:25.480
and how it's influencing culture economics business law governments etc and oftentimes you make
00:02:34.760
predictions about where you see these trends are going to take us so i'm curious as a futurist as a
00:02:41.960
prophet which is a hard job to have um what's the process you go through when you you sort of lay out
00:02:48.180
your vision of where these trends are going with technology and how it's going to shape our future
00:02:53.240
because it's easy to get wrong in fact it's very likely that i'm wrong even today
00:03:00.760
the general rule of thumb would be that anything specific is basically inherently unpredictable
00:03:13.480
what i'm trying to look at are the kind of biases the leanings in technologies which derive from
00:03:23.000
the fact that they're physical systems and that they run according to the laws of physics or chemistry
00:03:28.680
and that kind of constrains where they can go so i look for those biases which will shape
00:03:39.560
the larger form and direction and that's all that you can really predict
00:03:50.040
so so i would say in a certain sense um like a quadruped four legs on an animal
00:03:57.640
that's that's something that's inevitable and just like we have four wheel vehicles that kind of
00:04:05.640
larger form is inevitable but you know a particular quadruped like a zebra is inherently unpredictable
00:04:12.840
so telephones were inevitable once you had electricity and wires but the iphone was not
00:04:20.040
the internet was inevitable and would occur on any planet in any political regime once you
00:04:27.320
have telephones but you know twitter is not so what i'm looking for are these inherent
00:04:38.680
trends biases leanings that occur in their kind of directions and i'm trying to identify those and
00:04:48.280
where i look for those is in places where the technology is unsupervised outlawed prohibited or
00:04:57.880
used unsupervised you know by kids it's sort of where you can listen to it
00:05:03.880
kind of be what it wants to be and it kind of exhibits um
00:05:10.040
its true color so to speak and you can see this you can listen to where it wants to lean so so that's
00:05:15.320
where i'm looking i'm looking kind of at the edge of technology to see where the center will go
00:05:21.240
right so you're not like looking for specifics because that's hard to predict it's impossible
00:05:25.640
to predict and so if people want to know well is apple going to succeed will it continue like
00:05:32.200
we cannot tell nobody can tell there's so many variables that impact on that you know business
00:05:39.560
decisions executive personalities market weather those are inherently unpredictable but if we
00:05:46.760
can say um what's the general directions of say like you know um mobility or phones in general or
00:05:55.720
screens then we can do something with that and that's actually very powerful so if 40 years ago
00:06:03.320
you had truly believed in moore's law that for the next five decades computers would get
00:06:15.400
twice as fast and half as cheap every 18 months if you truly believed it that's all
00:06:22.360
you needed to know to you know make a lot of money to invent lots of things to bend the culture to
00:06:31.320
you know um harvest tremendous abundance just if you just really believed that that was the direction
00:06:39.960
it was going and you worked on that and that was all you needed you didn't need to
00:06:43.960
know anything about ibm or apple you just needed to know that every year they would get twice as fast
00:06:50.200
half as cheap and you know half the size and so that was that was the trend that was
00:06:57.480
beginning to show itself back then and those are the kinds of things i tried to identify in my book
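As a rough back-of-the-envelope sketch of my own (not from the episode), the "twice as fast, half as cheap every 18 months" rule compounds like this:

```python
# Illustrative sketch only: compounding the "twice as fast, half as cheap
# every 18 months" rule of thumb discussed above. Figures are relative
# multipliers, not real benchmarks.

def project(years):
    """Relative speed and cost after `years` of 18-month doubling cycles."""
    cycles = years / 1.5          # one doubling/halving cycle per 18 months
    return 2.0 ** cycles, 0.5 ** cycles

speed, cost = project(30)         # 30 years = 20 cycles
# speed is about a million times (2**20); cost is about one millionth
```

Even without knowing anything about IBM or Apple, that single curve implies roughly a millionfold speedup over 30 years, which is the kind of large-scale direction Kelly argues is predictable even when the specifics are not.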
00:07:05.880
so let's talk about some of these trends in deeper detail the first one you talk about is becoming
00:07:12.360
the trend what do you mean by that and how does becoming manifest itself today in our technology
00:07:19.000
so the the general large-scale trend that that's either a sign of or a subset of is the fact that
00:07:30.520
you know beginning at least 50 years ago we have been moving to a world that's becoming more liquid more
00:07:40.440
more about processes rather than products services rather than products things that
00:08:00.840
are constantly changing being changed given version numbers i mean the idea that you buy something as
00:08:06.200
version one and then you get version two that's a very contemporary recent idea
00:08:06.200
and the idea that the thing itself remains the same and yet changes over time you
00:08:14.120
know it gets its updates it's updated that also is part of this large-scale shift from things
00:08:23.720
that are static and fixed to things that are moving and updatable and becoming something else and so there is a
00:08:33.960
sense in which everything is mutable everything is pliable everything is
00:08:42.280
upgradable and instead of you know trying to produce a fixed product like a car we think about
00:08:51.480
transportation services it's like you don't really care what it is you just want to move from a to b
00:08:56.200
how you get there is not as important i mean the actual form or shape of that is not as
00:09:05.240
important as the other you know um benefits and so in general we're moving away from things that are
00:09:14.920
tangible to the intangible that's another way of talking about this dematerialization where
00:09:20.200
the things that we find most valuable are not fixed in a rigid
00:09:29.640
physical form the value is more in the intangible aspects so part of that shift means that
00:09:41.480
one that services become more important than products and two um everything
00:09:50.200
is being upgraded all the time and that's a new relationship that we have to
00:09:59.880
come to terms with the stuff that we have it's always the case
00:10:06.280
that when you buy something you're just going to assume that it's going to get better or improved
00:10:10.840
or change i just bought um an amazon echo dot which is this little thing it's an ai interface basically
00:10:18.920
that you talk to i know that there'll be firmware updates and the thing will get smarter over time
00:10:27.160
and so i mean in some senses you know i'm anticipating that i'm i'm banking on that
00:10:32.760
and the way cars are going these days it's not just your phone that will upgrade itself even
00:10:39.560
cars will get these upgrades over time and we we have to understand that because there's also downsides to
00:10:45.640
that which is that um if if you don't upgrade everything together something will break so
00:10:53.320
when you upgrade one thing it usually requires that everything else in that ecosystem also come along
00:10:59.160
and so we are kind of in this constant state of having to keep it's like gardening having to keep
00:11:07.080
everything going and weeding and having hygienic purges and cleaning things and there's
00:11:14.040
this kind of active gardening approach to things rather than just owning it and the third way that it
00:11:21.880
changes is is that it it forces us whatever our age is to be a newbie a perpetual newbie we're always
00:11:32.680
having to learn in some cases kind of relearn these basic technoliteracy skills where you
00:11:42.840
know how to use a phone um how to use a computer we think we know today but there's a massive upgrade or
00:11:50.840
it's a new platform shift and we have to relearn it or there's a very complex software thing and
00:11:56.600
two years later they move the menu and we have to kind of learn it again or there's a brand
00:12:02.200
new programming language that we have to learn even though we thought we had mastered the last one
00:12:11.560
so there is this state of lifelong learning of always being the newbie and having to
00:12:23.800
start from scratch again and um that is going to be the default
00:12:29.000
for everybody in the future it's not just if you're 60 even if you're 16
00:12:39.000
you're going to be learning something new next year that you didn't have to know the year before
00:12:47.320
right i thought that that last point is i've even noticed that in my own life you know i'm 30
00:12:51.400
and us 30 year olds often make fun of our parents like why don't you get the internet like why don't you
00:12:56.200
get email we have to come over and help you but even now i'm noticing there's like snapchat
00:13:00.920
for example that's right that's like i tried downloading snapchat like i don't know how this
00:13:04.120
works and so i talk to these teenagers like can you guys show me how this
00:13:09.000
works i felt how parents probably felt yeah exactly and you'll be like oh i'm not going to
00:13:14.200
bother with gab or whatever it is and you kind of opt out of things and you'll be
00:13:20.360
seen as kind of an old fuddy-duddy because you didn't even bother to try it
00:13:28.440
and the well all this constant upgrading i mean is this eventually going to lead us to some sort of
00:13:33.720
utopia and i know everyone thinks utopia sounds great but like whenever i see like utopian futures
00:13:39.400
i'm like man it looks really boring yeah i mean that's why most of the hollywood movies about the
00:13:44.840
future are all dystopian because it's dramatic it makes good storytelling and in fact
00:13:52.360
i think the future will be boring because it will work but i don't think it'll be utopia
00:14:02.920
i think as you say if we try to imagine utopia as everything perfect and static
00:14:14.840
it would not change very much and we'd be totally bored out of our minds but it also simply doesn't
00:14:20.200
work that's that's the reality so so there's really no fear of it but i think that it's an incorrect
00:14:27.640
vision of where to aim i think a better vision of where to aim our efforts is what i call
00:14:35.960
protopia which is this idea that we're just trying to progress to move forward
00:14:45.080
in incremental tiny creeping improvements and that minor improvement every year when it's
00:14:55.800
compounded over decades or centuries becomes civilization so this is a big thing but we won't
00:15:02.200
even see it really except in retrospect because a one percent improvement in the world is drowned
00:15:09.320
out overwhelmed by the news of disasters and all the other ills that are present and there are many of
00:15:17.080
them and a lot of them are actually brought about by the new technologies themselves and so
00:15:22.760
it is hard to see a one percent improvement overall on average and yet i think that
00:15:30.920
is something good to aim for if we can keep improving one percent a year
00:15:38.520
on average overall then we compound that annually and we have something magnificent
00:15:44.520
over the long term right so the kaizen effect right is that from toyota where the manufacturers improve
00:15:52.120
one percent and then yeah right um and as you were talking about that sort of the the maintenance and
00:15:58.360
like upgrading that seems like there's an opportunity for a business possibly absolutely
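The one-percent-a-year protopia arithmetic discussed above can be sketched in a few lines (my own illustration, not from the episode):

```python
# Hedged sketch: compounding the "one percent improvement a year" protopia
# idea from the conversation above.

def compounded(rate, years):
    """Total improvement multiplier after compounding `rate` for `years`."""
    return (1.0 + rate) ** years

year_1 = compounded(0.01, 1)      # 1.01x, easily drowned out by the news
decade = compounded(0.01, 10)     # about 1.10x, still hard to notice
century = compounded(0.01, 100)   # about 2.70x, visible only in retrospect
```

Roughly a 2.7x improvement over a century: invisible in any single year, yet something magnificent over the long term.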
00:16:06.440
there used to be an old joke at the very dawn of the software industry which was software is free but
00:16:12.280
the manual is a thousand dollars and that's still true in a certain sense um this is another kind of
00:16:18.920
trend where you know you can't stop the copying of things copies themselves are ubiquitous perfect and
00:16:27.480
basically worthless and so you have to sell things that you can't copy well and the the idea of sort of
00:16:34.120
training or guiding or giving guidance to things that are coming that may be free and and that the the
00:16:43.960
value the the thing that's in short supply the thing that becomes precious is the context the
00:16:52.040
understanding what to do with it um so yeah you can get this thing it costs nothing but how do you use
00:16:59.320
it how do you maximize it how do you make it beneficial to you that actually may be something
00:17:04.600
you're willing to pay for right all right so let's talk about the next force which is um
00:17:09.240
cognition or cognizing what do you call it cognifying cognifying right
00:17:14.840
artificial intelligence this is freaking a lot of people out right elon musk and stephen hawking
00:17:19.800
they're saying that artificial intelligence could possibly be the last invention humans make because it
00:17:25.000
will destroy us um i'm kind of freaked out about it uh we have an amazon echo in our family and it's
00:17:31.240
just really weird to see my kids talk to alexa um asking her for the weather and asking her for jokes
00:17:38.040
it is sort of weird and incongruous should people be freaked out about artificial
00:17:46.360
intelligence like that it's going to destroy all the jobs and it will become sentient and skynet
00:17:52.040
will be born i have divided somewhat arbitrarily these trends into 12 and they're kind of like braided
00:18:02.200
tributaries of a river you could kind of cut them in different ways but um i think by far
00:18:07.880
the most important the most fundamental the most disruptive the most beneficial of all those
00:18:15.480
trends is this cognification cognifying which is the word i use for making things smarter
00:18:20.600
we have a lot of intellectual cultural baggage around the idea of artificial intelligence and the
00:18:25.880
main the main distraction of that word is that we tend to think of it as a human-like intelligence and
00:18:32.200
one of the things i really try to stress in the book is that ai for various scientific reasons
00:18:41.480
is not like human thinking the only way you can have human-like thinking is if it runs on human-like
00:18:48.280
tissue and as these run on other substrates the emulation is not perfect because of
00:18:54.920
time and space and therefore these intelligences are different and that's actually a benefit so
00:19:01.400
the reason why we want to put an ai into our cars to self-drive is because they are not driving like
00:19:07.000
us humans in the last 12 months humans around the world killed one million other humans driving
00:19:14.920
humans should not be allowed to drive we're just terrible drivers and so um we want these ais to drive
00:19:23.560
because they're not being distracted worrying about whether they left the stove on they're just driving and
00:19:28.200
they're engineered and optimized to drive well but that's kind of an inhuman way your
00:19:34.840
calculator is inhumanly smart in arithmetic way smarter than you are google is inhumanly smart in its memory
00:19:46.760
it has memorized every single word on six trillion web pages that's how it finds stuff and so
00:19:56.440
we're going to add further levels of complexity and our own minds are
00:20:03.960
basically a suite of a symphony of different modes of thinking there's deductive reasoning there's
00:20:10.920
inductive there's symbolic reasoning there's long-term memory there's dozens of different
00:20:19.160
modes of thinking that go into our minds and we'll add more types and we'll invent whole new types of
00:20:28.200
thinking just like we invented new ways of flying we made artificial flight not by imitating the flapping
00:20:35.160
wings of a bird but you know we make a flat barn door and put a jet engine on it and if you can do
00:20:41.400
flying that way that's a new way of flying we will invent new ways of thinking and these will all be different
00:20:46.680
than human thinking and so the first thing is that we are likely to invent hundreds maybe thousands of
00:20:57.880
different species of thinking and they're all alien to us um and that's the virtue because in this new
00:21:05.080
economy the real wealth the real value is being generated by thinking differently and this is an
00:21:12.440
increasingly difficult challenge when we're all connected so the more we're connected together
00:21:20.440
the more difficult it is to think differently and the more valuable it is and ai is actually
00:21:25.960
going to help us think differently because they fundamentally think differently so they will be
00:21:30.920
very creative but their creativity would be a little bit different than ours and that's actually going to
00:21:34.840
be an advantage and so um we are going to employ them i think these different kinds of these alien
00:21:41.880
intelligences um sometimes to solve problems say in business or science that are basically beyond
00:21:50.440
our own human capability of intelligence to solve by ourselves and so we'll have a two-step process
00:21:57.480
the first step is we're going to invent a different kind of mind that together will work with us to help
00:22:03.400
solve these problems and this idea of working teaming up with intelligences actually has a term now
00:22:11.720
it's called a centaur it's being used in the military it was invented in the chess field where um you could
00:22:19.240
have an open version of a chess match where you could play as an ai you could play as a chess master or
00:22:25.880
you could play as a team of ai and human and that was called a centaur and the remarkable thing is in the last
00:22:34.520
four or five years that the best chess player on this planet is not an ai it's not a human it's a
00:22:43.240
centaur it's a team of ai and humans and that is the model that we're going to have with the ais
00:22:51.320
they are already as i'm suggesting smarter than we are in certain dimensions but intelligence is not a
00:22:57.720
single-dimension thing it's not like a decibel getting louder like iq it's multi-dimensional
00:23:05.080
and we have versions of it already smarter than we are but we're going to be working
00:23:15.000
with them to solve all these problems and um much of what we do in our lives or daily work
00:23:23.080
can be turned over to these other intelligences but you know there's so much still that we don't
00:23:32.280
know what we want that we are going to rely on ourselves on humans to discover these new
00:23:38.200
things that we want and to make these new jobs and tasks that we will then give to to the robots and so
00:23:45.080
i think the vision is yeah they're getting smarter and smarter but it's a different kind
00:23:51.880
of intelligence and we're going to be working with them together to keep inventing new things
00:23:59.960
that we want to be made efficient and productive and then we give them to the bots and so in a
00:24:05.080
certain sense the job for humans becomes to invent new jobs to give to robots
00:24:12.120
and i think an interesting point you made too is there's not going to be a single ai because i think
00:24:16.280
that's what a lot of people imagine like in the machine stops a central computer that just does
00:24:22.360
everything but you say there's going to be multiple types of ai yeah and and it's you know there will be
00:24:29.240
these big ai companies whether they're google or amazon there'll be something like that there'll be some
00:24:34.600
network effects where the smarter the ai is the more people use it the more people use it it gets
00:24:40.680
smarter and smarter so there is this network effect this snowballing avalanche where everybody's kind
00:24:45.800
of drawn to a few um winners in a particular kind of intelligence and so um there will certainly be
00:24:56.120
a kind of ai that you ask questions of and that's what it does and so that will become
00:25:02.120
the kind of ubiquitous presence like alexa that's always there that we're constantly asking questions or
00:25:10.200
having it do things that kind of personal assistant but that way that's just one type and even though
00:25:15.560
it would be ubiquitous and common and everybody will will know about it there'll be another kind of ai
00:25:21.320
that maybe is really good at doing science at working with scientists and trying to understand
00:25:28.440
you know detect these patterns of reality to understand what's going on there'll be
00:25:35.000
maybe maybe another ai that would be um you know optimized for um driving and it could be an entirely
00:25:44.120
different mixture of types of thinking and the thing about this is this is there's an engineering maxim
00:25:53.800
that holds true here which is that you cannot optimize everything in a system there's always trade-offs
00:26:01.320
just because of the reality of fixed time and resources and energy you can't make
00:26:07.480
something an ai that's optimized in intelligence in every dimension at once that's just an engineering
00:26:16.200
fallacy so there's always these trade-offs where this particular one will be better here
00:26:23.400
and because it's going to be a little bit less in this department and for this function that's good so
00:26:30.280
it's like you can't have an organism in life that's optimized for survival in every direction
00:26:37.640
that's just not how reality works and the same thing with these ais is that there'll be different ones
00:26:45.400
optimized for different purposes for answering questions whenever so i i i think one of the the
00:26:52.600
mistakes is this idea of a general purpose intelligence human intelligence is often pointed to as a general
00:27:00.040
purpose intelligence human intelligence is not general purpose at all it is evolved over millions of
00:27:06.600
years for our survival on this planet when we begin to populate the the space of all possible
00:27:13.800
intelligences and we'll do that by making different kinds of thinking in ai and maybe someday we'll
00:27:20.600
discover other ones in the universe but as we imagine this space we're going to find out that our
00:27:26.040
intelligence is not like at the center it's not general purpose it's actually way out in a corner
00:27:30.600
it's a very specific niche kind of intelligence and i think the thing that
00:27:41.240
misleads us the incorrect notion is this idea of general purpose intelligence and that's just
00:27:49.560
that's like us believing that we're at the center of uh the solar system and it revolves around us that's
00:27:55.240
that's a very parochial view that comes from the fact that we have not really had much contact
00:28:00.840
with other kinds of intelligence so we assume that ours is the general purpose one right and this
00:28:06.440
this will be disruptive i mean going back to like self-driving cars um truck driving is the biggest job
00:28:13.960
in the united states the most common the most common job yeah right and it's probably a truck driver
00:28:18.600
listening to this right now i mean how are we going to manage when there's a bunch of laid
00:28:25.320
off truck drivers because of artificial intelligence yeah well so all those trucks that
00:28:30.840
have to drive themselves they all still need to be repaired they all still need to be i mean the
00:28:36.680
even the ai in them has to be maintained and upgraded um they're these are very complicated machines
00:28:44.360
and so i think our our driving of them moves up a level so instead of actually physically sitting
00:28:50.840
behind the wheel you're taking care of them you're guarding them you are
00:28:57.240
directing them in other ways there's um uh there's plenty i mean moving things around in the physical
00:29:05.800
world will still require plenty of human attention because um in the beginning we don't know what we want
00:29:13.960
and humans are the best way to find out what we want and what robots and bots
00:29:20.280
and ais are good for are things that have to do with efficiency and productivity and so anything any
00:29:28.280
job any task where productivity or efficiency is important will go to the bots but that leaves
00:29:35.160
some of the most interesting and valuable things that we do or have that aren't dependent on
00:29:42.120
efficiency or productivity like innovation like exploration like human relationships none
00:29:51.000
of those are efficient they're inherently inefficient and that's actually what humans turn out to be
00:29:57.400
really good at we're really good at the inefficient stuff we're really good at wasting time we're really
00:30:01.000
good at exploring dead ends and and trying stuff and that has nothing to do with being intellectual or
00:30:08.520
having a college degree it has a lot to do with being human and so i think there's going to be plenty of
00:30:17.240
new opportunities created even for truck drivers who by the way aren't dumb they're smart and
00:30:27.560
trucking like farming has become a very high-tech industry even today there'll be plenty of opportunities
00:30:36.760
for moving and retraining people to these new roles and by the way we know how as a
00:30:50.280
species as americans even we know how to retrain people on a mass scale and we do it in the military
00:30:57.560
all the time the military is fantastic at taking people who don't have many skills and giving them very
00:31:03.000
high-tech skills quickly at massive scale so we have that ability what we need is the political
00:31:13.160
will to do that okay so the next trend you talk about in the book is flowing what do you mean by
00:31:23.960
by flowing well this goes a little bit back to what i was saying before about um the the large-scale
00:31:31.240
shift from things that were monumental fixed solid to things that are intangible in process
00:31:40.920
upgrading and fluid and so as we move from the solid to the liquid things have to flow and
00:31:52.040
there's a sense in which the flows become more important than the fixed a book for example
00:32:07.960
becomes less of a fixed printed object and more of something that might be updated over time have
00:32:13.800
footnotes or have some way you can interact with it or in some ways is deepened by having
00:32:20.520
hyperlinks into it or other embedded media or ways in which it can be extracted and cut and
00:32:31.400
pasted and commented on and so all these things that we have in the digital world are ways in which
00:32:40.280
things flow and um there's also the shift in the very culture away from
00:32:49.560
the fixed text that we had in things like the constitution law books scripture the great
00:32:59.560
authors the center of western culture and to some extent eastern culture as well was
00:33:06.440
the fixed page the text in the book the immobile black and white precise letters
00:33:15.000
printed in a book during that time we were kind of people of the book and now
00:33:22.520
we've moved to become people of the screen because the screen is now the center of our culture and the
00:33:28.760
screen is this thing that is forever changing this kind of surface that has this
00:33:37.640
flow of pixels across it that are ephemeral that are passing that are moving never to be
00:33:45.800
the same and so there is a sense in which instead of getting truth from the authors and authority we have
00:33:51.960
to actually kind of assemble our own truth um looking at this network of related facts trying to discern
00:33:59.640
the experts and the anti-experts the fact and the anti-fact the counterfact it's a whole
00:34:07.320
different approach to deciding what's true and what we know and so that's one of the consequences of
00:34:15.880
these flows and as you do business as you make products we have to
00:34:21.080
understand that the flows are where it's happening the sense of constant upgrades the sense of
00:34:29.480
the fleeting ephemeral so we move to twitter facebook walls updates flows streaming of media
00:34:41.400
this is all that movement of this kind of eternal constant
00:34:49.080
changing streams of things and that's the reality of the environment that you're going to work in
00:34:59.880
and make a new product in and i mean that sounds psychologically and intellectually
00:35:06.760
exhausting that we have to go to multiple sources and figure out what is truth so i mean
00:35:13.800
i guess it goes to your next point that filtering is going to be an important aspect
00:35:17.320
of the future right yeah yeah you go to the constitution like that's what the constitution
00:35:23.080
says now it's like well i don't know because things are changing all the time i don't know what is
00:35:29.160
right so the thing about filters is like your recommendation engines and
00:35:35.000
you know your amazon you liked this so you'll like that those are all kinds of filters or you
00:35:39.320
filter your streams by choosing your friends and who you follow but
00:35:45.000
then there's also the filtering that facebook or twitter is doing because you can't see all of it so
00:35:49.560
they're filtering and one of the many dangers or downsides of filtering
00:35:57.480
is the fact that you can have something called overfitting which is basically if you are only ever
00:36:03.720
seeing that which you already like in other words people who like these things will like this one here
00:36:12.760
if you're only ever seeing that then you kind of spiral into this sort of cloistered
00:36:26.120
ignorance basically and in terms of biology you kind of have
00:36:35.240
a premature local optimization you've optimized yourself onto a peak onto the top of
00:36:43.560
a mountaintop that's not the highest peak so you're stuck basically you get stuck and
00:36:51.400
so that's the danger of all this kind of filtering so a lot of people say well we need less of it and
00:36:55.480
what i'm suggesting is no this is not the trend the trend is that there's going to be
00:36:59.480
more of it and the way you kind of compensate for these filter bubbles and this
00:37:06.200
overfitting and this kind of premature optimization of seeing only that which you already know you
00:37:12.600
like is actually other types of filters so we're going to see more and more filtering that is inevitable
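the overfitting kevin describes here maps onto the exploration-versus-exploitation tradeoff in recommender systems. as a rough illustration only (the function names, the similarity measure, and the epsilon value below are all hypothetical, not anything discussed in the episode), one simple counterweight is to reserve a slice of every batch of recommendations for items outside the user's comfort zone:

```python
import random

def recommend(user_likes, catalog, similarity, epsilon=0.2, k=10):
    """Return k items: mostly the closest matches to what the user
    already likes, but with roughly an epsilon fraction of the slots
    given to random picks from outside the top matches -- a crude
    counterweight to the filter-bubble / overfitting effect."""
    # rank the whole catalog by similarity to the user's existing taste
    ranked = sorted(catalog, key=lambda item: similarity(user_likes, item), reverse=True)
    exploit, explore_pool = ranked[:k], ranked[k:]
    # swap a few exploit slots for exploration picks, if anything is left to explore
    n_explore = min(len(explore_pool), max(1, int(epsilon * k))) if explore_pool else 0
    explore = random.sample(explore_pool, n_explore)
    return exploit[:k - n_explore] + explore
```

the point of the sketch is just the structure: the pure similarity ranking is the "peak that's not the highest peak" and the random slots are one of the "other types of filters" that push you off it.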
00:37:21.560
what we want is smarter and smarter kinds of filters different approaches different ways in
00:37:28.600
which they relate to us different ways that we relate to them ways in which you try to know what's
00:37:33.480
going on and it's basically a literacy a type of literacy an understanding that everything's being
00:37:39.320
filtered and we have to become good at it and good at it means that we need kind of tools
00:37:50.760
that will show us things that we didn't know we wanted and that's sort of what
00:37:56.120
we got in the old world with tv and radio where you were listening to them and there was
00:38:03.160
some producer or editor who was saying well we'll do this and after this we'll do this one because this thing
00:38:10.040
is cool and you had no control over that and that was actually part of the benefit that you were
00:38:16.600
being told and shown new things and so we need to reintroduce some of those kinds of dynamics into our
00:38:24.760
digital filtering where we are deliberately encountering things that we either disagree with
00:38:34.040
or didn't even know existed and that's where we're going so the short answer about filtering
00:38:44.360
was yeah there's filtering and we need it because there is far far more being produced every day
00:38:52.920
than we can deal with and a lot of it's junk but there's even far far more great stuff
00:38:58.360
being produced every day that we could ever pay attention to so we need ways to get through this
00:39:06.440
abundance and there'll be sophisticated ways and we can constantly improve and there's huge
00:39:15.960
opportunities for people to invent new ways to do this but we're not going to get away from
00:39:26.600
having more filtering so it's a non-starter to think well we'll just turn off the filters
00:39:33.240
no it doesn't work we just have to turn on more of them right that was interesting you talked about
00:39:37.960
how tv back in the 60s and 70s was you know introducing new stuff you never thought you'd be interested in
00:39:44.040
but i thought the early days of the web were often like that i remember surfing around you'd
00:39:48.520
just end up in the weirdest places you can still do that now there's a great page
00:39:53.560
called something like random wikipedia a kind of wikipedia page of the day a random wikipedia page
00:39:59.320
just go there and you'll find these amazing things you had no idea existed so there are tools
00:40:05.880
like that that can do that still and i think we're not done with the idea
00:40:13.960
of having the kind of the strong producer editor who's curating a stream of things that are outside
00:40:22.280
of anything we would encounter and yet might appeal to us at some level and so i think this idea of
00:40:31.480
you know the magazine or the show or whatever it is will continue because there's
00:40:38.600
still an appetite for having that really kind of informed cool curator saying you know how about
00:40:47.320
this yeah i know you like this one and this one but have you ever heard of this one this is cool
00:40:51.240
and so we'll follow along to some degree in some part of our lives to that so i think
00:40:58.680
that role is not gone and will continue and people will pay for that you know in
00:41:08.840
different ways okay so you also imagine a world where we're surrounded by screens like there's gonna
00:41:15.160
be screens in our mirrors on our walls there's gonna be paper that's actually a
00:41:21.000
screen but people today have a lot of qualms about screens like you know we want to limit screen time
00:41:27.400
because you know screens make us easily distractible and anti-social yeah and i do too i try to
00:41:34.360
limit it but so there's another general rule about technology which is that
00:41:45.720
we tend to think of technology as anything that was invented after we were born
00:41:51.400
when in fact most of the technology that surrounds our lives you and i if we just look around wherever
00:41:57.240
as a listener just look around where you are most of the technology in your life is old it's
00:42:03.320
wood you know plywood maybe or concrete or electrical wires there's a road outside asphalt
00:42:14.120
it's ancient in some cases it's old and that forms the bulk of the technology in life most of it
00:42:22.760
and the new technology is kind of a thin additional supplemental layer on top just like
00:42:31.880
most of the brain that you're using right now to listen to me most of that brain
00:42:37.320
is reptilian mammalian or older it's doing you know non-conscious things the bulk
00:42:49.720
the majority of your brain is just doing things like learning to navigate to see
00:42:55.880
to breathe to react to hunger all those things and the layer that you kind
00:43:03.480
of identify yourself with this very thin layer of consciousness is a thin membrane around it it's
00:43:09.720
a minority it's not a lot and that's the same with new technology this digital
00:43:14.920
technology in itself it comes in layers and very rarely does the old go away so it's on top of
00:43:24.120
everything that's already existing so this new stuff is always going to be there in the context
00:43:32.680
of all the other things so we still have all these other options and so the screen
00:43:38.120
most flat surfaces i think we're going to go to the point where basically any flat surface
00:43:42.120
will become a screen because it becomes so cheap that you can make a wall out of it you
00:43:47.320
can make the side of a building you can put it on clothes but at the same time you know there'll be clothes
00:43:55.240
that don't have screens on them there'll be you know areas where we go
00:44:02.840
like nature that don't have screens and they will continue to be powerful for us and useful
00:44:10.680
because they aren't that because they're different and i think even though the screens will
00:44:18.760
become even more ubiquitous than they are and more important in understanding who we are they will
00:44:26.360
always be in the context that we'll have time away from them that we'll come to see as as valuable as
00:44:32.760
the time on them and so the reason why the sabbath is a very powerful idea is that
00:44:40.600
with the traditional sabbath you didn't work one day a week not because work was bad but
00:44:47.960
because work was good because otherwise you would keep working and so we kind of leave the screen
00:44:57.800
not because the screens are bad but because they're good because they're too good and so
00:45:03.960
and we leave nature and go back to the city not because nature is bad but because
00:45:11.080
that difference that delta that shift that other way of doing and seeing the world
00:45:18.760
is valuable and so we will constantly add even more ways of seeing the world or doing things
00:45:27.640
and the ubiquitous screen is one of those but it becomes even more valuable in the context that
00:45:33.960
we have say wilderness or something in between like a garden so i think while screens will become ubiquitous
00:45:41.240
taking time away from screens will become ever more valuable yeah so you think we'll develop
00:45:47.720
some sort of cultural hygiene i guess to manage those screens i don't think we really have
00:45:52.680
that figured out right now you know i think we're doing better than people say everybody
00:45:59.160
is wringing their hands over it and everybody is sort of not happy necessarily with their choices but i think
00:46:06.360
if you actually look if you know people who have kids it's like all the families
00:46:15.160
with kids and screens are limiting screen time in some fashion or another and they have
00:46:21.080
different rules i think there's a sense right now because it's brand new social media
00:46:26.200
is less than 2,000 days old right so we shouldn't expect ourselves to have this figured
00:46:33.320
out yet it's going to take a couple of cycles a couple of revs to really understand what's going on
00:46:38.680
and i think right now we're just kind of you know people are saying well what are you doing what are
00:46:42.120
you doing does that work does that work and yeah we're doing this experiment right now
00:46:47.160
but i think in general everybody gets the idea that yeah you want to limit it it's like
00:46:52.920
sugar it's good but you don't want too much of it how much is too much we don't know we'll
00:47:00.040
try it we'll try this we'll try that and the other thing of course is that i think it's really
00:47:05.800
unfair to judge a technology strictly by how the youth use it because youth by definition are excessive
00:47:14.440
obsessive you know even if nothing else changed as kids grew older they would
00:47:23.720
use social media differently just because they aged and so we have all these dynamics going
00:47:31.240
on and i think that the proper response right now is to treat this as an experiment and
00:47:39.560
to say i don't know what the right thing is but we'll keep trying stuff with our family
00:47:44.120
or with my friends or whatever it is so i don't think of it as a problem i think of this as an
00:47:51.320
opportunity as an experiment yeah i think that's a good point that last one because like
00:47:57.320
video games i played video games like all the time when i was a kid but i haven't played a video
00:48:01.400
game in three years yeah exactly i was totally obsessed with science fiction reading
00:48:08.120
books you know i had my head in a book and you know there were plenty of rants from the
00:48:15.240
18th century from the old fuddy duddies complaining about kids going off and reading books by themselves
00:48:23.640
those things were just immoral just the mark of low class you know whatever it
00:48:33.800
was it was completely frowned upon and you know i was doing the same thing i was lost in
00:48:42.200
escaping i was just in this obsessive world of science fiction but then i didn't
00:48:47.320
read any for a couple decades almost i'm reading more now i'm coming back to it but i think
00:48:56.440
there's phases and technologies appeal at different phases in people's lives
00:49:04.440
and we'll sort that out as we go along so another trend that i think a lot of people are seeing right
00:49:09.720
now but you think it's going to accelerate even more is this idea of ownership being replaced
00:49:14.920
with accessibility right so instead of owning cars you'll summon an uber or do a ride share or something
00:49:21.800
like that yeah it has to do again with this flowing state where we value services more than
00:49:28.040
products and so if you can have instant access to the good that you want anywhere you are instantly
00:49:38.680
then the question is well why would you want to own it because owning has a lot of responsibilities
00:49:43.720
like you need to back it up or clean it or store it or secure it and so many other things that you don't
00:49:50.680
have to do if you're just borrowing it or accessing it and so in the digital realm it's very
00:49:57.240
easy to kind of make that access possible where you know you're connected so you could have a movie
00:50:04.360
book or music there's really almost no reason to need to own it with some caveats and
00:50:16.040
i'm probably pretty typical where i basically stopped buying very many movies or music
00:50:24.360
and moved to the place where books are almost the same way where you subscribe to an aggregator
00:50:30.120
a netflix or an amazon or something and you have access to any book music game movie that you want
00:50:38.360
anytime and then you just use it and give it back and so the question was well can that extend to
00:50:45.400
the physical and uber is one example showing that it actually can if you can supply something on
00:50:53.800
demand even a physical thing when someone wants it that can be as good or better
00:51:03.240
than owning it so the question is well what else could you do physically could you do clothes
00:51:10.680
where you kind of don't own clothes but you subscribe to clothes and clothes come to you on a regular
00:51:16.600
basis and you use them and then they move on either cleaned or passed on or whatever and so then you
00:51:24.440
know could you have it you know with like one hour delivery with amazon and other kinds of things you
00:51:32.600
could suddenly have all kinds of things that are almost simultaneous it looks like for many people having
00:51:36.920
something available within an hour is actually better than owning it it
00:51:42.440
might take you an hour to find something down in your basement or in your storage container or something
00:51:46.440
like that and oftentimes with the kinds of lives we have an hour is plenty of time to get what it is
00:51:54.840
that we wanted that we maybe formerly owned and so 3d printing and other types of kind of instant
00:52:07.160
delivery all move in this direction where even physical things would take on some of the attributes
00:52:16.680
of being better to have access to rather than own and of course even workspaces and office spaces
00:52:25.960
you know we have lots of startups doing the same thing which is why own when you can just
00:52:31.640
have access to something like airbnb that is available and in many cases superior to owning and since
00:52:40.280
ownership has been so foundational in capitalism this is a big shift there's lots of consequences
00:52:47.400
that we haven't even worked through yet of what happens if people
00:52:53.560
generally don't own things well obviously somebody has to own something i mean there has to
00:52:59.240
be somebody owning these things that we're using there may be fewer of them in total
00:53:05.640
but there's still ownership and the question is how is that ownership distributed and there's lots of
00:53:10.600
issues but that's where we're going yeah well you mentioned there were some caveats like
00:53:17.880
with everything being accessible i mean when would ownership be better than access
00:53:22.760
yes so one of the issues for a lot of people say with music movies and books is
00:53:29.080
the issue of them being taken away or modified and we're increasingly seeing this like
00:53:37.320
i'm one of the very first ever netflix subscribers so i've been on netflix for a
00:53:43.880
long time but it's just amazing to see things on netflix come and go like they're there and then they're
00:53:48.280
not there so if everything really was available all the
00:53:57.160
time that wouldn't be an issue but there is an issue where things can be taken away and so
00:54:04.600
what i think is going to happen is there'll be people who
00:54:11.720
have certain areas that they really really care about and are doing something in and then they want
00:54:17.400
that ownership for control purposes and there's also the issue of what you can do with things
00:54:25.480
that you don't own and that's another issue of you know what they call terms of service
00:54:33.160
and stuff where you're prohibited from modifying things because you don't own them and
00:54:41.800
that's something that i think is maybe a cultural default that's easier to imagine changing
00:54:49.160
if that really does become a problem then you would just have you know the aggregators
00:54:55.320
would move to allow that ability to modify where i think right now what they
00:55:05.880
say is well consumers aren't really demanding it so the caveat is i think
00:55:12.600
there is this issue of removability and modification that you don't get with access but
00:55:20.600
you could so i think technically there's no reason why you couldn't make a system like that
00:55:28.040
but it does not exist right now yeah that's why most of my reading is done on
00:55:34.600
ebooks on the kindle but if there's a book i really really enjoy i'll buy a physical copy sure
00:55:40.280
right and i think again going back somebody has to own these things and so
00:55:46.040
you might own something that you care about and that may actually become a business you know
00:55:50.600
it's like for you to have an airbnb somebody has to own the apartment and so there
00:55:55.400
will be you know people who like to own apartments and run them you know a lot of these
00:56:00.680
airbnbs are run by people who have more than one and they like that they do that
00:56:06.840
they're good at it and so they will own that but in the rest of their lives they may not own other things
00:56:11.320
they may not own a car but someone else who cares about cars may own cars and so i think ownership
00:56:18.600
doesn't disappear i think it's just kind of distributed differently okay so yeah there's a lot
00:56:24.120
of business opportunities there let's talk about the trends you are worried about and then maybe
00:56:30.600
kind of end on a positive note that maybe things are going to be better than we think they are
00:56:33.720
how does that sound yeah so as you can tell i'm extremely
00:56:41.880
optimistic about technology i think there's far more opportunities than there are detriments
00:56:48.520
but i want to say that most of the problems we have today are caused by technologies from
00:56:55.880
the past and almost all the problems in the future are going to be caused by technologies
00:57:01.720
from today so i have a very techno-centric view of the world i believe that each new technology
00:57:08.520
invention creates almost as many problems as solutions and so you might ask well
00:57:12.760
is that just kind of a wash and no what we get out of that is this one
00:57:17.960
percent difference we get slightly more opportunities we get the opportunity to even choose
00:57:27.080
that we didn't have before and so i see the gift of technology as increasing opportunities and that's
00:57:34.280
what i believe in but i think there are some choices in technology that decrease
00:57:43.480
opportunities and that's like weaponization when you make a weapon of something you're using it to decrease them
00:57:49.960
killing hurting other people decreases their choices and i think the weaponization of new
00:57:58.200
technologies like ai and robots is something that i'm concerned about and you know it's going to happen
00:58:08.360
and so one of the questions is how do we make it civil how do we
00:58:13.880
you know how do we make these new rules and so my concern about say cyber warfare
00:58:21.640
is that there's no rules right now we don't have any agreement or consensus about what's
00:58:26.360
permissible and what's not and that goes to hacking at the nation state level the us is offensively
00:58:34.120
hacking just like china and russia are but none of them are admitting it and because there's no
00:58:39.320
admission there's no agreement on well you can't do this and you can do that you can't take
00:58:45.080
out the electrical system or the banks or whatever and so there's no agreement and my fear my
00:58:51.480
worry is that there'll be some horrible disaster or maybe several before there's any ownership and
00:58:59.800
admission and movement on a consensus about what we accept and don't accept and so i'm concerned about
00:59:08.520
that aspect of ai and stuff that as we weaponize them we stay civil about it
00:59:18.680
well kevin this has been a great conversation where can people learn more about your work and your book
00:59:23.000
i have a website that's based around my initials it's kk.org you know i post everything
00:59:32.520
there you can find more about the book translations of the book other things i've written
00:59:38.440
like a graphic novel about angels and robots i have a cool tools site where we review one cool tool
00:59:46.280
a day and then on social media i'm usually kevin2kelly kevin number two kelly kevin kelly is a
00:59:56.040
very common name unfortunately so kevin2kelly on twitter and facebook and for me it's mostly
01:00:03.480
outgoing i don't read much but i do post and google plus is the same i think
01:00:11.480
so the best way to reach me is email which has been public for 30 years easy to find on my
01:00:19.560
website well kevin kelly thanks so much for your time it's been a pleasure thanks for having me
01:00:23.880
thanks for the great questions and i really appreciate your enthusiasm for the book my guest
01:00:30.440
today was kevin kelly he's the author of the book the inevitable it's available on amazon.com and
01:00:34.280
bookstores everywhere you can find more information about kevin's work at kk.org also check out our show
01:00:39.960
notes at aom.is/inevitable where you can find links to resources where you can delve deeper into this
01:00:56.680
well that wraps up another edition of the art of manliness podcast for more manly tips and advice
01:01:01.000
make sure to check out the art of manliness website at artofmanliness.com our show is edited
01:01:05.560
by creative audio lab here in tulsa oklahoma if you have any audio editing needs or audio production
01:01:09.560
needs check them out at creativeaudiolab.com we appreciate your support reviews on itunes and
01:01:14.360
stitcher really helps us out a lot so thank you if you do that and until next time this is brett