#66 — Living with Robots
Episode Stats
Words per Minute
169.5
Summary
Kate Darling is a researcher at the MIT Media Lab and a fellow at the Harvard Berkman Center. She focuses on the way technology is influencing society, specifically robot technology. But her background is in law and the social sciences, and she's one of the few people paying attention to this. And this is, along with AI, going to become increasingly interesting to us as we integrate more and more autonomous systems into our lives. For today's podcast, I really enjoyed speaking with Kate. As I think I said at some point, the phrase "child-sized sex robots" was not one that I was ever planning to say on the podcast, much less consider its implications. But we live in a strange world, and it appears to be getting stranger. So to help us all figure that out, I am here with Kate Darling. In order to access full episodes of the Making Sense Podcast, you'll need to subscribe at samharris.org. There you'll find our private RSS feed to add to your favorite podcatcher, along with other subscriber-only content. The podcast doesn't run ads and is made possible entirely through the support of our subscribers, so if you enjoy what we're doing here, please consider becoming one.
Transcript
00:00:10.880
Just a note to say that if you're hearing this, you are not currently on our subscriber
00:00:14.680
feed and will only be hearing the first part of this conversation.
00:00:18.440
In order to access full episodes of the Making Sense Podcast, you'll need to subscribe at samharris.org.
00:00:24.140
There you'll find our private RSS feed to add to your favorite podcatcher, along with other subscriber-only content.
00:00:30.520
We don't run ads on the podcast, and therefore it's made possible entirely through the support of our subscribers.
00:00:35.880
So if you enjoy what we're doing here, please consider becoming one.
00:00:52.660
Kate is a researcher at the MIT Media Lab and a fellow at the Harvard Berkman Center.
00:00:57.680
And she focuses on the way technology is influencing society, specifically robot technology.
00:01:04.040
But her background is in law and in the social sciences.
00:01:08.260
And she's one of the few people paying attention to this.
00:01:11.420
And this is, along with AI, going to become increasingly interesting to us as we integrate
00:01:17.560
more and more autonomous systems into our lives.
00:01:26.340
As I think I said at some point, the phrase child-sized sex robots was not one that I was
00:01:32.340
ever planning to say on the podcast, much less consider its implications.
00:01:37.920
But we live in a strange world, and it appears to be getting stranger.
00:01:42.180
So to help us all figure that out, I now bring you Kate Darling.
00:02:00.100
I'm continually amazed that we can do this, given the technology.
00:02:04.200
But I first learned of you, I think, in a New Yorker article on robot ethics.
00:02:19.120
So perhaps just take a moment to say how you got into this.
00:02:22.840
Yeah, robot ethics is kind of a new field.
00:02:26.240
And it sounds really science fiction-y and strange.
00:02:29.300
So I have a legal and social sciences background.
00:02:35.220
And at some point, about five and a half years ago, I started working at the Media Lab at MIT.
00:02:44.880
And I made friends with them because I love robots.
00:02:53.260
And we realized that I was coming at the technology with, you know, some questions that they hadn't thought about.
00:03:02.360
And we realized that together, there were, you know, some questions that
00:03:09.900
were worth exploring, that when you bring people who really understand how the technology
00:03:14.340
works together with people who come at this from kind of a, you know, policy or social sciences
00:03:19.040
or societal mindset, that can be interesting to explore.
00:03:25.060
It seems strangely named, but everything that comes out of it is incredibly cool and super
00:03:35.180
The media lab is kind of, to me, it's this building where they just stick a bunch of people
00:03:40.660
from all sorts of different fields, usually interdisciplinary, or as they call it, anti-disciplinary.
00:03:46.620
And they give them a ton of money and then cool stuff happens.
00:03:52.680
So there's everything from like economists to roboticists, to people who are curing blindness
00:04:02.440
It's really a mishmash of all sorts of very interesting people working in fields that aren't,
00:04:10.240
don't really fit into the traditional categories of academia that we have right now.
00:04:14.760
And so now your main interest with robots is in how our relating to them could well and
00:04:23.760
may, in fact, inevitably change the way we relate to other human beings.
00:04:29.180
I'm totally fascinated by the way that we treat robots like they're alive, even though
00:04:34.080
we know that they're not, and the implications that that might have for our behavior.
00:04:37.960
I must say, I'm kind of late to acquire this interest.
00:04:41.060
Obviously, I've seen robots in science fiction for as long as I've seen science fiction, but
00:04:47.480
it wasn't until watching Westworld, literally a couple of months ago, that I realized that
00:04:52.680
the coming changes in our society based on whatever robots we develop are going to be far
00:04:58.580
more interesting and ethically pressing than I realized.
00:05:02.380
And this has actually nothing to do with what I thought was the central question, which is whether robots will ever actually be conscious.
00:05:10.720
That is obviously a hugely important question and a lot turns ethically on whether we build
00:05:16.740
robot slaves that are conscious and can suffer.
00:05:20.220
But even short of that, we have some really interesting things that will happen once we build
00:05:25.480
robots that escape what's now called the uncanny valley.
00:05:29.060
I'll probably have you talk about what the uncanny valley is.
00:05:32.600
And I think even based on some of your work, you know, you don't even have to get all the
00:05:37.060
way out of the uncanny valley or even into it for there to be some ethical issues around
00:05:42.220
how we treat robots, which we have no reason to believe are conscious.
00:05:46.400
In fact, you know, we have every reason to believe that they're not conscious.
00:05:49.460
So perhaps before we get to the edgy considerations of Westworld, maybe you can say a little bit
00:05:55.700
about the fact that your work shows that people have their ethics pushed around even by relating
00:06:02.880
to robots that are just these bubbly cartoon characters that nobody thinks are alive or conscious
00:06:10.580
Yeah, we are so good at anthropomorphizing things and it's not restricted to robots.
00:06:16.740
I mean, we've always had kind of a tendency to name our cars and, you know, become emotionally
00:06:22.040
attached to our stuffed animals and kind of imagine that they're these social beings.
00:06:28.640
But robots are super interesting because they combine physicality and movement in a way that
00:06:39.140
So I think it's just so interesting to see people treat even the simplest robots like
00:06:47.660
they're alive and like they have agency, even if it's totally clear to them that it's just a machine.
00:06:53.600
So, you know, long before you get to any sort of complex humanoid Westworld type robot, people are already treating robots like they're alive.
00:07:00.840
People feel bad for the Roomba when it gets stuck somewhere just because it's kind of moving around on its own.
00:07:07.760
And I think it like it goes further than just being primed by science fiction and pop culture
00:07:14.600
Like, obviously, you know, we've all seen a lot of sci-fi and Star Wars and, you
00:07:19.660
know, we probably have this inclination to name robots and personify them because of that.
00:07:24.640
But I think that there's also this biological piece to it that's even more fundamental.
00:07:32.280
So one of the things that we've noticed is that people will have empathy for robots,
00:07:39.020
or at least some of our work indicates that people will empathize with robots and be really
00:07:43.980
uncomfortable when they're asked to destroy a robot or do something mean to it.
00:07:55.700
Because obviously it's kind of an artificial situation to hand people a robot that is cute
00:08:03.540
But there are robots being used in therapy, I think. Isn't there a baby seal robot that
00:08:08.940
you're giving people with Alzheimer's or autism as contact with these surrogates?
00:08:18.480
Does that pose any ethical concerns, or, if it works on any level, is it intrinsically good?
00:08:25.920
I think there is something unethical about it, but probably not in the way that most people think.
00:08:31.460
So I think, you know, intuitively it's a little bit creepy when you first hear that, oh, we're
00:08:36.400
we're kind of using these baby seal robots with dementia patients and we're giving them
00:08:41.120
the sense of nurturing this thing that isn't alive.
00:08:44.040
That seems a little bit wrong to people at first blush. But honestly, if you look
00:08:52.160
at what these robots are intended to replace, which is animal therapy, it's interesting to
00:09:02.040
And no one complains about animal therapy for, you know, dementia patients.
00:09:08.080
It's something that we often can't use because of hygienic or safety or other reasons.
00:09:13.340
But we can use robots because people will consistently treat them sort of like animals and not like machines.
00:09:19.160
And I also think that, you know, for the ethics there, it's important to look at some of the alternatives.
00:09:24.800
So with the baby seal, if we can use that as an alternative to medication for calming distressed
00:09:31.300
people, I'm really not so sure that's an unethical use of robots.
00:09:39.460
So one of the things that does concern me, though, is that this is such an engaging and
00:09:45.160
or, in other words, manipulative technology. And, you know, we're seeing a lot
00:09:51.260
of these robots being developed for kind of vulnerable parts of the population, like the elderly.
00:09:56.100
A lot of kids' toys have increasing amounts of this kind of manipulative robotics in them.
00:10:01.600
So I do wonder whether, you know, the companies that are making the robots might be able to use
00:10:07.520
that in ways that aren't necessarily in the public interest, like getting people to buy
00:10:12.840
products and services or manipulating people into revealing more personal data than they otherwise would.
00:10:21.980
But those are more people doing things to other people rather than, you know, something intrinsically
00:10:26.700
wrong about treating robots like they're alive.
00:10:31.700
Have any companies with toy robots or elder care robots done anything that seems to push
00:10:38.540
the bounds of propriety there in terms of introducing messaging that you wouldn't want in that kind of product?
00:10:45.440
Yeah, I don't know of any examples of people trying to manipulate the elderly as of now.
00:10:50.460
But I mean, we do have examples from, you know, the porn industry, with very manipulative
00:10:56.480
chatbots that try and get you to sign up for services.
00:11:01.940
So we do have a history of companies trying to use technology in advertising, or, say,
00:11:08.980
you know, the in-app purchases that we see on iPads where there have been consumer protection
00:11:14.620
cases where, you know, kids were buying a bunch of things.
00:11:18.560
And now, you know, companies have had to implement all of these safeties so that it requires, you
00:11:23.640
know, parental override in order to purchase stuff.
00:11:27.100
There's a history of this. We know that companies, you know, serve their
00:11:31.380
own interests and any technology that we develop that is engaging in the way that robots already
00:11:38.780
are in their very primitive forms and will increasingly be, I think, might pose a consumer protection problem.
00:11:46.540
Or you could even, you know, think of governments using robots that are increasingly entering into
00:11:52.300
our homes and very intimate areas of our lives.
00:11:55.520
Governments using robots to, you know, collect more data about people and essentially spy on them.
00:12:01.900
So there's this basic fact where any system that seems to behave autonomously doesn't
00:12:08.300
have to be humanoid, doesn't even have to have a lifelike shape.
00:12:14.860
As you said, it could be something like a Roomba.
00:12:16.480
If it's sufficiently autonomous, it begins to kindle our sense that we are in relationship
00:12:23.100
to another, which we can find cute or menacing or whatever we feel about it.
00:12:29.200
It pushes our intuitions in the direction of this thing is a being in its own right.
00:12:36.020
I believe you have a story about how a landmine-defusing robot that was insectile, like spider-like,
00:12:42.340
could no longer be used, or at least one person in the military overseeing this project felt
00:12:47.740
you could no longer use it because it was getting its legs blown off.
00:12:51.440
And this was thought to be disturbing, even though, again, we're talking about a robot that
00:12:55.040
isn't even close to being the sort of thing that you would think people would attribute
00:13:01.660
And then, of course, with design, you can really start influencing that, right?
00:13:05.740
So whether people think it's cute or menacing or whether people treat it as a social actor,
00:13:10.960
because there's this whole spectrum of, you know, you have a simple robot like the Roomba,
00:13:14.800
and then you have a social robot that's specifically designed to mimic all of these cues that you
00:13:21.580
So we're seeing increasingly robots being developed that specifically try and get you to treat it
00:13:30.060
Are there more robots in our society than most of us realize?
00:13:35.400
What is here now and what do you know about that's immediately on the horizon?
00:13:39.140
Well, I think what's sort of happening right now is we've had robots for a long time, but
00:13:45.100
robots have been mostly kind of in factories, on manufacturing lines and assembly lines.
00:13:51.840
And now we're gradually seeing robots creep into all of these new areas.
00:13:56.300
So the military or hospitals, we have surgical robots or transportation systems, autonomous vehicles,
00:14:06.420
A lot of people now have Alexa or Google Home or other systems in their homes.
00:14:13.200
And so I think we're just seeing an increase of robots coming into areas of our lives where
00:14:17.940
we're actually going to be interacting with them in all sorts of different fields and areas.
00:14:23.900
So what's the boundary between, or is there a boundary between these different classes of robots?
00:14:29.600
Yeah, I don't think there's any, you know, clear line to distinguish these robots.
00:14:34.900
Also in terms of, you know, the effect that they have on people, you know, you see, depending on how
00:14:39.800
a factory robot is designed, people will, you know, become emotionally attached to that as well.
00:14:46.420
And we also, I mean, by the way, we don't even have a universal definition of what a robot is.
00:14:51.020
Some of the robots I picture, like the robots I was picturing on an assembly line are either
00:14:56.740
fixed in place, and we're just talking about arms that are constantly moving and picking
00:15:00.840
things up, or they're kind of moving on tracks, but they're not roving around in 360 degrees
00:15:09.520
I trust there are other robots that do that in industry as well.
00:15:13.900
But like, so one question is, you know, is the inside of a dishwasher, is that a robot?
00:15:17.680
Like, is that, is that movement autonomous enough?
00:15:20.620
It's basically what the factory robots are doing, but we call those robots; we don't call the dishwasher a robot.
00:15:25.600
There's just this continuum of machines with greater and greater independence from human
00:15:32.060
control and greater complexity of their routines, and there's no clear stopping point.
00:15:37.740
Let's come back to this concept of the uncanny valley, which I've spoken about on the podcast before.
00:15:41.820
What is the uncanny valley, and what are the prospects that we will get out of it anytime soon?
00:15:47.880
Yeah, the uncanny valley is a, you know, somewhat controversial concept that you can design something
00:15:56.940
that is lifelike, but only up to a point.
00:16:05.300
If you get too close to something that looks like a human, but you don't quite match what a human should be, it creeps people out.
00:16:12.680
So people will like the thing, the more lifelike that it gets.
00:16:15.660
And then once it gets too close, the likability of it drops. It's like zombies or something,
00:16:21.940
like something that's human, but not quite human really creeps us out.
00:16:25.560
And it doesn't go back up again until you can perfectly, absolutely replicate a human.
00:16:34.520
And I like to think about it less in terms of the uncanny valley and more in terms of expectations.
00:16:42.920
So I think that if we see something that looks human, we expect it to act like a human.
00:16:48.200
And if it's not quite up to that standard, I think it disappoints what we were expecting
00:16:56.900
And that's a principle that I see in robot design a lot.
00:17:00.880
So a lot of the really, I think, compelling social robots that we develop nowadays are
00:17:06.360
not designed to look like something that you're intimately familiar with.
00:17:10.380
Like I have this robot cat at home that Hasbro makes, and it's the creepiest thing.
00:17:15.700
Like, because it's clearly not a real cat, even though it tries to look like one.
00:17:20.960
So it's very unlovable in a way. But I also have this baby dinosaur
00:17:27.780
robot that is much more compelling because I've never actually interacted with a two-week-old baby dinosaur.
00:17:34.720
So it's much easier to suspend my disbelief and actually imagine that this is how a dinosaur would behave.
00:17:41.760
Um, so yeah, it's interesting to see how, you know, the whole Westworld concept,
00:17:48.880
you know, before we could even get there, we would really need to have robots that are
00:17:53.360
so similar to humans that we wouldn't really be able to tell the difference.
00:17:58.120
What is the state of the art in terms of humanoid robots at this point?
00:18:02.000
I mean, I've never actually been in the presence of any advanced robot technology.
00:18:08.620
There are some Japanese androids that are pretty interesting.
00:18:15.100
To me, they're not out of the uncanny valley yet, but there's also
00:18:19.920
some conversation about whether the uncanny valley is cultural or not.
00:18:24.000
And also I think some research on that, which I don't think is very conclusive, but it might
00:18:28.680
be that in some cultures, you know, like in Japanese culture, people are more accepting
00:18:35.100
of robots that look like humans but aren't quite there, because, you know, people
00:18:42.160
say that there's this religious background to it, that the Shinto religion, the belief
00:18:48.180
that objects can have souls makes people more accepting of robotic technology in general.
00:18:52.940
Whereas in Western society, we're more creeped out by this idea that a machine
00:19:01.180
could, you know, resemble a living thing in a way. But, yeah, I'm not really sure.
00:19:06.780
And I mean, you should check out the androids that Ishiguro in Japan is making.
00:19:13.420
He made one that looks like himself, which is interesting to think about, you know, his own
00:19:24.260
I think, you know, just from a photograph, you might not be able to tell the difference,
00:19:29.720
So do you think we will get to a Westworld level life likeness long before we get to
00:19:39.100
the AI necessary to power those kinds of robots?
00:19:43.780
I mean, do you have any intuitions about how long it will take to climb out of the uncanny valley?
00:19:49.740
Honestly, I'm not as interested in, you know, how we completely replicate humans.
00:19:55.840
Because I see so many interesting design things happening now where that's not necessary.
00:20:01.280
Like, we can already do this, and robotic technology is very primitive right now.
00:20:07.460
I mean, robots can barely operate a fork, but we can create characters that people will treat like living things.
00:20:15.580
And while it's not quite Westworld level, if we move away from this idea that we have to create
00:20:22.500
humanoid robots, and we create, you know, a blob or some other shape, we have a century of animation
00:20:29.880
expertise to draw on in creating these compelling characters that people can relate to.
00:20:37.860
And I think that that's, you know, much more interesting.
00:20:41.200
I think much sooner than, than Westworld, we can get to a place where we are creating
00:20:45.980
robots that people will consistently treat like living things, even if we know that they're
00:20:51.560
I guess my fixation on Westworld is born of the intuition that something fundamentally
00:20:56.260
different happens once we can no longer tell the difference between a robot and a person.
00:21:03.660
Maybe this change and all of its ethical implications comes sooner when, as you say, we have a blob
00:21:10.500
that people just find compelling enough to treat it as though it were alive.
00:21:15.880
It just seems to me that Westworld is predicated on the expectation that people will want to
00:21:21.120
use robots in ways that would truly be unethical if these robots were sentient.
00:21:28.600
But because on assumption, or in fact, they will not be sentient, this becomes a domain
00:21:34.640
of creative play analogous to what happens in video games.
00:21:39.380
If you're playing a first-person shooter video game, you are not being unethical by shooting the characters in it.
00:21:46.640
And the more realistic the game becomes, the more fun it is to play.
00:21:52.180
And there's this sense that, I mean, while some people have worried about the implications
00:21:56.100
of playing violent video games, all the data that I'm aware of suggests they're really not
00:22:01.480
bad for us and crime has only gone down in the meantime.
00:22:04.820
And it seems to me that there's no reason to worry that as that becomes more and more
00:22:08.600
realistic, even with virtual reality, it's going to derange us ethically.
00:22:13.420
But watching Westworld made me feel that robots are different.
00:22:18.500
Having something in physical space that is human-like to the point where it is indistinguishable
00:22:24.800
from a human, even though you know it's not, it seems to me that we'll begin to compromise ourselves.
00:22:35.660
We'll not only feel differently about ourselves and about other people who mistreat them, we
00:22:41.180
will be right to feel differently because we will actually be changing ourselves.
00:22:47.140
You'd have to be more callous than, in fact, most people are to rape or torture a robot
00:22:55.640
that is, in fact, indistinguishable from a person because all of your intuitions of being
00:23:01.360
in the presence of personhood, of being in relationship, will be played upon by that robot even though
00:23:06.660
you know that it's been manufactured and, let's say, you've been assured it can't possibly be conscious.
00:23:13.440
So the takeaway message from watching Westworld for me is that Westworld is essentially impossible.
00:23:19.040
I mean, we would just be creating a theme park for psychopaths and rendering ourselves more
00:23:25.620
and more sociopathic if we tried to normalize that behavior.
00:23:29.360
And I think what you're suggesting is that long before we ever get to something like
00:23:34.380
Westworld, we will have, and may even have now, robots that, if you were to mistreat
00:23:40.560
them callously, you would, in fact, be callous.
00:23:44.000
And you'd have to be callous in order to do that.
00:23:46.420
And you're not going to feel good about doing it if you're a normal person.
00:23:49.620
And people won't feel good watching you do it if they're normal.
00:23:53.620
Yeah, I mean, we already have some indication that people's empathy does correlate with how
00:23:59.760
they're willing to treat a robot, which is just super interesting.
00:24:02.820
If you'd like to continue listening to this conversation, you'll need to subscribe at samharris.org.
00:24:08.120
Once you do, you'll get access to all full-length episodes of the Making Sense podcast, along with
00:24:12.840
other subscriber-only content, including bonus episodes and AMAs and the conversations I've
00:24:19.660
The Making Sense podcast is ad-free and relies entirely on listener support.