Making Sense - Sam Harris - May 26, 2017


#78 — Persuasion and Control


Episode Stats

Length

40 minutes

Words per Minute

154.8

Word Count

6,320

Sentence Count

348

Misogynist Sentences

3

Hate Speech Sentences

3


Summary

Zeynep Tufekci is a contributing opinion writer at The New York Times and an associate professor at the University of North Carolina at Chapel Hill, with an affiliate appointment in the Department of Sociology. She is also a faculty associate at the Harvard Berkman Center for Internet and Society and was previously a fellow at the Center for Information Technology Policy at Princeton University. Her research interests revolve around the intersection of technology and society: her academic work focuses on social movements, privacy and surveillance, and social interaction, and she is increasingly known for her work on big data and algorithmic decision-making. She is the author of Twitter and Tear Gas: The Power and Fragility of Networked Protest. Originally from Turkey and formerly a computer programmer, she has taken that background in interesting and increasingly relevant directions. In this episode, she and Sam discuss how she came to be an expert on cybersecurity, social persuasion, and organizing movements through social media, along with information security, WikiLeaks, and the fake news phenomenon, all increasingly relevant as we depend more and more on the internet and draw our beliefs about reality from what happens there. The podcast runs no ads and is made possible entirely through the support of its listeners; if you enjoy it, please consider becoming a subscriber of the Making Sense Podcast at samharris.org.


Transcript

00:00:00.000 Welcome to the Making Sense Podcast.
00:00:08.820 This is Sam Harris.
00:00:10.880 Just a note to say that if you're hearing this, you are not currently on our subscriber
00:00:14.680 feed and will only be hearing the first part of this conversation.
00:00:18.420 In order to access full episodes of the Making Sense Podcast, you'll need to subscribe at
00:00:22.720 samharris.org.
00:00:24.060 There you'll find our private RSS feed to add to your favorite podcatcher, along with
00:00:28.360 other subscriber-only content.
00:00:30.620 We don't run ads on the podcast, and therefore it's made possible entirely through the support
00:00:34.640 of our subscribers.
00:00:35.900 So if you enjoy what we're doing here, please consider becoming one.
00:00:46.620 Today's guest is Zeynep Tufekci.
00:00:49.940 She is a contributing opinion writer at the New York Times, and she is an associate professor
00:00:54.540 at the University of North Carolina at Chapel Hill, with an affiliate appointment in the
00:00:59.240 Department of Sociology.
00:01:00.820 She is also a faculty associate at the Harvard Berkman Center for Internet and Society, and
00:01:05.640 was previously a fellow at the Center for Information Technology Policy at Princeton University.
00:01:10.760 And her research interests revolve around the intersection of technology and society.
00:01:16.420 Her academic work focuses on social movements, privacy and surveillance, and social interaction.
00:01:21.520 She's also increasingly known for her work on big data and algorithmic decision-making.
00:01:27.900 And she's originally from Turkey and formerly a computer programmer, but has taken that background
00:01:35.280 in interesting and increasingly relevant directions.
00:01:40.140 And she's the author of Twitter and Tear Gas, The Power and Fragility of Networked Protest.
00:01:45.800 And we get into many interesting topics here, relevant to information security, and things like
00:01:54.300 WikiLeaks, and ransomware attacks, the fake news phenomenon, all increasingly relevant as we depend
00:02:05.480 more and more on the internet and draw our beliefs about reality from what happens there.
00:02:10.560 So, without further preamble, I bring you Zeynep Tufekci.
00:02:21.460 I am here with Zeynep...
00:02:23.500 Jesus, Zeynep.
00:02:24.640 I know.
00:02:25.720 I swear to you that no bungling is a fail because there is no baseline.
00:02:31.860 So, go for it.
00:02:32.680 The mind was willing, but the tongue failed.
00:02:36.300 I don't blame you.
00:02:37.560 It really is.
00:02:38.280 It's an isolate language.
00:02:40.060 It has no relatives either.
00:02:41.560 We're a freak of nature language, so...
00:02:43.860 All right.
00:02:44.380 Well, I am here with Zeynep Tufekci.
00:02:47.420 Zeynep, thanks for coming on the podcast.
00:02:49.560 Thank you for inviting me.
00:02:51.240 So, we met at Banff at the TED Summit where we actually, we were in the same session.
00:02:58.020 We both gave talks on AI.
00:02:59.720 I gave a talk on sort of the further future and possible, very scary outcome.
00:03:06.260 And you gave a talk on the present, the way in which AI is becoming increasingly a topic
00:03:15.360 of concern.
00:03:16.640 We're not talking hypothetical human intelligence AI.
00:03:20.180 We're talking specialized AI that can do many good things, but also many undesirable things
00:03:27.000 if we're not careful.
00:03:27.740 Well, I'm sure we'll touch on that.
00:03:29.900 But before we do, just introduce yourself to people, because you have a kind of an interesting
00:03:34.700 backstory.
00:03:35.580 You didn't get into this by the most conventional path.
00:03:39.040 How did you come to be an expert on the sorts of things we're going to talk about in cybersecurity
00:03:44.160 and social persuasion and organizing movements through social media?
00:03:50.040 Who are you, Zeynep?
00:03:50.940 That's a very existential question to begin with.
00:03:56.060 Who am I?
00:03:57.400 Well, I'm not sure who I am, but I can describe the path that took me here.
00:04:05.080 As you say, it is a little unconventional, partly because I'm the product of a historical
00:04:10.100 transition, right?
00:04:11.120 I'm still of the generation that grew up without the internet, especially since I grew up in the
00:04:16.960 Middle East, and I grew up in Turkey, which at the time was ruled in the aftermath of a
00:04:23.740 military coup, which had brought about very heavy censorship.
00:04:28.400 So I grew up watching a single TV channel, which made me acutely aware of censorship, especially
00:04:33.360 since that TV channel didn't show us anything that seemed to be in the news or relevant to
00:04:38.060 the country.
00:04:38.540 Instead, we would watch Little House on the Prairie, because that's what they showed, and
00:04:43.260 it made no sense in Turkey, but it didn't matter.
00:04:45.480 And I started out as a kid really interested in math and science and physics and all the
00:04:53.980 things that kind of geeky kids are interested in, and I really enjoyed learning about it.
00:05:02.720 But early on in my life, I got terribly concerned about the ethical implications of technology, especially
00:05:09.860 since what happens to many kids like me happened to me too.
00:05:13.400 I call it the atom bomb problem for kids into science.
00:05:19.340 At some point, your excitement hits this wall because you learn about the atom bomb and that
00:05:24.820 it was enabled by great physics.
00:05:27.900 The very physics that you admire and think is amazing is also what enabled this, and so
00:05:32.960 you go into this tailspin.
00:05:34.380 And to my, I guess, kind of failure of imagination, I thought computers would be a great topic for
00:05:42.600 me because they would have fewer ethical implications.
00:05:46.420 And I also needed to get into a job quickly because not only did I grow up in Turkey, I grew
00:05:52.760 up in a pretty dysfunctional, broken home.
00:05:55.540 I needed to start working as soon as I could, started working literally as a 13-year-old and
00:06:02.000 then as a programmer as early as a teenager, 16, 17.
00:06:06.520 So I had this sort of very unusual path that I found myself in a technical job in a country
00:06:13.000 still under pretty significant censorship and closed public sphere without the internet.
00:06:20.140 And I found myself, because of my technical job, I found myself sort of glimpsing the future,
00:06:28.800 kind of this parallel existence where I'd work at IBM, which had this amazing intranet
00:06:35.380 that allowed me to talk with people around the world almost as an equal, right?
00:06:39.920 And, you know, I'm a, I mean, here, there I am, a teen girl, kind of with all that goes
00:06:44.960 into it in a country like that, pretty much anywhere in the world too.
00:06:48.640 But I'm on the intranets kind of as a person and taken seriously.
00:06:55.720 It was just, you know, the early promise of the internet people kind of laugh at right
00:06:59.140 now.
00:07:00.060 It had a reality to it.
00:07:02.060 So I sort of just got enchanted by this possibility.
00:07:07.460 And I was also fairly interested in how do we bring about change?
00:07:11.040 How do we bring more freedom?
00:07:12.140 How do we bring more compassion and reason to the world?
00:07:16.160 And I thought, this is great.
00:07:17.160 This is going to change everything.
00:07:19.500 And I really wanted to study it.
00:07:21.300 And I switched to sociology kind of along the way, but not knowing what exactly would make
00:07:28.800 sense, right?
00:07:29.480 I was just trying to find my way.
00:07:32.660 And because such things often happen in the United States, I found a way, I kind of stumbled
00:07:37.700 into graduate school in the United States, trying to understand all this better.
00:07:43.060 And in the meantime, as I was struggling with trying to understand and think through, the
00:07:48.280 world was progressing.
00:07:49.720 You know, we started having, you know, more and more digital connectivity.
00:07:53.160 And sometime, I think around 2004, I stopped having to explain why computer science, computer
00:08:01.140 programming and sociology and social science were related because Facebook happened.
00:08:06.000 And it was, I think, the first time that a lot of people who are not specialists kind of had this
00:08:13.340 very visceral reaction to how their social world is being changed by this new platform.
00:08:20.580 Questions of privacy and other things became very prominent in people's mind.
00:08:24.960 And then fast forward a little bit, Arab Spring happened, which is exactly what I studied, social
00:08:31.220 change and social movements.
00:08:32.300 I started studying that.
00:08:33.840 And then the Gezi Park protests happened, which again happened three blocks from my place of
00:08:40.140 birth, that close to home.
00:08:42.400 So I went there.
00:08:43.860 And now I'm, you know, trying to focus on the future and understand how the methods in both
00:08:52.520 artificial intelligence, like machine learning and the Silicon Valley business models and
00:08:57.360 the world we are in, politically speaking, what does this intersection mean?
00:09:01.860 You know, how do we understand the rise of authoritarianism?
00:09:04.840 How do we think about technology's role in all of this?
00:09:07.720 And the security part that you mentioned, I got into it partly because I work with so many
00:09:12.520 people in social movements and journalists that they're kind of like the canary in the
00:09:17.140 coal mine: the insecurity of the internet affects them earlier on because they're targeted.
00:09:24.040 So I got into that part too.
00:09:27.000 I guess I'm a mutt in, you know, all of these particular fields.
00:09:31.260 It's that intersection.
00:09:32.640 And it turns out it's a relevant intersection.
00:09:35.220 And so here I am, to the degree one can answer this question of what I'm studying right
00:09:40.600 now and what I'm thinking about right now.
00:09:42.020 Well, it's all too relevant and only becoming more so.
00:09:44.920 And as you say, the first blush of enthusiasm for the internet connecting us all as an unambiguous
00:09:51.960 good, that has faded.
00:09:54.520 And now we're discovering that as this technology connects people and empowers us, it's also
00:10:01.740 fragmenting us in ways that are fairly difficult to correct for.
00:10:06.040 And it's creating new levers of influence that could lead to more authoritarian control
00:10:12.860 and perverse forms of persuasion.
00:10:16.740 And you told me in the setup to this that you were worried about something you've called
00:10:20.000 surveillance capitalism.
00:10:22.380 How do you think about that?
00:10:23.320 What is surveillance capitalism?
00:10:24.560 So here's what I think about this.
00:10:26.520 We have this scary convergence of a couple of events.
00:10:31.540 One of them is the business model on the internet for the sort of platforms that most of us use,
00:10:40.640 like the Facebooks and Twitters of the world.
00:10:43.220 It's capturing our attention and persuading us to, at the moment, click on ads.
00:10:50.780 So there's an enormous amount of brainpower going into how to make us buy 0.003 more shoes
00:10:56.640 per person on average.
00:10:57.860 You have this whole infrastructure that is collecting our data, that is doing hundreds
00:11:04.540 of thousands of dynamic tests on the platform just to persuade us to act in a particular
00:11:11.080 way for commercial reasons, right?
00:11:14.560 To make us purchase things.
00:11:16.840 And this is happening increasingly through technologies that are like machine learning, which is a form
00:11:26.340 of computer programming that is different than the past in that we don't program it anymore.
00:11:33.180 We feed the machines a lot of data and they create these large matrices and calculate certain
00:11:38.920 things.
00:11:39.420 And just like the brain, that we can't really see what a person is thinking if we slice their
00:11:44.540 brain.
00:11:45.420 With machine learning, you don't really see exactly what's going on.
00:11:48.700 It just spews out classifications.
00:11:50.820 It says, do this, do that, do this, do that.
00:11:53.140 It's probabilistic, but it works pretty strikingly well for the things that we're using it for.
00:12:00.840 But it needs data to work, which means that we have a business model that is set both to figure
00:12:08.940 out how to exactly push our buttons and also to use an enormous amount of data that is surveilled
00:12:19.440 from us asymmetrically.
00:12:22.120 You don't get to see what they have.
00:12:24.400 And this enormous amount of data can also be used to deduce things about us that we haven't
00:12:32.200 disclosed, right?
00:12:33.340 It's not just invading our privacy directly.
00:12:35.540 When you have that much data, you can use computational inference to figure out who you think is the
00:12:41.320 troublemaker, who's depressed, who might be on a manic swing in a manic depressive cycle.
00:12:48.180 You can figure all these things out even if people don't disclose them or even know them,
00:12:52.660 right?
00:12:54.040 So this is kind of where things are at, this convergence.
00:12:59.060 And the thing I fear is that this is a perfect setup for authoritarians because it allows
00:13:07.980 them to surveil the population and to nudge them and shape their opinions using this asymmetric
00:13:17.340 amount of information that can figure things out, and using machine learning at scale.
00:13:21.760 That means you're like individually experimented on, figured out how to exactly scare you, how to
00:13:27.300 fear monger, how to, uh, when you're vulnerable, uh, and what you're vulnerable for.
00:13:33.760 And then this will come into politics as well.
00:13:37.000 And there's nothing wrong with persuasion as a form of politics, but it's not happening openly,
00:13:42.480 right?
00:13:42.860 It's happening person by person.
00:13:44.680 It's happening in the dark.
00:13:46.700 Uh, you don't see what other people are seeing.
00:13:49.120 You don't see what is being targeted at you.
00:13:52.860 And think of China, right?
00:13:55.360 With, uh, hundreds of millions of people online and it's not like they censor everything.
00:14:00.240 They censor a few things, but we know from research, they usually don't censor government
00:14:04.780 criticism.
00:14:05.940 I feel like it might've even made them more stable because an authoritarian's blind spot
00:14:09.900 is not knowing what people are up to.
00:14:12.060 And this is perfect for knowing exactly what people are up to and individually pushing their
00:14:17.980 buttons.
00:14:18.340 So I find this really ironic that the Silicon Valley business model and the Silicon Valley
00:14:25.040 workforce, which is uniquely, um, liberal or progressive or libertarian in general, pro-science,
00:14:34.840 empirically oriented, you know, they're geeky in many ways.
00:14:39.340 And I say it as a positive, I find that's my tribe too.
00:14:42.580 We may well be building the infrastructure of authoritarianism.
00:14:48.560 And I think they're under this impression that they'll never lose control of these tools,
00:14:53.640 that they built them and they won't let them be used for evil, so to speak.
00:14:58.640 And I look at history and that's never how it works.
00:15:03.220 Um, you build the infrastructure, it gets taken over by the people with money, with power,
00:15:08.740 with, uh, authority.
00:15:10.160 So that's kind of what I've started really worrying about.
00:15:14.000 Uh, my first book was about social movements, social change and digital tech and the complexities
00:15:18.820 there.
00:15:20.460 I'm now thinking, let's look at this from the point of view of power, the powerful, not
00:15:24.520 the challengers.
00:15:25.480 You know, we spent a lot of time thinking about digital media and digital technology and challengers.
00:15:29.060 We really need to start thinking about digital technology and the powerful and how they're converging
00:15:35.500 historically.
00:15:36.180 Well, let's take parts of this problem.
00:15:38.880 That's all fascinating.
00:15:40.100 And I've been thinking a lot about the way in which digital media is co-opting our attention
00:15:46.240 and causing us to spend our lives in ways that we will later regret.
00:15:51.320 And, and actually had another guest on the podcast, Tristan Harris, who spoke a lot about
00:15:55.300 that.
00:15:55.700 He's great on that.
00:15:56.560 Yeah.
00:15:56.720 Yeah.
00:15:57.320 Yeah.
00:15:57.620 But I haven't really thought as much about the authoritarian misuse of this.
00:16:03.380 I mean, obviously the, the, there's a lot in the news and a lot of talk about fake news
00:16:08.780 and the Russian meddling in our election.
00:16:10.920 And we should probably get to that.
00:16:12.980 So there, there's been obvious political issues here, but what's your view on social media in
00:16:20.740 particular?
00:16:21.140 I mean, I noticed you use Twitter with a fair degree of enthusiasm.
00:16:24.940 I see you have 74,000 tweets.
00:16:27.480 I do.
00:16:28.140 It's also sort of my research area.
00:16:30.260 So it's kind of, it's a special thing that I'm usually watching things on Twitter too.
00:16:36.080 So I have this dual thing.
00:16:37.740 I may be keeping an eye on it as part of my research project.
00:16:40.960 I think I would use it less if it weren't part of my research.
00:16:44.020 In fact, I don't do Facebook research as much and I'm on Facebook a lot less, partly because
00:16:50.240 as Tristan points out, it's a medium designed to capture your attention, right?
00:16:57.980 And it's a medium, like every incentive there is to try to capture your attention.
00:17:02.560 And there are times when I'm fine with that, but how do you keep autonomy and agency in an
00:17:09.200 architecture that's designed to get you to do something that maybe you don't want to
00:17:15.500 do if you ask me in the morning, right?
00:17:17.200 I might be wanting to do it then, but if you ask me in the morning, is this how much of my
00:17:22.800 day I want to spend?
00:17:23.780 So I try to sort of judge that.
00:17:26.600 And outside of my own research and my job, researching this stuff, I try to be sort of
00:17:30.720 more mindful of when am I not going to be on this?
00:17:33.620 And how am I going to relate to these technologies that I know are designed to grab me?
00:17:43.080 One of the things I've started trying to do is not use services if there's an alternative
00:17:47.940 that I don't pay for.
00:17:51.460 I feel like I want to be the consumer.
00:17:54.560 I want to be the one they're catering to rather than being the person whose attention they want
00:18:00.440 to grab so they can sell to people trying to manipulate me into buying 0.003 more shoes.
00:18:06.860 So that's kind of, it's part, and the problem is, of course, it's part of life.
00:18:11.280 I work with refugees and I do, you know, I try to sort of, the unluckiest people, right?
00:18:17.900 I try to sort of see if I can be of some help.
00:18:21.060 And I couldn't do that work if I weren't on Facebook because that's where the groups are
00:18:25.160 and that's where the organization goes on a lot of times.
00:18:28.560 So, you know, to be in the civic world today, you use these platforms because that's where
00:18:34.420 billions of people are.
00:18:35.920 On the other hand, they're not designed with the kind of goals I have in mind when I'm engaging
00:18:41.380 the world.
00:18:42.220 And it's this huge challenge.
00:18:43.820 It's this huge tension.
00:18:45.440 And it feeds into what I just said, which is that the people in power are increasingly
00:18:49.900 looking at this world and saying, what can we do with this?
00:18:54.420 How can we use it to consolidate power?
00:18:58.980 Do you have any thoughts about what recently happened in the election and the role that
00:19:03.840 social media played there?
00:19:05.420 And then the larger fake news phenomenon and just this issue about, with respect to how
00:19:10.140 we are getting siloed in echo chambers.
00:19:13.880 Absolutely.
00:19:14.260 It's like it's an illusion of being open to information.
00:19:17.980 But in fact, people are just ramifying their worldview by use of these tools.
00:19:25.440 Yeah.
00:19:25.640 So let me just say a couple of things.
00:19:27.300 With a lot of these tools, if you talk to the companies, the first thing they will tell
00:19:31.780 you is, that's what people are doing.
00:19:34.360 You know, that's not us.
00:19:35.400 Now, on the one hand, it's certainly true that this is driven by people, right?
00:19:45.600 This is like, you would not, it would not be fair to say that, you know, the social media
00:19:50.040 platforms are generating this from whole cloth.
00:19:54.620 They're not.
00:19:55.080 It's more like we have certain human tendencies.
00:19:59.780 We have, you know, if we see something that we agree with, it's more pleasant.
00:20:05.400 If we see something we're angry about, it kind of captures our attention.
00:20:12.680 And you see this in the research on perception, right?
00:20:15.780 When you look at a crowd, you're a lot more likely to notice the angry face.
00:20:21.840 Because, I mean, in an in-person setting, if you think about the evolutionary process,
00:20:26.440 if you live in a small group, it kind of probably made sense to know exactly
00:20:32.340 who was mad at you, because that could be a threat.
00:20:34.740 So there are all these things that we already have tendencies for.
00:20:39.360 I liken it to having an appetite for sugar and salt, right?
00:20:43.020 It's a perfectly reasonable thing, given our evolutionary history, to be into sugar and
00:20:49.320 salt.
00:20:49.700 The problem is, very rapidly, without any time to adjust, forget evolutionarily, culturally,
00:20:55.040 we have shifted to a world where we're not just supplied with
00:21:03.840 extra sugar and salt; social media platforms use sugar and salt to keep you there, kind of
00:21:10.420 like a salt lick used to shoot animals.
00:21:12.840 But instead of shooting us, they're just capturing our attention.
00:21:18.120 They're selling us shoes.
00:21:18.980 And that's, I think, a big part of what's going on with the election, too.
00:21:23.840 What happened is we got siloed, of course.
00:21:28.940 And because of my work and because of my sensitivities to authoritarians, I guess, I started following
00:21:37.380 social media, Trump social media, very early on.
00:21:39.980 Because I thought, whoa, this is an interesting thing.
00:21:42.920 And I argued he was viable when everybody was laughing at him, exactly because I was following
00:21:47.660 his base on social media.
00:21:50.640 And a couple of things happened.
00:21:52.260 I saw how and why he resonated.
00:21:55.720 I also saw an enormous amount of misinformation that ranged from distortions to fake news sort
00:22:02.160 of proliferate there.
00:22:03.240 I also saw that once when I wasn't making a conscious effort to follow these people, which
00:22:10.920 I did as a part of work, I did it every day for, you know, almost two years now.
00:22:16.440 Like if I went on Facebook, I had friends who were Trump supporters, although they were in
00:22:22.540 the minority because, you know, I'm a college professor in a blue part of a purple state.
00:22:27.440 And it kind of makes sense for most of my friends not to be Trump supporters.
00:22:30.980 But I have friends from middle school and elsewhere, and some of them, turns out, were sympathetic
00:22:35.520 to Trump.
00:22:36.880 I never saw their posts, right?
00:22:39.220 I just sort of thought about it, you know, halfway through.
00:22:43.060 And I'm like, whoa, do I not have a single person I friended on Facebook?
00:22:47.480 Because I friend lots of people, and Facebook is not very personal for me.
00:22:51.560 And I had to hunt them.
00:22:52.940 I guessed them.
00:22:53.740 And I hunted their posts down.
00:22:55.800 And yeah, there were people who had sympathy.
00:22:58.720 And Facebook's algorithm never showed it to me.
00:23:03.260 And I'm guessing it's not, I mean, obviously, it's not a conscious decision.
00:23:06.720 Once again, it's these machine learning algorithms.
00:23:08.380 They know that if you give people sugar and salt, which I just, in case of Facebook, for
00:23:14.820 me, it's cuddly stuff or outrageous stuff, right?
00:23:20.180 Babies, cats, cute things, happy news.
00:23:23.460 Also things we're angry about, outrageous.
00:23:25.920 Babies eating cats.
00:23:26.920 I think those both polar sides attract attention.
00:23:32.060 So they just feed us that.
00:23:34.140 And I think that's really destructive, especially given it's a way to make money for people.
00:23:39.800 So you could just be a spammer and figure out, hey, look, I can just feed people fake news
00:23:45.900 about Hillary Clinton.
00:23:47.440 That's what a lot of people did.
00:23:48.800 I interviewed a bunch of these people.
00:23:50.280 Some of them were even liberals.
00:23:51.600 They were just like, it works.
00:23:53.240 It spreads.
00:23:53.980 And we make money from Facebook.
00:23:55.180 So not only does it allow, not only does the algorithm kind of amplify this, it allows
00:24:01.620 you to make a lot of money from doing exactly this.
00:24:05.780 And I'm not saying mass media was ever perfect.
00:24:08.680 Many failures there.
00:24:10.140 But this is a whole new world with no checks on, no ethos against, this kind of misinformation.
00:24:20.760 So about four or five years ago, five years ago now, I wrote this article for the New York
00:24:30.540 Times worried about the Obama campaign's use of data, right?
00:24:35.840 Because they were already sort of developing all these methods to target people and to try
00:24:40.800 to persuade people using statistical data they had on them.
00:24:44.820 And I said, look, you know, I understand campaigns want to win, but this kind of, you know, asymmetric
00:24:50.720 accumulation of data where it goes far beyond just, you know, which magazine you're subscribed
00:24:57.100 to.
00:24:57.660 And the kind of smart targeting has the potential to gerrymander us down to the person and have
00:25:10.060 politics only be about people who are persuadable and all these sort of downside effects of having
00:25:16.080 the public sphere become more and more private and more and more asymmetric in how it operates.
00:25:22.600 And I got a little pushback from people in the Obama campaign and people in the data science
00:25:27.960 world.
00:25:28.340 And one of the things I was told was, one, I was told this will always be on the side of
00:25:34.780 people we like, people told me.
00:25:38.860 They said, you know, this is something that, you know, people who like science, people who
00:25:44.140 like data, people are empirical.
00:25:45.860 Um, this is only going to be their tool because the other side, they told me it doesn't like
00:25:52.560 data.
00:25:52.900 They can't do this.
00:25:54.320 The second thing I was told was, um, this is just a form of persuasion, no different than
00:26:00.440 any other.
00:26:01.400 Now, fast forward just four years after that.
00:26:05.760 And what I saw was, um, in the 2016 election, Ted Cruz's data people ended up being Donald
00:26:16.940 Trump's data people.
00:26:18.180 And I'm going to recount something they claimed they did.
00:26:21.840 Now, I don't have access to the internal data, so I can't vouch they did this, but I have some
00:26:27.280 independent evidence that they at least tried.
00:26:30.240 But just outlining is enough to explain what the issue is.
00:26:34.360 So they claim that they used people's Facebook likes and other kind of indicators, social
00:26:43.040 media data, uh, or whatever it is they use because the social media data is very good
00:26:47.580 for this to try to figure out people's psychological profile.
00:26:51.700 Now we know from research that if I just have, um, say what you liked on Facebook or even just
00:26:59.540 your tweet stream, uh, we can guess using these sort of complex algorithms, we can guess with
00:27:05.540 pretty high probability, um, where you fit on the big five personality traits like neuroticism,
00:27:13.060 extroversion, et cetera.
00:27:14.900 We can guess your sexual orientation.
00:27:16.720 We can guess whether you're religious and what religion.
00:27:19.360 We can guess a lot of things, even if you never disclose them, right?
00:27:23.540 These are not things that you needed to have put on your profile.
00:27:26.060 So we can figure this out.
00:27:27.640 And we know also from research that some people will vote more authoritarian if they're scared.
00:27:33.980 Other people get pissed off at fear mongering.
00:27:36.720 And the problem with advertising on TV is, you know, you're advertising to everyone at the
00:27:43.960 same time, right?
00:27:45.680 But what if you could go on Facebook and target only the people that would be prone to a particular
00:27:52.720 kind of message, say fear mongering.
00:27:54.900 Now, again, because Facebook won't tell us, we don't know the exact story here, but Donald
00:28:00.620 Trump's campaign claims they tried to demobilize particular segments of the population against
00:28:08.500 voting.
00:28:09.580 So it's important.
00:28:11.180 This isn't persuasion.
00:28:12.100 They weren't trying to persuade them to vote for him.
00:28:15.040 They were just trying to tell them Hillary Clinton's just as bad, stay home.
00:28:20.860 And for example, one of the targeted constituencies was black men in Philadelphia, and Philadelphia
00:28:25.760 mattered because there was just very little difference in Pennsylvania, which was a major electoral
00:28:32.960 gain for Trump, right?
00:28:34.680 And I have independent confirmation they did target black men in Philadelphia.
00:28:38.960 We've come across instances.
00:28:41.020 So what they tried to do was to demobilize those people.
00:28:44.880 What did they tell them?
00:28:46.260 We don't know, right?
00:28:47.440 Only Facebook knows.
00:28:49.280 Did they tell them things that were correct, things that were false, things that were completely
00:28:54.160 made up of whole cloth?
00:28:56.060 Were they just scary commercials?
00:28:57.320 Who knows?
00:28:58.020 They were just targeted at them.
00:29:00.280 And so the census data from the election just came in.
00:29:05.760 And it's pretty clear that the biggest difference between 2012 and 2016 is the black turnout in
00:29:17.740 the country was depressed in lots of places.
00:29:20.760 Now, clearly, there are multiple possible explanations for this.
00:29:23.540 It could be the Obama effect has worn off, right?
00:29:27.840 It's kind of reasonable to expect the first African-American president would gather a bigger
00:29:32.320 share and enthusiasm from the African-American population.
00:29:36.200 It could be that part of it is these strict voter suppression-oriented laws that cut the amount
00:29:42.920 of hours, that cut the number of voting machines in minority districts.
00:29:48.100 It could be the gerrymandering.
00:29:50.300 It could be the voter ID laws that are especially problematic with elderly black people who don't
00:29:55.420 necessarily have the birth certificates and et cetera.
00:29:58.420 But it could also be this.
00:30:00.400 We could also have a world in which large segments of the population were psychologically profiled and
00:30:08.180 otherwise profiled and silently targeted through Facebook dark ads in a way that would push their buttons and do it one by one.
00:30:18.120 Like, if you needed people to figure out what everybody needed, you'd never manage it.
00:30:22.980 Because to target 100,000 people, you'd need 10,000 people.
00:30:26.580 Whereas right now, we're at a world where machine learning is designing machine learning experiments to experiment on us.
00:30:33.340 It's already out of our control, right?
00:30:34.740 And you can do this at scale.
00:30:35.740 You can figure out people one by one using this technology.
00:30:40.400 So what if that is part of what swung a very close election?
00:30:44.880 Clearly, it's multi-causal.
00:30:46.200 So anything could have swung it.
00:30:47.380 But what if this is part of what made the difference?
00:30:49.560 Now, this is a small example.
00:30:51.920 And the question, I mean, the objection I hear to this is they probably didn't manage this.
00:30:57.640 My answer is, well, we don't know.
00:31:01.440 And if they didn't manage it, this is where things are going.
00:31:05.200 You see what I'm saying?
00:31:06.000 This is what my concern with surveillance capitalism meets authoritarianism is,
00:31:11.880 that the business model of capturing your attention, profiling you,
00:31:16.060 and trying to persuade you to buy that extra shoe is very compatible with a manipulative public sphere
00:31:24.400 where you don't get to see what is even contested because it's so segmented person by person.
00:31:30.740 And then buttons are pushed person by person.
00:31:33.460 That make sense?
00:31:34.540 Yeah, yeah.
00:31:35.140 It's all very interesting.
00:31:36.000 I think people, most people at first glance, will understand what's wrong with targeting people,
00:31:42.900 however individually, with fake information, with lies, with fake news stories,
00:31:49.040 and persuading them that way.
00:31:51.440 That's clearly a problem, and we have to figure out some way to correct for it.
00:31:54.860 But as you said earlier, persuasion is just persuasion.
00:31:59.640 There's nothing wrong in principle with persuasion.
00:32:01.740 And so it may not be clear to people why there is a special concern around the segmentation
00:32:10.220 of the population with these tools when you are validly persuading them.
00:32:15.800 Well, even if you're validly persuading people, right?
00:32:19.280 Even if you're just sort of, I mean, in some ways, obviously, this is just more of what
00:32:25.560 just political campaigners and marketers and everybody have always tried to do, right?
00:32:31.020 In many ways, there is no difference from what they try to do.
00:32:36.760 The big difference is it's doable now, right?
00:32:41.040 This is what past marketers, you can go back and you can look at, you know, sort of how political
00:32:45.780 campaigns have always tried to do this.
00:32:48.000 I'm just reading this, Rick Perlstein's biography of Goldwater, and he's got a campaign manager
00:32:53.660 that's saying, the indifferent, we've got to target the indifferent, and he has to figure
00:32:57.920 out who they are and what's, you know, how to target these people.
00:33:05.900 They had baseball bats.
00:33:07.420 They could advertise on TV.
00:33:09.440 They could just try to send a message out.
00:33:12.220 And it was really difficult to send the message out to one person and not the other, and to
00:33:18.740 push one person's buttons without upsetting the other.
00:33:22.160 And also, because it was public, if you put out an ad like that on TV, it was plausible that
00:33:28.320 the other side would mobilize and say, this isn't true.
00:33:33.520 Here's how to do this, right?
00:33:34.960 It's all possible that, you know, we could have this contestation.
00:33:39.400 And if you go back to the idea of the public sphere, right, it was never as, you know,
00:33:46.160 nice and as clean as the Habermasian version of it, where people are just having reasoned discussions
00:33:53.160 regardless of who they are and their status.
00:33:56.340 But it was really sort of, at least in ideal, we would have this world.
00:34:03.960 Right now, it's gone exactly in the opposite direction.
00:34:09.420 Instead of sort of wishing to persuade us like that and only having baseball bats to act
00:34:15.260 with, they have scalpels that they can use to get at us one by one, right?
00:34:22.760 So instead of baseball bats that would both provoke a reaction and weren't as effective,
00:34:27.840 they have quiet scalpels that they can do this with without provoking the reaction,
00:34:33.040 without being public, and without sort of having us be able to oppose it.
00:34:39.160 And so that's kind of my worry is that, yes, we have antecedents of this as we have everything,
00:34:47.040 but it's now effective.
00:34:49.180 And it's also asymmetric.
00:34:50.780 I don't ever see what data they have on me.
00:34:54.020 I don't ever see how they're trying to push my buttons, right?
00:34:59.520 I don't have any meta idea of, like, I don't have perspective.
00:35:04.100 And I don't have defenses against it because if it was, you know, if I had defenses against it,
00:35:10.300 let me liken it this way.
00:35:11.680 When movies first came out, people ran away when they saw a train coming at them on the screen,
00:35:19.660 right?
00:35:20.580 Now, right now, if you see a movie, you know, and there's a train or a car coming at you,
00:35:24.280 you don't even flinch, right?
00:35:25.500 You know, it's a movie screen and nothing's coming at you.
00:35:28.280 For the ordinary person, it was perfectly understandable to be scared of this new phenomenon
00:35:33.960 and not understand how to deal with because it wasn't, you know, it was just so novel.
00:35:38.900 And if you look at the early history of moviemaking,
00:35:42.340 you see that it was greatly intertwined with extreme, violent, racist, fascist ideologies.
00:35:51.180 If you look at people like, say, Leni Riefenstahl, this German filmmaker, actress,
00:35:58.260 who was great behind the camera.
00:35:59.960 She invented a lot of the shots.
00:36:01.400 If you watch ESPN, she probably invented half their shots,
00:36:04.180 first covering the Berlin Olympics for Hitler.
00:36:07.440 But that craft got adopted into authoritarianism
00:36:12.780 because it was very impressive and very effective in persuading the masses
00:36:16.960 in ways that aren't as apparent to us now
00:36:21.420 because we kind of got used to the format
00:36:23.280 and we have a lot more cynicism and defenses against the format.
00:36:27.240 So that's where I think we are with these sort of dark technologies
00:36:30.720 asymmetrically aimed at persuasion and manipulation
00:36:33.960 is that we don't really understand their power.
00:36:36.380 We don't get to see it.
00:36:38.280 It's all private data.
00:36:39.660 Like, so we don't get to see it: Facebook knows what happened last election, and it's
00:36:42.740 not telling anyone, not letting any independent researchers kind of audit it.
00:36:48.360 And we don't have a way to defend ourselves against it.
00:36:53.140 And people will say, you know, I'm not manipulated.
00:36:55.580 I'm not manipulated.
00:36:56.480 And everybody thinks that, but, you know, we're all people.
00:36:59.100 We're all persuadable in particular ways.
00:37:02.200 And if there's a science and a craft of doing it with massive surveillance of us
00:37:06.680 and testing of us and finding the exact way,
00:37:09.060 we're all going to be vulnerable.
00:37:11.060 And I think that's where we are is that, in fact, if you look at it,
00:37:16.840 Facebook's business model is telling advertisers and political campaigns
00:37:21.320 that it's a great platform for persuading people.
00:37:24.640 And it's telling us it's a lousy platform.
00:37:28.400 It won't change any minds.
00:37:29.520 It's just us.
00:37:30.700 And, like, both of those things can happen at the same time.
00:37:34.700 And I think it works to a degree.
00:37:37.160 And I think we need to sort of really think about how do we deal with this new threat to
00:37:44.900 free conversation that is not so asymmetrically controlled.
00:37:50.400 Well, listen, with 74,000 tweets, Zeynep, I would say the AIs have already gotten to you.
00:37:57.100 You might have a problem.
00:37:59.020 I'll just point them at you.
00:38:01.240 When they come for me, I'll say it was them, it was them.
00:38:04.040 I think I only have 6,000.
00:38:05.760 Well, yeah.
00:38:06.460 So the thing is, they probably have my number in terms of what kind of a person I am,
00:38:12.620 a lot of things.
00:38:14.340 Although, on the other hand, I study these things a lot.
00:38:18.160 So I'm always watching, like, every time I'm advertised, every time there's a dynamic
00:38:22.520 change, every time something happens, I'm constantly trying to probe and get at it.
00:38:27.420 And despite that, I wouldn't trust myself to be immune to it at all.
00:38:34.200 And that's the reason, I mean, there's a strong reason to construct, for example, I think places
00:38:40.060 for children that are free of advertisements directed at them.
00:38:43.700 I think children don't have yet, like, especially younger children, don't have the way to assess
00:38:51.940 the credibility.
00:38:53.160 And it's something that part of, you know, parenting is to teach them how to assess manipulative
00:38:58.940 messages directed at them.
00:39:01.500 So it starts from protecting them to educating them.
00:39:04.760 And hopefully, by the time they're out in the world on their own, they realize manipulative
00:39:10.400 messaging.
00:39:11.340 And I feel like it's the same thing, except this is on steroids.
00:39:14.880 This is much more effective, much more data-based, much more empirically strong and machine learning
00:39:20.780 based ways of manipulating us that we don't yet have means to defend ourselves properly because
00:39:30.300 we don't even have a full picture of what's going on.
00:39:33.760 The degree to which our economy depends on advertising, in particular the digital economy, it's really
00:39:40.720 stunning.
00:39:42.540 And most people are fairly oblivious to the downside, apart from not liking some annoying
00:39:50.480 ads, but they don't see how the incentives get aligned perversely.
00:39:55.480 Absolutely, absolutely.
00:39:56.480 If you'd like to continue listening to this conversation, you'll need to subscribe at
00:40:07.720 SamHarris.org.
00:40:09.220 Once you do, you'll get access to all full-length episodes of the Making Sense podcast, along
00:40:13.620 with other subscriber-only content, including bonus episodes and AMAs, and the conversations
00:40:19.120 I've been having on the Waking Up app.
00:40:21.240 The Making Sense podcast is ad-free and relies entirely on listener support.
00:40:25.700 And you can subscribe now at SamHarris.org.
00:40:28.500 Thank you.