Making Sense - Sam Harris - October 17, 2020


#220 - The Information Apocalypse


Episode Stats

Length

47 minutes

Words per Minute

154.3

Word Count

7,381

Sentence Count

335

Misogynist Sentences

1

Hate Speech Sentences

15


Summary

Nina Schick is an author and broadcaster who specializes in how technology and artificial intelligence are reshaping society. She has advised global leaders, including Joe Biden, is a regular contributor to Bloomberg, Sky, CNN, and the BBC, speaks seven languages, and holds degrees from Cambridge University and University College London. Her new book, Deep Fakes, explores the terrain covered in this conversation: the epidemic of misinformation and disinformation in our society now, and the coming problem of deep fakes, which is, when you imagine it in detail, fairly alarming. Sam and Nina discuss her background, the history of Russian active measures against the West, the weaponization of the migrant crisis in Europe, Russian targeting of the African American community, Trump and the rise of political cynicism, QAnon, the prospect of violence surrounding the presidential election, and other topics. Full episodes of the podcast are available to subscribers at samharris.org.


Transcript

00:00:00.000 Welcome to the Making Sense Podcast.
00:00:08.340 This is Sam Harris.
00:00:10.400 Just a note to say that if you're hearing this, you are not currently on our subscriber
00:00:14.280 feed and will only be hearing partial episodes of the podcast.
00:00:18.340 If you'd like access to full episodes, you'll need to subscribe at samharris.org.
00:00:22.980 There you'll find our private RSS feed to add to your favorite podcatcher, along with
00:00:27.580 other subscriber-only content.
00:00:30.000 And as always, I never want money to be the reason why someone can't listen to the podcast.
00:00:34.980 So if you can't afford a subscription, there's an option at samharris.org to request a free
00:00:39.480 account, and we grant 100% of those requests.
00:00:42.720 No questions asked.
00:00:46.400 Welcome to the Making Sense Podcast.
00:00:48.840 This is Sam Harris.
00:00:51.820 Okay, no housekeeping today.
00:00:54.740 Today I'm speaking with Nina Schick.
00:00:56.980 Nina is an author and broadcaster who specializes in how technology and artificial intelligence
00:01:03.960 are reshaping society.
00:01:06.060 She has advised global leaders in many countries, including Joe Biden, and she's a regular contributor
00:01:12.700 to Bloomberg, Sky, CNN, and the BBC.
00:01:17.400 Nina speaks seven languages and holds degrees from Cambridge University and University College
00:01:23.900 London, and her new book is Deep Fakes, which explores the terrain we're about to discuss.
00:01:30.920 We talk about the epidemic of misinformation and disinformation in our society now and the
00:01:37.020 coming problem of deep fakes, which is, when you imagine it in detail, fairly alarming.
00:01:43.300 We get into the history of Russian active measures against the West, the weaponization of the migrant
00:01:49.560 crisis in Europe, Russian targeting of the African American community, Trump and the rise of political
00:01:56.320 cynicism, QAnon, the prospect of violence surrounding the presidential election, and other topics.
00:02:04.000 Anyway, this is all scary stuff, but Nina is a great guide through this wilderness.
00:02:11.680 And now I bring you Nina Schick.
00:02:18.900 I am here with Nina Schick.
00:02:20.940 Nina, thank you for joining me.
00:02:22.940 Thanks for having me, Sam.
00:02:24.460 We have a lot to talk about.
00:02:26.300 Yeah, you have a very interesting background, which I think suggests many common interests
00:02:32.000 and kind of overlapping life trajectories.
00:02:35.940 I don't think we're going to be able to get into that because you have produced so many
00:02:40.140 urgent matters in your recent book that we need to talk about.
00:02:44.000 But to get started here, what is your background, personally, but also just what you're focusing
00:02:50.080 on these days that gives you an expertise on the topics we're going to talk about?
00:02:54.980 Well, it's a really interesting and crazy story, one that could only happen in the 21st century.
00:03:01.940 I'm half German and I'm half Nepalese.
00:03:04.520 My father was a German criminal defense lawyer who in the 70s decided, you know, he was going
00:03:10.100 to seek spirituality and travel east and took his car, threw in a few books and did that
00:03:15.940 big journey that a lot of young people did back in the 70s through Afghanistan, India, and
00:03:21.080 then ended up in Nepal, which at this time was still this hermetic kingdom, fell in love
00:03:26.220 with it and met my mother there briefly after a decade or so.
00:03:31.760 And basically, my mother came from this totally different universe.
00:03:35.460 She grew up in Nepal as a member of this community in a Himalayan tribe, had no running
00:03:41.960 water, electricity, shoes when she was growing up.
00:03:44.820 And because she met my father, you know, they fell in love and they kind of decided to have
00:03:48.680 us, my brother and myself.
00:03:50.540 And I grew up in Kathmandu in the 80s and the 90s.
00:03:53.440 And then eventually I came to the UK to go to university and I went to Cambridge and UCL.
00:04:00.220 And my kind of discipline is really in history and politics.
00:04:03.640 I've always been fascinated by history and politics.
00:04:06.500 And especially at this time when the geopolitical sands seem to be shifting in such a dramatic
00:04:11.420 way.
00:04:12.180 So my career over the last 10 years has really been working at the heart of Westminster as
00:04:17.860 a policy analyst, a journalist, and an advisor on some of the key geopolitical shifts around
00:04:23.680 the European Union.
00:04:24.580 So this includes what happened with Russia and the invasion of Ukraine in 2014.
00:04:30.380 Subsequently, the EU's migrant crisis in 2015.
00:04:35.300 Then, obviously, I was very tied into the work here in the UK around Brexit.
00:04:39.840 I was helping to advise the government on that in 2016.
00:04:43.060 Then, of course, the election of Trump in 2016.
00:04:46.420 Then I went on to advise Emmanuel Macron's campaign, which was also interestingly hacked
00:04:51.780 by the Russians.
00:04:52.520 And finally, I got to a point in 2018 where I was working with the former NATO Secretary General
00:04:58.800 and he convened a group of global leaders, which included Joe Biden.
00:05:03.700 And he wanted to look at how the 2020 election might be impacted by what we had seen in 2016
00:05:13.220 and how the new kind of threats were emerging.
00:05:16.340 And this is really where I came to deepfakes.
00:05:18.420 And that is really the starting point for my book.
00:05:21.460 So I have this background in geopolitics, politics, information warfare.
00:05:26.020 And my area of interest is really how the exponential changes in technology,
00:05:31.580 and particularly in AI, are rewriting not only politics, but society at large as well.
00:05:37.800 So, yeah, you are a citizen of the world.
00:05:39.540 I mean, that's quite amazing.
00:05:41.040 Did you grow up speaking Nepali and German?
00:05:44.440 Yeah.
00:05:44.800 I mean, I grew up with four languages.
00:05:47.500 So, Nepali, German, Tamang, because my mother is from an ethnic minority group in Nepal,
00:05:55.040 which actually is closely related to Tibetans.
00:05:59.180 So, Tamang is a completely different language.
00:06:01.560 So, Nepali, German, Tamang, and Hindi, because everybody in Nepal speaks Hindi.
00:06:06.060 India is the big brother on the border.
00:06:08.300 So, that was, you know, something I wish I could give my daughter as well.
00:06:12.740 Well, I live in the UK now, and most people in the UK, you know, we speak English.
00:06:18.020 That's it.
00:06:18.720 Yeah, all too well, I can hear.
00:06:20.220 So, your English betrays none of that colorful backstory.
00:06:24.720 It's quite amazing.
00:06:26.180 So, yeah, I know we have common interests in the kinds of things that brought your father
00:06:32.460 to Nepal in the first place and meditation and forming a philosophy of life that is aimed
00:06:38.500 at deeper levels of well-being than is often attained by people.
00:06:43.100 But we have such a colossal mess to clean up in our society now with how our information
00:06:49.780 ecosystem has been polluted and deranged that I think we're just going to do another
00:06:55.520 podcast on the happy talk of what we could share when we get past these increasingly terrifying
00:07:04.260 dangers and self-inflicted wounds.
00:07:07.240 I mean, it's really, it's amazing to see how much of this is our own doing, and we'll
00:07:13.980 talk about bad actors and people who are consciously using our technology against us to really destroy
00:07:20.600 the possibility of living in an open society, but so much of this is a matter of our entertaining
00:07:28.100 ourselves into a kind of collective madness and what seems like it could be a coming social
00:07:35.000 collapse.
00:07:36.040 I realize that if you're not in touch with these trends, you know, if anyone in the audience
00:07:40.860 who isn't, this kind of language coming from me or anyone else can sound hyperbolic, but
00:07:47.540 we're really going over some kind of precipice here with respect to our ability to understand
00:07:54.380 what's going on in the world and to converge on a common picture of a shared reality, because
00:08:00.520 we're in the midst of an information war, and it's being waged against democratic societies
00:08:06.040 by adversaries like Russia and China, but it's also a civil war that's being waged by
00:08:12.220 factions within our society, and there are various political cults, and then there's the president
00:08:17.940 of the United States himself.
00:08:20.100 All of this is happening on the back of and facilitating an utter collapse of trust in
00:08:27.500 institutions and a global decline in democracy, and again, we've built the very tools of our
00:08:35.280 derangement ourselves, and in particular, I'm talking about social media here.
00:08:39.500 Yeah, so your book goes into this, and it's organized around this new piece of technology
00:08:44.720 that we call deep fakes, and the book is Deep Fakes: The Coming Infocalypse, which, that's
00:08:52.500 not your coinage.
00:08:53.460 On the page, it's very easy to parse.
00:08:56.060 When you say it, it's hard to understand what's being said there, but it's really, you're talking
00:08:59.860 about an information apocalypse.
00:09:01.560 Just remind people what deep fakes are, and suggest what's at stake here in terms of how
00:09:08.640 difficult it could be to make sense of our world in the presence of this technology.
00:09:13.220 Yes, absolutely.
00:09:14.160 So a deep fake is a type of synthetic media, and what synthetic media essentially is, is
00:09:21.100 any type of media.
00:09:22.300 It can be an image, it can be a video, it can be a text that is generated by AI.
00:09:29.280 And this ability of AI to generate fake or synthetic media is really, really nascent.
00:09:36.080 We're only at the very, very beginning of the synthetic media revolution.
00:09:40.720 It was only probably in about the last four or five years that this has been possible,
00:09:46.920 and for the last two years that we've been seeing how the real world applications of this
00:09:52.340 have been leeching out from beyond the AI research community.
00:09:55.340 So the first thing to say about synthetic media is that it is completely going to transform
00:10:01.120 how we perceive the world.
00:10:03.760 Because in future, all media is going to be synthetic, because it means that anybody can
00:10:10.920 create content to a degree of fidelity that is only possible for Hollywood studios right
00:10:17.820 now, right?
00:10:18.540 And they can do this for little to no cost using apps or software, various interfaces,
00:10:25.520 which will make it so accessible to anyone.
00:10:29.560 And the reason why this is so interesting, another reason why synthetic media is so interesting
00:10:35.100 is until now, the best kind of computer effects, CGI, you still can't quite get humans right.
00:10:43.400 So when you use CGI to do effects where you're trying to create robotic humans, it still
00:10:48.820 doesn't look quite right. It's called, you know, the uncanny valley.
00:10:51.640 But it turns out that AI, when you train your machine learning systems with enough data,
00:10:56.980 they're really, really good at generating fake humans or synthetic humans, both in images.
00:11:02.940 I mean, and when it comes to generating fake human faces, so images, still images, it's already
00:11:09.400 perfected that.
00:11:10.240 And if you want to kind of test that, you can go and look at thispersondoesnotexist.com.
00:11:14.620 Every time you refresh the page, you'll see a new human face that, to the human eye, to
00:11:19.180 you or me, Sam, will look at that and will think that's an authentic human, whereas that
00:11:24.380 is just something that's generated by AI, that human literally doesn't exist.
00:11:28.580 And also now increasingly in other types of media like audio and film.
00:11:35.200 So I could take essentially a clip of a recording with you, Sam, and I could use that to train
00:11:42.260 my machine learning system.
00:11:43.480 And then I can synthesize your voice.
00:11:45.160 So I can literally hijack your biometrics.
00:11:48.300 I can take your voice, synthesize it, get my AI kind of machine learning system to recreate
00:11:53.480 that.
00:11:53.820 I can do the same with your digital likeness.
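To make the voice-synthesis step concrete, here is a minimal sketch of the kind of voice cloning Nina describes, using the open-source Coqui TTS library's zero-shot cloning model. The model name, file paths, and reference clip are illustrative assumptions rather than anything from the episode, and serious voice cloning typically involves far more target audio and fine-tuning.

```python
# Illustrative sketch only: condition a pretrained model on a short reference
# clip of a target speaker and synthesize speech they never recorded.
# Assumes the open-source Coqui TTS package (pip install TTS); file names are hypothetical.
from TTS.api import TTS

# Load a pretrained multilingual, multi-speaker synthesis model.
tts = TTS("tts_models/multilingual/multi-dataset/xtts_v2")

tts.tts_to_file(
    text="This is a sentence the target speaker never actually said.",
    speaker_wav="target_speaker_sample.wav",  # a few seconds of the target's voice
    language="en",
    file_path="cloned_output.wav",
)
```

The point of the sketch is how little is required: one pretrained checkpoint, a short audio sample, and a few lines of code.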
00:11:56.920 Obviously, this is going to have tremendous commercial applications.
00:12:01.660 Entire industries are going to be transformed.
00:12:03.440 For example, corporate communications, advertising, the future of all movies, video games.
00:12:12.840 But this is also the most potent form of mis- and disinformation, which you're democratizing
00:12:19.640 for almost anyone in the world at a time when our information ecosystem has already become
00:12:25.120 increasingly dangerous and corrupt.
00:12:27.240 So the first thing I'd say about synthetic media is it is actually just heralding this
00:12:33.800 tremendous revolution in the way that we communicate.
00:12:36.760 The second thing I'd say is that it's coming at a time when we've had lots of changes in our
00:12:42.440 information ecosystem over the past 30 years,
00:12:44.620 you know, that society hasn't been able to keep up with, from the internet to social media
00:12:48.660 to smartphones.
00:12:49.200 And this is just the next step in that.
00:12:51.740 And then the final thing, this is where I come to deep fakes, is that this field is still
00:12:56.920 so nascent and emerging that the taxonomy around it is completely undecided yet.
00:13:02.220 And as I already kind of pointed out or touched upon, there will be legitimate use cases for
00:13:07.020 synthetic media.
00:13:07.780 And this is one of the reasons why this cat is out of the bag.
00:13:11.640 There's no way we're putting it back in because there's so much investment in the kind of
00:13:16.180 commercial use cases ever since, I think there's almost 200 companies now that are working
00:13:22.400 exclusively on generating synthetic media.
00:13:24.960 So we have to distinguish between the legitimate use cases of synthetic media and how we draw
00:13:30.500 the line.
00:13:31.220 So, very broad brush, in my book I say that the use and intent behind synthetic media really
00:13:38.320 matters in how we define it.
00:13:39.600 So I refer to deep fake as when a piece of synthetic media is used as a piece of mis or
00:13:45.560 disinformation.
00:13:46.440 And, you know, there is so much more that you could delve into there with regards to the
00:13:50.880 kind of the ethical implications and the taxonomy.
00:13:53.020 But broadly speaking, that's how I define it.
00:13:55.980 And that's my definition between synthetic media and deep fakes.
00:14:00.700 Well, so as you point out, all of this would be good, clean fun if it weren't for the fact
00:14:06.900 that we know there are people intent upon spreading misinformation and disinformation and doing
00:14:13.980 it with a truly sinister political purpose.
00:14:17.740 I mean, not just for amusement, although that can be harmful enough.
00:14:22.320 It's something that state actors and people internal to various states are going to leverage
00:14:30.860 to further divide society from itself and increase political polarization.
00:14:38.660 But it's amazing that it is so promising in the fun department that we can't possibly even
00:14:47.640 contemplate putting this cat back in the bag.
00:14:50.260 I mean, it's just that's the problem we're seeing on all fronts.
00:14:54.240 I mean, so it is with social media.
00:14:56.600 So it is with the ad revenue model that is selecting for so many of its harmful effects.
00:15:02.680 I mean, we just can't break the spell wherein people want the cheapest, most fun media and
00:15:09.640 they want it endlessly.
00:15:10.420 And yet the harms that are accruing are so large that it's amazing just to see
00:15:18.140 that there's just no handhold here whereby we can resist our slide toward the
00:15:23.780 precipice.
00:15:24.400 Just to underscore how quickly this technology is developing, in your book you point out what
00:15:30.740 happened once Martin Scorsese released his film, The Irishman, which had this exceedingly
00:15:37.920 expensive and laborious process of trying to de-age its principal actors, Robert De Niro and
00:15:44.580 Joe Pesci.
00:15:45.840 And that was met with something like derision for the imperfection of what was achieved
00:15:53.360 there.
00:15:54.100 Again, at great cost.
00:15:55.220 And then very, very quickly, someone on YouTube using free software did a nearly perfect de-aging
00:16:04.740 of the same film.
00:16:06.580 It's just amazing what's happening here.
00:16:09.300 And again, these tools are going to be free, right?
00:16:12.280 I mean, they're already free.
00:16:13.360 And ultimately, the best tools will be free.
00:16:17.660 Absolutely.
00:16:18.600 So you already have various kind of software platforms online.
00:16:23.920 So the barriers to entry have come down tremendously right now.
00:16:28.380 If you wanted to make a convincing deepfake video, you would still need to have some knowledge,
00:16:34.040 some knowledge of machine learning, but you wouldn't have to be an AI expert by any means.
00:16:38.160 But already now we have apps that allow people to do certain things like swap their faces into
00:16:44.520 scenes.
00:16:44.900 For example, we face, I don't know if you've come across that app.
00:16:48.480 I don't know how old your children are.
00:16:50.240 But if you have a teenager, you've probably come across it, you can basically put your own face
00:16:56.300 into a popular scene from a film like Titanic or something.
00:17:00.000 This is using the power of synthetic media.
00:17:04.280 But experts who I speak to on the generation side, because it's so hugely exciting to people
00:17:09.820 who are generating synthetic media think that by the end of the decade, any YouTuber, any
00:17:17.220 teenager will have the ability to create special effects in film that are better than anything
00:17:23.500 a Hollywood studio can do now.
00:17:25.020 And that's really why I put that anecdote about the Irishman into the book, because it
00:17:29.940 just demonstrates the power of synthetic media.
00:17:32.740 I mean, Scorsese was working on this project from 2015.
00:17:36.020 He filmed with a special three-camera rig.
00:17:38.940 He had the best special effects artists, post-production work, a multi-million dollar budget, and still
00:17:45.320 the effect at the end wasn't that convincing.
00:17:47.860 It didn't look quite right.
00:17:48.800 And now one YouTuber, using free software, takes a clip from Scorsese's film in 2020.
00:17:55.580 So Scorsese's film came out in 2019.
00:17:57.740 This year, he can already create something that, when you look at it, looks
00:18:03.620 far more realistic than what Scorsese did.
00:18:05.880 This is just in the realm of video.
00:18:08.420 As I already mentioned, with images, it can already do it perfectly.
00:18:13.660 There is also the case of audio.
00:18:16.980 There is another YouTuber, for example, who, because a lot of the kind of early pieces
00:18:23.100 of synthetic media have sprung up on YouTube.
00:18:25.620 There is a YouTuber called Vocal Synthesis, who uses an open-source AI model
00:18:33.040 trained on celebrities' voices.
00:18:35.260 So he can, something that he's done that's gotten many, many views on YouTube is he's literally
00:18:40.280 taken audio clips of dead presidents and then made them rap N.W.A.'s Fuck the Police, right?
00:18:47.560 Ronald Reagan, FDR.
00:18:50.120 Very interestingly, this is an indicator of how complex these challenges are going to
00:18:56.960 be to navigate in future.
00:18:58.360 Because another thing that he did was he took Jay-Z's voice and made him rap, recite Shakespeare's
00:19:07.620 to be or not to be.
00:19:09.000 And interestingly, Jay-Z's record label filed a copyright infringement claim against him
00:19:14.000 and made him kind of take it down.
00:19:15.800 But this is really just a forebear of the kind of battles we're going to see when any anonymous
00:19:22.560 user can take your likeness, can take your biometrics and make you say or do
00:19:29.300 things that you never did.
00:19:30.240 And of course, this is disastrous to any liberal democratic model, because in a world where
00:19:37.480 anything can be faked, everyone becomes a target.
00:19:40.960 But even more than that, if anything can be faked, including evidence that we today see
00:19:47.020 as an extension of our own reality, and I say evidence in quotation marks, video, film,
00:19:52.580 audio, then everything can also be denied.
00:19:55.580 So the very basis of what is reality starts to become corroded.
00:20:01.720 Of course, reality itself remains.
00:20:03.880 It's just that our perception of reality starts to become increasingly clouded.
00:20:09.740 So what are we going to do about this?
00:20:11.220 Again, we're going to get into all of the evidence of just how aggressively this will be
00:20:17.220 used, given everything else that's been happening in our world.
00:20:21.080 We'll talk about Russia and Trump and QAnon and other problems here.
00:20:26.580 But many of us can dimly remember, what feels like 20 years ago now, before COVID, when the Billy Bush audio tape dropped
00:20:35.160 and Trump sort of attempted to deny that the audio of him on the bus was real.
00:20:42.420 But we were not yet in the presence of such widespread use of deepfake technology that anyone
00:20:51.720 was even tempted to believe him.
00:20:54.060 We knew the audio was real.
00:20:56.460 Now, apparently it didn't matter, given how corrupted our sense of everything had become
00:21:02.760 by that point politically.
00:21:04.440 But we could see the resort to claiming fakery that will be relied upon by everyone and anyone
00:21:14.740 who is committed to lying.
00:21:17.780 Because there'll be so much of it around that really it will only be charitable to extend
00:21:23.600 the benefit of the doubt to people who say, listen, that wasn't me.
00:21:27.140 That's just a perfect simulacrum of my voice and even my face.
00:21:31.360 But you actually can't believe your eyes and ears at this point.
00:21:35.200 I would never say such a thing.
00:21:37.100 In any of your conversations with experts on this topic, are any of them hopeful that
00:21:42.360 we will be able to figure out how to put a watermark on digital media in such a way that
00:21:49.020 we will understand its provenance and be able to get to ground truth when it matters?
00:21:54.720 So I think the problem of what we do about it is so huge that ultimately we can only fight
00:22:03.220 the corroding information ecosystem by building society-wide resilience.
00:22:07.900 But the solutions, if you want to term it that way, broadly fit into two categories.
00:22:13.700 The first are the kind of technical solutions.
00:22:16.180 So because synthetic media is going to become ubiquitous, and we as humans will not be able
00:22:24.000 to discern because of the fidelity, the quality, whether it's real or fake.
00:22:28.960 So you can't rely on digital forensics in the sense that somebody goes through and clicks
00:22:34.820 and looks at each media and decides, oh, are the eyes blinking correctly?
00:22:38.620 Do the ears look a little bit blurred?
00:22:40.340 Because these are what we do now, right?
00:22:43.140 Because the generation side of synthetic media is still so nascent.
00:22:47.480 So we're not going to be able to do that.
00:22:49.320 Second, the sheer volume when you talk about at the scale at which you can generate synthetic
00:22:55.780 media means that humans are never going to be able to go through it all, never going to
00:22:59.940 be able to fact check each piece of media.
00:23:02.460 So we have to rely on building the AI software to detect, for example, deep fakes.
00:23:09.300 And right now, there is an interest, and increasingly, there are certain experts and groups who are
00:23:16.100 putting money into being able to detect deep fakes.
00:23:19.500 However, the problem is, because of the adversarial nature of the AI and the way that it's trained,
00:23:25.840 every time you build a detector that's good enough to detect the fake, the generation model
00:23:34.740 can also become stronger.
00:23:36.120 So you're in this never-ending game of cat and mouse, where, you know, you keep on having
00:23:42.120 to build better detectors.
00:23:43.260 And also, given the various different models and ways in which the fakes can be generated,
00:23:49.780 there's never going to be a one-size-fits-all model.
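As a rough illustration of the detection side described here, the following sketch fine-tunes a pretrained image classifier to label video frames as real or synthetic. The folder layout, model choice, and hyperparameters are assumptions for the example, not a reference implementation of any particular detector.

```python
# Minimal sketch of a learned deepfake detector: fine-tune an ImageNet-pretrained
# backbone on a two-class dataset of real vs. synthetic frames.
# Assumes frames/real/*.jpg and frames/fake/*.jpg exist (hypothetical layout).
import torch
from torch import nn
from torchvision import datasets, models, transforms

preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

train_set = datasets.ImageFolder("frames", transform=preprocess)
loader = torch.utils.data.DataLoader(train_set, batch_size=32, shuffle=True)

# Reuse a pretrained backbone and replace the final layer with a 2-way head.
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, 2)

optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = nn.CrossEntropyLoss()

model.train()
for epoch in range(3):  # a real detector trains far longer, on far more varied data
    for images, labels in loader:
        optimizer.zero_grad()
        loss = loss_fn(model(images), labels)
        loss.backward()
        optimizer.step()
```

The cat-and-mouse dynamic follows directly from this setup: a detector trained against one family of generators tends to degrade on the next, so it has to be retrained as the generation models improve.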
00:23:53.080 There's a hypothetical question, which is open still in the AI research community, about whether
00:23:59.880 or not the fakes can become so sophisticated.
00:24:03.800 So we already know they're going to fool humans.
00:24:05.900 They already basically do.
00:24:07.340 But is there a point where the fakes become so sophisticated that even an AI detector
00:24:12.640 can never detect in the DNA of that fake that it's actually a piece of synthetic media?
00:24:17.620 We don't know yet, is the answer to that.
00:24:19.800 But I will say that there is far more research going into the generation side, because like
00:24:27.960 so much in terms of the information ecosystem, the architecture of the information ecosystem
00:24:33.420 and the information age, it has been driven by this almost utopian flawed vision of how these
00:24:38.740 technologies will be serving an unmitigated good for humanity without thinking about how they
00:24:44.100 might amplify the worst sides of human intention as well.
00:24:48.100 The second side, and you touched upon that, is building provenance architecture into the
00:24:54.400 information ecosystem.
00:24:56.000 So basically embedding right into the hardware of devices, whether that's a camera, a mobile
00:25:01.400 phone, the authenticity watermark to prove that that piece of media is authentic.
00:25:09.700 You can track it throughout its life to show that it hasn't been tampered with or edited.
00:25:13.320 And this is something that, for example, Adobe is working on with its Content
00:25:20.580 Authenticity Initiative.
00:25:22.940 So there are technical solutions underway, on both the detection and the
00:25:30.500 provenance side of the problem.
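As a toy illustration of the provenance idea, the sketch below signs a media file's bytes with a device-held key and verifies the signature later. Real provenance standards such as Adobe's Content Authenticity Initiative attach richer signed metadata (capture device, edit history) rather than a bare signature, and the file names and key handling here are invented for the example.

```python
# Toy provenance sketch: a capture device signs a media file at creation time,
# and anyone holding the device's public key can later check for tampering.
# File names are hypothetical; the key would really live in device hardware.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

device_key = Ed25519PrivateKey.generate()
public_key = device_key.public_key()

# "Capture": sign the raw bytes of the clip as it is recorded.
with open("captured_clip.mp4", "rb") as f:
    signature = device_key.sign(f.read())

# "Verification", possibly much later and by a third party.
with open("captured_clip.mp4", "rb") as f:
    candidate = f.read()
try:
    public_key.verify(signature, candidate)
    print("Media matches what the device originally signed.")
except InvalidSignature:
    print("Media was edited or did not come from this device.")
```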
00:25:31.860 However, ultimately, this is a human problem to the extent that disinformation or bad information
00:25:40.740 didn't just come about at the turn of the millennium.
00:25:45.560 It's just that we have never seen it at this scale.
00:25:49.120 We have never seen it this potent, and we have never, ever had it as
00:25:54.380 accessible as it is now.
00:25:56.200 So ultimately, this is a human problem.
00:25:59.160 There's no way we can deal with the challenges of our corroding information ecosystem without
00:26:03.440 talking about human, quote unquote, solutions.
00:26:06.640 How do we prepare society for this new reality?
00:26:10.000 And we are way behind.
00:26:11.660 We're always reactive.
00:26:12.980 Our reactions are always piecemeal.
00:26:15.100 And the biggest problem is the information ecosystem has become corrupt to the extent that
00:26:20.760 we can't even identify what the real risks are, right?
00:26:24.560 We're too busy fighting each other about other things without seeing what the real existential
00:26:28.840 risk is here.
00:26:30.640 Yeah, yeah.
00:26:31.020 I mean, that is itself a symptom of the problem, the fact that we can't even agree on
00:26:35.960 the nature of the problem.
00:26:37.720 There's so much disinformation in the air.
00:26:40.540 It makes me think that one solution to part of the problem, I don't think it captures all
00:26:46.320 of it, but certainly some of the most pressing parts of it could be solved if we had lie detection
00:26:53.180 technology that we could actually rely on.
00:26:56.240 Just imagine we had real-time lie detection, and you could go to the source, you know, if
00:27:01.340 some awful piece of audio emerged from me, and it purported to be a, you know, part of
00:27:07.980 my podcast, where I said something, you know, reputation-canceling, and I said, well, that's
00:27:13.580 a fake, that wasn't me, the only way to resolve that would be to tell whether I'm lying or
00:27:19.740 not.
00:27:20.180 We're forcing ourselves into a position where it's going to be a kind of emergency not to
00:27:27.780 be able to tell with real confidence whether or not somebody is lying.
00:27:32.420 So I think we're going to, in addition to the arms race between deep fakes and deep fake
00:27:38.340 identifying AI, I think this could inspire a lie detection arms race, because there's
00:27:45.380 so many other reasons why we would want to be able to detect people who are lying.
00:27:49.500 Having just watched the presidential and vice presidential debates in America, one could
00:27:54.440 see the utility of having a red light go off over someone's head when he or she knows
00:28:00.480 that he or she is lying.
00:28:01.860 But if we can't trust people, and we can't trust the evidence of our senses when we have
00:28:08.620 media of them saying and doing things convincingly delivered to us in torrents, it's hard to see
00:28:15.920 how we don't drift off into some horrifically dystopian dream world of our own confection.
00:28:23.900 Absolutely, and this is really why, you know, I wrote the book.
00:28:27.600 I wrote it in a way that was very accessible to anyone to pick up and zoom through in an afternoon,
00:28:33.900 because I think without this conceptual framework, where we can connect everything from Russian
00:28:41.820 disinformation to the increasingly partisan political divide in the United States, but
00:28:49.500 also around the rest of the Western world, and understanding how now,
00:28:55.280 with the age of synthetic media upon us, how our entire perception of the world is going
00:29:02.360 to be changed in a way that is completely unprecedented, how we can be manipulated in the age of information
00:29:10.400 where we had assumed that once we have access to this much information, that, you know, surely
00:29:17.540 progress is inevitable.
00:29:19.620 But to actually understand how the information ecosystem itself has become corrupt, I think
00:29:24.800 is the first step.
00:29:27.100 And to be honest with you, I do tend to think that things will probably get worse before they
00:29:33.620 get better.
00:29:34.320 And I think the US election is a great case study of that, because it's almost no matter the
00:29:40.560 outcome, right?
00:29:41.420 Let's say that Trump loses, and he loses by large margin, you know, that he could still
00:29:49.460 refuse to go, even if the Secret Service will come and, you know, take his bags and ask him,
00:29:54.360 please, Mr. Trump, there's the door.
00:29:56.500 He has this influence now where a lot of his followers genuinely believe that he is, you know,
00:30:03.920 the, this kind of savior of America.
00:30:08.060 And if he asks them to take arms and take to the streets, I mean, this is literally already
00:30:12.640 happening right now, right?
00:30:13.680 You have armed insurrection, militia kind of patrolling the streets of the United States
00:30:17.920 on both the left and the right for their political grievances.
00:30:22.460 So if Biden wins, let's say Trump goes quietly and Biden wins, well, then you still haven't
00:30:27.820 addressed the bigger problem of the infocalypse, where the information ecosystem has become so corrupt
00:30:33.600 and so corroded and the synthetic media revolution is still upon us.
00:30:37.520 So I, okay, I'm hopeful that we still have time to address this, because like I said, this
00:30:43.000 technology is so nascent, we can still try to take some kind of action in terms of what's
00:30:49.240 the ethical framework?
00:30:50.680 How are we going to adjudicate the use of synthetic media?
00:30:54.220 How can we digitally educate the public about the risks of synthetic media?
00:30:58.900 But it is a ticking time bomb, and the window is short.
00:31:03.600 As if to underscore your last point, at the time we're speaking here, there's a headline
00:31:09.100 now circulating that 13 men were just arrested, including seven members of a right-wing militia
00:31:15.860 plotting to kidnap the Democratic governor of Michigan, Gretchen Whitmer, for the purposes
00:31:22.060 of inciting a civil war.
00:31:23.940 One can only imagine the kind of information diet of these militia members.
00:31:29.320 But this is the kind of thing that gets engineered by crazy information and pseudo-facts being
00:31:35.740 spread on social media.
00:31:37.000 And this is the kind of thing that, even when delivered by a mainstream news channel, one
00:31:43.900 now has to pause and wonder whether or not it's even true, because there's been such a breakdown
00:31:50.500 of trust in journalism.
00:31:53.140 And there's so many cries of fake news, both cynical and increasingly real, that it's just
00:32:00.520 we're dealing with the circumstance of such informational pollution.
00:32:05.000 Let's talk about Russia's role in all of this, because Russia has a history of prosecuting what
00:32:11.500 they call active measures against us.
00:32:13.880 And we really have, for a long time, been in the midst of an information war, which is
00:32:19.300 essentially a psychological war.
00:32:22.780 And Russia is increasingly expert at exploiting the divisions in our society, especially racial
00:32:29.620 divisions.
00:32:30.160 So maybe you can summarize some of this history.
00:32:32.960 Yeah, I mean, I start my book with Russia because my career intersected a lot with what
00:32:42.100 Russia was doing in Ukraine in 2014 and the kind of information war they fought around
00:32:48.500 the annexation of Crimea and eastern Ukraine, where they basically denied that it was happening
00:32:53.580 at all.
00:32:54.560 And the same with the shooting down of MH17.
00:32:56.900 This was the Malaysian aircraft that was shot down over eastern Ukraine, which now has been
00:33:02.880 proven to have been carried out by Russian military forces.
00:33:06.000 But at the time, they were saying this had nothing to do with them and that this was pro-Russian
00:33:10.880 Ukrainian separatists who had shot down the airliner.
00:33:15.000 So what Russia did with information warfare around Ukraine, Crimea, around Europe in 2015, when
00:33:22.340 Putin and Assad stepped up their bombardment of civilians in Syria, unleashing this mass
00:33:29.560 migration, which basically led to the EU's migrant crisis five years ago.
00:33:33.900 I don't know if you remember those images of people just arriving at the shores, you know,
00:33:40.520 and some of them were refugees.
00:33:42.740 But as we now know, you know, there were also terrorists and economic migrants among them.
00:33:47.940 And how that almost tore Europe apart and the information war that Russia fought around those
00:33:55.700 events where they perpetrated these stories about, for example, girls in Germany who had
00:34:03.100 been raped by, supposedly raped by arriving migrants.
00:34:07.300 And stories like this legitimately did happen.
00:34:10.320 But this story was completely planted.
00:34:12.220 So it's, you know, blurring the line between what's real and fake.
00:34:16.220 But what was also very interesting for me was that I worked on or I studied and I worked
00:34:23.120 on the Russian information operations around the U.S. election in 2016.
00:34:27.460 And the first thing to say about that is, to me, we can see how corrupt
00:34:33.860 the information ecosystem has become to the extent that those information operations have
00:34:38.120 become a completely partisan event in America.
00:34:41.020 Some people say that Russia is behind everything and others deny that Russia did anything at
00:34:47.800 all.
00:34:48.180 And this is just nonsense.
00:34:50.000 You know, for sure, the Russians intervened in the 2016 election and they continue to intervene
00:34:55.640 in U.S. politics to this day.
00:34:58.260 And I suppose what was very interesting to me about what Russia was doing was how this information
00:35:05.700 warfare strategy, which is old and it goes all the way back to the Cold War, was becoming
00:35:11.700 increasingly potent with the weapons of this modern information ecosystem.
00:35:16.940 And one of those was social media.
00:35:19.840 What they did in Ukraine and then Europe around the migrant crisis and then around the U.S.
00:35:25.480 election was influence operations on social media where they actually posed in the case of
00:35:31.700 the United States as authentic Americans.
00:35:35.260 And then they over years, by the way, this wasn't just them getting involved in the weeks
00:35:40.880 running up to the election.
00:35:41.940 They started their influence operations in the United States in 2013.
00:35:46.220 They built up these tribal communities on social media and built up, well, basically played
00:35:53.320 identity politics, built up their pride in their distinct identity.
00:35:58.700 And interestingly, this wasn't just Russians targeting, you know, right wing kind of Trump
00:36:07.020 supporters.
00:36:07.660 They did it across the political spectrum.
00:36:10.260 And as a matter of fact, they disproportionately focused on the African-American community.
00:36:15.040 So they built these fake groups, pages, communities, which they imbued with
00:36:23.580 pride in a distinct identity.
00:36:25.700 And then as we got closer to the election, those groups were then sporadically injected
00:36:30.760 with lots of political grievances, some of them legitimate, to make these groups feel
00:36:35.500 alienated from the mainstream.
00:36:37.120 And again, the primary focus of their influence operations on social media was the African-American
00:36:43.160 community who they were basically targeting so that they felt so disenfranchised and disconnected
00:36:49.100 from Hillary, America at large, that they wouldn't go and vote in the election, right?
00:36:54.000 And what has happened now, four years later, is that those operations are still ongoing,
00:37:00.240 but they've become far more sophisticated.
00:37:02.320 So in 2016, it might have been a troll farm in St. Petersburg.
00:37:06.720 But in 2020, one operation from earlier this year, which was revealed through a joint investigation
00:37:14.580 by CNN, Twitter, and Facebook, was that the Russian agency, which is in charge
00:37:20.640 of the social media operations, it's called the Internet Research Agency, IRA, they had basically
00:37:25.900 outsourced their work to Ghana.
00:37:29.460 They had set up what looked ostensibly like a legitimate human rights organization.
00:37:34.900 They had hired employees in Ghana, real, authentic Ghanaians, and then told them, you know, you're
00:37:41.560 going to have to kind of post, build these groups and communities.
00:37:45.040 And here is basically the same memes, the same ideas that they had used in 2016.
00:37:52.620 They were basically recycling in 2020.
00:37:54.640 So I start with Russia, because what is really interesting is that their strategy of information
00:38:02.840 warfare is actually a phenomenon where they flood the zone with a lot of information,
00:38:10.860 bad information across the political spectrum.
00:38:13.260 So they're not just targeting, you know, Trump voters, for example.
00:38:16.500 And this chaos, this bad information, this chaotic information has the effect of what's called
00:38:22.520 censorship through noise.
00:38:25.820 So this chaotic, bad information overload gets to the point where we can't make decisions in
00:38:33.100 our own interest of protecting ourselves, our country, our community.
00:38:36.940 And that very spirit of information warfare has come to characterize the entire information
00:38:44.660 ecosystem.
00:38:45.240 I mean, I start with Russia, I map out how their tactics are far more potent, but you cannot
00:38:51.720 talk about the corrosion of the information ecosystem without recognizing that the same
00:38:56.900 chaotic spirit has come to imbue our homegrown debate as well.
00:39:03.460 So I actually think, you know, of course, the Russians are intervening in the US election in 2020.
00:39:08.660 What's also very interesting is that other rogue and authoritarian states around the world are looking
00:39:14.360 at what Russia is doing and copying them, China is becoming more like Russia.
00:39:19.040 But this is also happening at home.
00:39:21.840 And arguably, the domestic disinformation, misinformation and information disorder is far
00:39:28.800 more harmful than anything that foreign actors are doing.
00:39:31.140 Yeah, I want to cover some of that ground again, because it's easy not to understand at
00:39:38.620 first pass just how sinister and insidious this all is, because the fact that we can't agree
00:39:48.480 as a society that Russia interfered in the 2016 presidential election is one of the
00:39:56.060 greatest triumphs of the Russian interference in our information ecosystem, the fact that
00:40:02.660 that you have people on the left over ascribing to Russian influence causality, and you have people
00:40:10.180 on the right denying any interference in the first place, and the fact that each side can sleep soundly
00:40:17.620 at night convinced that the other side is totally wrong, that is itself a symptom of how polluted
00:40:24.140 our information space has become.
00:40:26.780 It's a kind of singularity on the landscape, where everything is now falling into it.
00:40:32.220 And it's happening based on the dynamics you just sketched out.
00:40:37.440 Where, if you mingle lies of any size and consequence with enough truths and half-truths, or, you know,
00:40:45.860 background facts that suggest a plausibility to these lies, or at least ensure you can't ever
00:40:52.320 ascertain what's true, it leads to a kind of epistemological breakdown and a cynicism that is the goal of
00:41:00.760 this entire enterprise. It's not merely to misinform people, which is to say, have them believe things
00:41:06.300 that are false. It is to break people's commitment to being informed at all, because they realize how
00:41:13.180 hopeless it is. And so we all just tune out and go about our lives being manipulated to who knows
00:41:19.300 what end. So, you know, some of the history, which you go through in your book, relates to
00:41:24.700 the fact that, you know, long ago, long before they had any tools, really, to work with,
00:41:29.820 you know, they certainly didn't have social media, the Russians planted the story that AIDS
00:41:35.760 was essentially a bioweapon cooked up in a US lab, you know, with the purpose of performing a
00:41:41.760 genocide on the black community. And they targeted the black community with this lie. And to this
00:41:48.260 day, you know, a disproportionate number of people in the black community in the US believe that AIDS
00:41:54.500 was made in a lab for the purpose of, you know, wiping out black people. But the reason why that was
00:42:00.060 so clever is because it has an air of plausibility to it, given the history of the Tuskegee experiments,
00:42:08.520 the syphilis experiments, where African Americans who had syphilis were studied and not given the
00:42:16.620 cure, even once the cure, penicillin emerged. They were then, you know, studied to the end of their
00:42:22.820 lives with what amounted to the ethical equivalent of the Nazi cold water experiments, trying to see
00:42:30.220 the effects of tertiary syphilis on people. And it was an absolutely appalling history. And it's in
00:42:36.880 the context of that history that you can make up new allegations that should seem patently insane,
00:42:45.700 they're so evil, but they don't seem patently insane, given the points of contact to a surrounding
00:42:53.380 reality that that is fact based. And so it is with, you know, the current leveraging of identity
00:43:00.340 politics in the U.S. where they create Black Lives Matter Facebook groups that are fake, and they can,
00:43:08.000 you know, I think there's, there was one protest in Times Square that had like 5,000 or 10,000 people
00:43:15.060 show up, and it was completely fake. I mean, the organizers were fake, you know, they were Russians,
00:43:20.940 there was no man on the ground who was actually a real leader of this thing. And people went to this
00:43:25.880 protest, never realizing that they were characters in somebody's dreamscape.
00:43:32.540 Absolutely. This is why it is so dastardly. And as you pointed out, the Russians, or even the
00:43:40.200 Soviets going back to the Cold War, very quickly identified that race relations is a sore point
00:43:45.280 for the United States. And they abused that to great effect. And Operation Infektion,
00:43:52.500 the lie that you already correctly pointed out, that the CIA invented the HIV virus as a way to
00:44:00.460 kill African Americans was something that in the 1980s took about 10 years to go viral. But when it
00:44:08.060 did, oh boy, did it grab a hold of the imagination, to the extent that it still poses a challenge when
00:44:14.540 you're trying to deal with HIV public health policy today, where you have communities,
00:44:19.660 African American communities, who disproportionately believe that the HIV virus is somehow connected to
00:44:25.820 a government plan to commit a genocide. And in 2016, I suppose, what happened is that the strategy was
00:44:36.040 the same, right? We want to play identity politics, we want to hit the United States where it hurts.
00:44:41.180 We know that race is the dividing factor. But in 2016, it became so much more powerful,
00:44:47.920 because Operation Infektion, the HIV lie, was a single lie. Whereas in 2016, and what's happening
00:44:54.440 in 2020 is numerous groups, communities, pages, where it's not only about spreading one lie,
00:45:00.840 but it's actually about entrenching tribal divisions, entrenching identity politics. And
00:45:07.680 in the context of what's happened in 2020, very interestingly, some of the other kind of
00:45:13.200 information operations that have come out, that have been exposed, unsurprisingly, given your
00:45:19.520 interest, Sam, in kind of the culture wars and wokeness, is that a lot of kind of unemployed
00:45:26.160 American journalists who had lost their job due to COVID, were now working for a kind of social
00:45:33.400 justice oriented left wing news network, in favor of BLM. And it turned out that actually,
00:45:40.920 that entire network was fabricated, and the Russians were behind it. So these unwitting
00:45:46.500 Americans who genuinely have good intentions are being co-opted into something that is actually
00:45:53.560 being run by Russian intelligence. And I suppose with our information ecosystem right now, it's so
00:46:00.700 much easier to actually infiltrate public life in the United States in a way that wouldn't have been
00:46:06.580 possible in the 1980s. So we don't even know; we're only starting to see the impact of these
00:46:13.280 operations on society. That's not to say that, you know, the Russians created the problems with race,
00:46:20.300 of course not. But do they exploit them? Absolutely. And are other countries also other rogue and
00:46:27.660 authoritarian nation states seeking to do the same? Absolutely. Russia is the best at this kind of
00:46:34.000 information warfare. But other countries are learning quickly. And what's been really interesting
00:46:38.980 for me to watch is, for example, how China has taken an aggressive new interest in pursuing similar
00:46:47.500 disinformation campaigns in Western information spaces. This was something that they didn't do until
00:46:54.600 about last year when the protests started in Hong Kong, and then obviously this year with COVID.
00:46:59.300 I think you say in your book that Russian television, RT, is the most watched news channel
00:47:07.080 on YouTube? Yes, it is. So this is another example to me of how quick they were to recognize that the
00:47:17.420 architecture of this new information ecosystem, right, which developed over the turn of the millennium,
00:47:21.940 that's powerful, right? If you'd like to continue listening to this podcast, you'll need to subscribe
00:47:30.800 at SamHarris.org. You'll get access to all full-length episodes of the Making Sense podcast,
00:47:36.220 and to other subscriber-only content, including bonus episodes and AMAs, and the conversations I've
00:47:42.000 been having on the Waking Up app. The Making Sense podcast is ad-free and relies entirely on listener
00:47:47.500 support. And you can subscribe now at SamHarris.org.