Making Sense - Sam Harris - January 02, 2019


#145 — The Information War


Episode Stats

Length

31 minutes

Words per Minute

168.7

Word Count

5,399

Sentence Count

299

Misogynist Sentences

3

Hate Speech Sentences

4


Summary

Renee DiResta is the Director of Research at New Knowledge and the Head of Policy at the non-profit Data for Democracy. She investigates the spread of hyper-partisan and destructive narratives across social networks, and she co-authored a recent report on the Russian disinformation campaign both before and since the 2016 presidential election. She has advised politicians and policymakers, including members of Congress and the State Department, and her work has been featured in the New York Times, the Washington Post, CNN, and many other outlets. She is a member of the Council on Foreign Relations and a Truman National Security Project security fellow, and she holds degrees in computer science and political science from SUNY Stony Brook. As you'll hear, Renee was recommended to me by my friend and former podcast guest, Tristan Harris, as an authority on just what happened with the Russian influence campaign in recent years. And Renee did not disappoint. In this episode, I speak with Renee about how she came to be thinking about the problem of bots, her research into how social platforms are having profound impacts on policy and society, and the specific problem we focus on: the Russian disinformation campaign run by the Internet Research Agency. We don't run ads on the podcast, and it's therefore made possible entirely through the support of our subscribers. If you enjoy what we're doing here, please consider becoming one at samharris.org. Thanks for listening.


Transcript

00:00:00.000 Welcome to the Making Sense Podcast.
00:00:08.820 This is Sam Harris.
00:00:10.880 Just a note to say that if you're hearing this, you are not currently on our subscriber
00:00:14.680 feed and will only be hearing the first part of this conversation.
00:00:18.420 In order to access full episodes of the Making Sense Podcast, you'll need to subscribe at
00:00:22.720 samharris.org.
00:00:24.060 There you'll find our private RSS feed to add to your favorite podcatcher, along with
00:00:28.360 other subscriber-only content.
00:00:30.520 We don't run ads on the podcast, and therefore it's made possible entirely through the support
00:00:34.640 of our subscribers.
00:00:35.880 So if you enjoy what we're doing here, please consider becoming one.
00:00:46.820 Today I am speaking with Renee DiResta.
00:00:50.220 Renee is the Director of Research at New Knowledge and the Head of Policy at the non-profit Data
00:00:55.240 for Democracy.
00:00:55.960 And she investigates the spread of hyper-partisan and destructive narratives across social networks.
00:01:04.480 She's co-authored a recent report on the Russian disinformation campaign, both before and since
00:01:10.700 the 2016 presidential election.
00:01:14.260 And we talk about all that.
00:01:16.480 She's advised politicians and policymakers, members of Congress, the State Department.
00:01:21.660 Her work has been featured in the New York Times and the Washington Post and CNN and many
00:01:27.280 other outlets.
00:01:28.720 She's a member of the Council on Foreign Relations and a Truman National Security Project security
00:01:33.920 fellow.
00:01:34.960 She also holds degrees in computer science and political science from SUNY Stony Brook.
00:01:41.060 As you'll hear, Renee was recommended to me by my friend and former podcast guest, Tristan
00:01:46.240 Harris, who recommended her as an authority on just what happened with the Russian influence
00:01:52.940 campaign in recent years.
00:01:56.000 And Renee did not disappoint.
00:01:59.580 So without further ado, I bring you Renee DiResta.
00:02:03.240 I am here with Renee DiResta.
00:02:11.020 Renee, thanks for coming on the podcast.
00:02:12.940 Thanks for having me, Sam.
00:02:14.120 I was introduced to you through our mutual friend, Tristan Harris.
00:02:17.900 How do you know Tristan?
00:02:19.580 Tristan and I met in mid-2017.
00:02:22.140 I had written an essay about bots, and he read it, and he shared it to Facebook, funny
00:02:30.120 enough, and we discovered that we had about 60 mutual friends, even though we'd never met.
00:02:34.060 And we met for breakfast a couple days later, and he wanted to talk about what I was seeing
00:02:39.100 and the things I was writing about, and how they intersected with his vision of social
00:02:43.860 platforms as having profound impacts on individuals, and my research into how
00:02:46.060 social platforms are having profound impacts on policy and society.
00:02:52.340 And we had breakfast, hit it off, and I think had breakfast again a couple days later.
00:02:56.320 So fast friends.
00:02:58.000 Yeah, well, Tristan is great.
00:02:59.380 So many people will recall he's been on the podcast, and I think he's actually been described
00:03:04.060 as the conscience of Silicon Valley, just in terms of how he has been sounding the alarm
00:03:10.420 on the toxic business model of social media in particular.
00:03:14.980 So you touched on it there for a second, but give us a snapshot of your background and
00:03:20.980 how you come to be thinking about the problem of bots and also just the specific problem
00:03:27.300 we're going to be talking about of the Russian disinformation campaign and hacking of democracy.
00:03:33.900 Yeah, so it's sort of a convoluted way that I got to investigating Russia and disinformation.
00:03:39.420 It actually started back in 2014.
00:03:43.000 I became a mom, and I had just moved to San Francisco a little bit prior, and I had to
00:03:48.260 get my kid onto a preschool waiting list, which is unfortunate.
00:03:50.920 Not always easy.
00:03:51.660 Yeah.
00:03:51.940 Not like a nice preschool, just like a preschool.
00:03:55.600 And I knew California had some anti-vax problems, and I started Googling for the data sets.
00:04:01.380 The California Department of Public Health has public data sets where they tell you vaccination
00:04:05.740 rates in schools.
00:04:07.000 Anyway, I looked and I thought, God, this is a disaster waiting to happen.
00:04:10.980 And lo and behold, a couple months later, the Disneyland measles outbreak, in fact, did
00:04:15.200 happen.
00:04:15.480 And I reached out to my congressman.
00:04:17.640 It was the first time I'd ever done that.
00:04:18.640 And I said, hey, you know, we should have a law for this now.
00:04:22.340 We should eliminate the vaccine opt-outs.
00:04:24.840 And they told me they were introducing something.
00:04:26.600 So I said, great, I'd love to help.
00:04:28.080 You know, I have a data science background.
00:04:29.180 I can maybe be useful as an analyst.
00:04:31.340 And what wound up happening was that there was this extraordinary thing as the bill took
00:04:35.700 shape, which was that the legislators were finding that polling in their districts was
00:04:39.440 about 85 percent positive.
00:04:40.820 Like, people really liked the idea of eliminating what were called personal belief exemptions,
00:04:45.820 the right to just kind of voluntarily opt your kids out.
00:04:48.740 But the social media conversation was like 99 percent negative.
00:04:52.720 It was very hard to even find a single positive tweet or positive Facebook post expressing support
00:04:56.960 for this bill.
00:04:58.180 And so I started looking into why that was and discovered this entire kind of ecosystem
00:05:03.640 of what was this hybrid between almost activism and manipulation.
00:05:08.380 So there were very real activists who had very real points of view.
00:05:13.100 And then they were doing things like using automation.
00:05:15.660 So the reason that they were dominating the Twitter ecosystem was that they were actually
00:05:19.080 turning on automated accounts.
00:05:21.120 So they were just kind of spamming the hashtags so that anytime you searched for anything related
00:05:25.020 to the bill in the hashtag, you would find their content.
00:05:27.320 So this is kind of, you know, this is sort of like a guerrilla marketing tactic.
00:05:31.040 And I thought, how interesting that they were using it and then realized that there were like
00:05:34.460 fake personas in there.
00:05:35.820 There were people pretending to be from California who weren't from California.
00:05:40.380 How were you figuring that out?
00:05:41.980 How were you assessing a fake persona?
00:05:44.140 They were created within days of the bill being introduced.
00:05:47.700 And they existed solely to talk about this bill.
00:05:50.960 And then I discovered these communities on Facebook,
00:05:53.940 things with names like Tweet for Vaccine Freedom,
00:05:55.940 where there were actually moderators in the group who were posting instructions
00:06:00.180 for people from out of state on how they could get involved.
00:06:02.680 And the answer was create a persona, change your location ID to somewhere in California
00:06:07.560 and then start tweeting.
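To make the kind of signals Renee is describing concrete, here is a minimal sketch of how one might flag such personas programmatically: accounts created within days of the bill's introduction, devoted almost entirely to the bill, or claiming a California location that doesn't match their earlier activity. The field names, the seven-day window, and the single-topic threshold are all hypothetical illustrations, not details from her actual analysis.

```python
from datetime import date

# Hypothetical heuristics for the signals described above; field names,
# dates, and thresholds are illustrative, not from the actual analysis.
BILL_INTRODUCED = date(2015, 2, 19)  # placeholder date for the bill's introduction


def looks_like_fake_persona(account: dict) -> bool:
    """Flag accounts matching the rough heuristics described above."""
    days_after_bill = (account["created_on"] - BILL_INTRODUCED).days
    created_near_bill = 0 <= days_after_bill <= 7

    # Share of the account's tweets that are about the bill.
    tweets = account["tweets"]
    bill_share = sum(1 for t in tweets if t["about_bill"]) / max(len(tweets), 1)
    single_topic = bill_share > 0.9

    # Profile claims California, but earlier activity suggests otherwise.
    location_mismatch = (
        account["profile_location"] == "California"
        and account.get("inferred_location") not in (None, "California")
    )

    return created_near_bill and (single_topic or location_mismatch)


# Example: an account created three days after the bill, tweeting only about it.
example = {
    "created_on": date(2015, 2, 22),
    "tweets": [{"about_bill": True}, {"about_bill": True}],
    "profile_location": "California",
    "inferred_location": "Ohio",
}
print(looks_like_fake_persona(example))  # True
```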
00:06:09.200 So they sort of, you know, kind of at the time it seemed brazen.
00:06:12.840 Now it seems so quaint.
00:06:13.860 But these tactics to shape consensus, to really create the illusion that there was a mass
00:06:20.620 consensus in opposition to this bill.
00:06:22.960 And so a very small group of people using social media as an amplifier were able to achieve dominance,
00:06:30.580 to just really own the conversation.
00:06:32.900 And it led me to think this is fascinating because what we have here is this form of activism
00:06:39.040 where there is kind of like a real core and then there's some manipulative tactics
00:06:42.880 layered on top of the real core.
00:06:44.900 But if you're not looking for the manipulation, you don't see it.
00:06:48.780 And most people aren't going looking, you know, they're not digging into this stuff.
00:06:52.240 So it was a kind of a first indication that our policy conversations, our social conversations
00:06:59.600 were not necessarily reflective of, you know, kind of the reality on the ground,
00:07:04.700 the stuff that we were still seeing in the polls.
00:07:06.500 It was an interesting experience.
00:07:08.260 And then a couple of months after that law was all, you know, all done, I got a call
00:07:13.820 from some folks in the Obama administration in the digital service saying, hey, we've read
00:07:20.640 your research because I published about this in Wired.
00:07:23.040 Hey, we've read your research.
00:07:24.140 We'd like you to come down and look at some of the stuff that's going on with ISIS.
00:07:29.300 And I said, you know, I don't know anything about ISIS or about terrorism, candidly.
00:07:33.680 And they said, no, no, you have to understand the tactics are identical.
00:07:36.200 The same kind of, you know, kind of owning the narrative, owning the hashtags, reaching
00:07:42.020 out to people, pulling them into secret Facebook groups.
00:07:44.620 The idea that the terrorists were actually following some of these kind of radicalization
00:07:49.040 pathways, these efforts to kind of dominate the conversation.
00:07:52.480 Anytime there was a real world event related to ISIS, they would get things trending on Twitter.
00:07:57.320 And so people in the administration wanted to understand how this was happening and what
00:08:02.080 they could do about it.
00:08:02.840 So that was how I wound up getting more involved in this in sort of a more official capacity:
00:08:08.160 it was first kind of conspiracy theorists and terrorists.
00:08:11.680 And then Russia. Russia was following the 2016 election.
00:08:17.060 There was a sense that, again, there had been these bizarre bot operations and they were far
00:08:22.420 more nefarious and sophisticated than anyone had realized.
00:08:26.140 Before we get into the Russia case specifically, how do you view the role of social media
00:08:36.660 in this?
00:08:37.480 Do you distinguish between the culpability or the negligence of Twitter versus Facebook
00:08:43.900 versus YouTube?
00:08:45.460 Are there bright lines between how they have misplayed this, or are they very similar in the
00:08:52.040 role they're playing?
00:08:52.620 I think that, you know, they've kind of they've really evolved a lot since 2015.
00:08:58.180 In the early conversations about ISIS, there was a, you know, just to kind of take you back
00:09:03.080 to 2015, the attitude wasn't, oh, God, we've got terrorists on our platform.
00:09:09.180 Let's get ahead of this.
00:09:10.000 Right.
00:09:10.160 It was, you know, Facebook, to its credit, took that attitude from day one.
00:09:13.940 It was just this is a violation of our terms of service.
00:09:15.740 We take down their content.
00:09:16.720 We find them.
00:09:17.260 We shut them down.
00:09:17.880 YouTube would kind of take down the beheading videos as they popped up.
00:09:23.940 Twitter, if you go back and you read articles from 2015, you know, I've been doing a lot
00:09:29.100 of going back and looking at the conversations from that time.
00:09:32.260 You see a lot of sympathy for Twitter and this idea that if you take down ISIS, what comes
00:09:38.960 next?
00:09:39.360 This is a slippery slope and, you know, an interesting koan to ponder.
00:09:45.440 Right.
00:09:46.160 Well, Satan.
00:09:48.600 So, you know, well, I mean, if we take down ISIS, I mean, who knows what we have to take
00:09:51.640 down next?
00:09:52.200 You know, one man's terrorist is another man's freedom fighter.
00:09:54.420 And, you know, and I would be sitting there in these rooms hearing these conversations
00:09:56.800 saying, like, these are beheading videos, you guys.
00:09:59.260 These are terrorist recruiters.
00:10:00.800 These are people who are killing people.
00:10:03.040 What the hell is this conversation?
00:10:04.280 I can't get my head around it.
00:10:05.920 But that's where we were in 2015 and, you know, go back and read things that people
00:10:09.780 like the, you know, entities like the EFF were putting out and you'll see that this was
00:10:14.020 a topic of deep concern.
00:10:16.800 What would, you know, what would happen if we were to silence ISIS?
00:10:22.040 Would we inadvertently silence things that were tangentially related to ISIS?
00:10:25.660 And then from there, would we silence, you know, certain types of expression of Islam and
00:10:30.940 so on and so forth?
00:10:31.640 And it was a very different kind of mindset back then.
00:10:36.740 I think that the context has changed so much over the last year, in part because of stuff
00:10:41.580 like what Tristan is doing and the tech hearings.
00:10:44.320 And I think that 2016 was almost like the sort of, you know, Pearl Harbor that made people
00:10:48.140 realize that, you know, holy shit, this actually does have an impact.
00:10:51.500 And maybe we do have to do something to get ahead of this because everybody's doing it now.
00:10:56.020 Reading recent articles specifically about Facebook makes me think that there is just
00:11:02.020 an insuperable problem here.
00:11:04.100 You can't put enough people on it to appropriately vet the content and the algorithms don't seem
00:11:12.280 to be up to it.
00:11:13.340 And so the mistakes that people plus algorithms are making are so flagrant.
00:11:19.540 I mean, they're preserving, you know, the accounts of known terrorist organizations.
00:11:24.760 They're deleting the accounts of, you know, Muslim reformers or ex-Muslims who simply say
00:11:31.140 something critical about the faith.
00:11:32.880 I mean, there's just people can't figure out which end is up, apparently.
00:11:36.980 And once you view these platforms as publishing platforms that are responsible for their content,
00:11:44.520 it's understandable that you would want to, given the kinds of things we're going to talk
00:11:48.080 about, but I don't know how they solve this.
00:11:52.240 There's a lot of, you know, Tristan and others have done a lot of work on
00:11:55.860 changing the conversation around culpability and accountability.
00:12:01.640 And I think that, again, in 2015, 2016, you know, there would be references to things like
00:12:10.640 the CDA 230, the Communications Decency Act Section 230, that gives them the right to moderate,
00:12:15.320 which they chose to use as their right to not moderate.
00:12:19.060 And the norms, I would say, that evolved in the industry around not wanting to be seen as being
00:12:26.700 censors in any way at the time, which meant that they left a whole lot of stuff up and
00:12:31.500 didn't really do very much digging.
00:12:33.480 And then now the shift, kind of the pendulum swinging hard in the other direction, which is
00:12:38.060 leading to allegations that conservatives are being censored. And, per
00:12:44.840 your point, unsophisticated moderation,
00:12:47.420 I think there was an article about this in the New York Times over the weekend, has led to
00:12:51.000 some disasters where they take down people fighting extremists in Myanmar and leave the
00:12:55.340 extremists up.
00:12:56.300 So, yeah, there's a, I think that the recognition that they are culpable, that fundamental change
00:13:02.380 in the attitudes of the public has led them to start to try to take more responsibility.
00:13:08.700 And right now it's being done in something of kind of a ham-handed way.
00:13:12.620 Yeah, well, they're certainly culpable for the business model. I have kind of less
00:13:19.060 of a view of Twitter here, because Twitter doesn't seem to have its business model together
00:13:24.320 in the way that Facebook does.
00:13:25.360 But clearly Facebook, you know, per Tristan's point, their business model promotes
00:13:32.780 outrage and sensationalism preferentially.
00:13:36.520 And the fact that they continue to do that is just selecting for these crazy conspiratorial
00:13:43.040 divisive voices.
00:13:44.900 And then they're trying to kind of curate against those, but they're still amplifying those because
00:13:49.980 it's their business model.
00:13:50.920 And at least that's the way it seems as of my recent reading of the New York Times.
00:13:55.380 Is that still your understanding of the bad geometry over there?
00:13:59.240 Yeah, I would say that's accurate.
00:14:00.560 So I see a lot of, you know, I try to focus on the disinformation piece.
00:14:04.740 There are some people who work on privacy, some who think about monopoly, you know, a lot
00:14:07.780 of different grievances with tech platforms these days.
00:14:10.360 But I see a lot of the manipulation specifically, I would say, comes from a combination of three
00:14:15.960 things.
00:14:16.600 There's this mass consolidation of audiences on a handful of very few platforms.
00:14:21.260 And that's just because as the web moved from these kind of, you know, decentralization,
00:14:24.540 where there's always been manipulation and disinformation and lies on the Internet, right?
00:14:29.040 But the mass consolidation of audiences onto a very small handful of platforms meant that
00:14:34.520 if you were going to run a manipulative campaign, much like if you were going to run a campaign
00:14:38.680 for, you know, Pepsi, you only had to really blanket five different sites.
00:14:42.840 And then the second piece was the precision targeting, right?
00:14:47.460 So the ads business model, the thing that you're referring to, these are attention brokers,
00:14:51.660 which means they make money if you spend time on the platform.
00:14:54.220 So they gather information about the user in order to show the user things that they want
00:15:00.260 to see so that they stay on the platform.
00:15:02.220 And then also as they're gathering that information, it does double duty in that they can use it to
00:15:06.400 help advertisers target them.
00:15:08.440 And then I would say the last piece of this is the algorithms that you're describing and
00:15:12.760 the fact that for a very, very long time now, they've been very easy to game.
00:15:17.740 And when we think about what you're describing, the idea that that outrage gets clicks, that's
00:15:22.940 true.
00:15:23.660 And the algorithm, particularly things like the recommendation engines, they're not sophisticated
00:15:28.500 enough to know what they're showing.
00:15:31.180 So there is no sense of downstream harm or psychological harm or any other type of harm.
00:15:35.820 All they know is this content gets clicks and this content drives engagement.
00:15:41.040 And if I show this content to this person, they're going to stay on the platform longer.
00:15:44.540 I can, you know, mine them for more data.
00:15:46.600 I can show them more ads.
00:15:47.820 So it's beneficial to them to do this.
00:15:51.380 And I think one of the interesting challenges here is as we think about recommendation engines,
00:15:56.060 that's where there is, in my opinion, a greater sense of culpability and a greater requirement
00:16:03.460 for responsibility on the part of the platforms.
00:16:05.920 And that's because they've moved into acting as a curator, right?
00:16:10.920 They're saying, you should see this.
00:16:13.340 And the recommendation engines, in particular, often surface things that are not necessarily,
00:16:20.680 you know, what we would necessarily want them to be showing.
00:16:23.820 This is how you get at things like, you know, my anti-vaxxers, right?
00:16:27.580 I had an anti-vax account, an account that was active in anti-vax groups.
00:16:31.760 And it didn't engage with any of the people.
00:16:33.660 It just sort of sat in the groups and, you know, kind of observed.
00:16:36.580 And it was being referred into Pizzagate groups.
00:16:40.060 So long before Pizzagate was a matter of national conversation, long before that guy showed up
00:16:45.480 with a gun and shot up a pizza place thinking that Hillary Clinton was running a sex dungeon
00:16:49.180 out of the basement, for these personas that were prone to conspiratorial thinking, the recommendation
00:16:55.540 engine recognized that there was a correlation: people who were prone to, you
00:17:01.080 know, conspiracy type A would be interested in Pizzagate, which we can call conspiracy type
00:17:05.820 B.
00:17:06.480 And then soon enough, QAnon started to show up in the recommendation engine.
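To make the mechanism being described here concrete, below is a toy sketch of recommendation by correlation: if accounts that join group A also tend to join group B, the system suggests B to other members of A. The group names and memberships are invented purely for illustration, and real platform recommendation systems are far more elaborate than this.

```python
from collections import Counter
from itertools import combinations

# Invented group memberships, purely to illustrate co-occurrence-based suggestion.
memberships = {
    "user1": {"anti-vax", "pizzagate"},
    "user2": {"anti-vax", "pizzagate", "qanon"},
    "user3": {"anti-vax"},
    "user4": {"pizzagate", "qanon"},
}

# Count how often each pair of groups shares a member.
co_occurrence = Counter()
for groups in memberships.values():
    for a, b in combinations(sorted(groups), 2):
        co_occurrence[(a, b)] += 1
        co_occurrence[(b, a)] += 1


def recommend(user: str, top_n: int = 2) -> list:
    """Suggest groups the user hasn't joined, ranked by co-membership counts."""
    joined = memberships[user]
    scores = Counter()
    for g in joined:
        for (a, b), count in co_occurrence.items():
            if a == g and b not in joined:
                scores[b] += count
    return [g for g, _ in scores.most_common(top_n)]


# A user who only sits in anti-vax groups gets nudged toward the correlated ones.
print(recommend("user3"))  # ['pizzagate', 'qanon']
```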
00:17:11.360 And so the question becomes, you know, where is the line?
00:17:15.400 And, you know, the platform is actively making a recommendation here; these accounts
00:17:20.160 had never gone and proactively searched for Pizzagate and QAnon.
00:17:23.140 They're being suggested to them.
00:17:25.120 So where is the responsibility?
00:17:28.960 Should we have the recommendation engine not surface that type of content?
00:17:32.220 Or is even making that suggestion a form of censorship?
00:17:35.480 These are the kinds of conversations I think we'll start to see more of in 2019.
00:17:39.660 Okay, well, let's focus on the topic at hand, which is Russian interference in
00:17:45.220 I guess democracies everywhere, but specifically the U.S. presidential election in 2016, and
00:17:51.360 the recent report that you helped produce on this, which runs to 100 pages, and I'll put
00:17:56.860 a link to that where I post this on my blog.
00:17:59.780 First, I just got a big picture, sort of political partisan question.
00:18:04.160 It seems to me that many people, certainly most Trump supporters, continue to doubt whether
00:18:10.620 Russia interfered in anything in 2016.
00:18:13.760 And this is just, you know, this is fake news.
00:18:16.840 Is there any basis for doubt about that at this point?
00:18:22.420 Nope.
00:18:23.260 This is just crystal clear as a matter of what our intelligence services tell us and as a
00:18:28.600 matter of what people like you can ascertain by just studying online behavior.
00:18:33.700 It happened.
00:18:34.360 There's really nothing else to say about it.
00:18:35.800 The intelligence agencies know it happened.
00:18:37.520 Foreign governments know it happened.
00:18:39.420 Researchers know it happened.
00:18:40.560 The platforms acknowledge it happened.
00:18:41.980 I mean, you know, sure, there can be some small group of people who continues to, you
00:18:45.360 know, live like ostriches, but that doesn't mean that it didn't happen.
00:18:48.660 And what do you do with the charge that we do the same thing all the time everywhere ourselves?
00:18:53.940 So there's really nothing to complain about here.
00:18:57.220 Well, I mean, we probably do it to each other at this point, right?
00:19:00.060 There's evidence of that as far back as 2016, you know, some things that, insinuations about
00:19:05.500 Alabama.
00:19:06.100 There's a whole lot of, you know, evidence that domestic groups can and do do this as
00:19:10.840 well.
00:19:11.260 And that's why what I keep going to when I talk about this topic publicly is that this
00:19:16.460 is not a partisan issue.
00:19:17.940 This is not a one, you know, one state, you know, one foreign actor interfering in one
00:19:23.860 moment issue.
00:19:25.000 This is sort of just an ongoing global challenge at this point.
00:19:28.240 If we're speaking specifically about Russia and whether that happened, I think that it's
00:19:35.180 incontrovertible truth at this point.
00:19:37.640 Yeah.
00:19:37.740 And the other thing that seems incontrovertible is that it happened to favor the election of
00:19:44.120 Trump in many obvious ways and in many surprising ways that we'll go into.
00:19:49.660 But they were not playing both sides of this.
00:19:52.020 This was not a pro-Clinton campaign.
00:19:55.220 And in your report, you break down three ways in which their meddling influenced things,
00:20:03.000 or attempted to influence things.
00:20:04.840 We're going to be talking about one of them, but I'll just run through those three quickly
00:20:08.240 and then we'll focus on one.
00:20:10.400 The first is there were attempts to actually hack online voting systems.
00:20:13.960 And, you know, that's been reported on elsewhere.
00:20:17.780 Secondly, there was just this very well-known and consequential cyber attack on the Democratic
00:20:24.400 National Committee and the leaking of that material through WikiLeaks.
00:20:29.320 And that was obviously to the great disadvantage of the Clinton campaign.
00:20:33.040 Then finally, and this is what we're going to focus on, there was just this social influence
00:20:39.060 based on the disinformation campaign of the sort that you've just described, using bots
00:20:45.620 and fake personas and targeting various groups.
00:20:50.860 What was surprising is that when you get into the details of who was targeted and the kinds
00:20:55.600 of messages that were spread, it's fairly sophisticated and, you know, amazingly cynical.
00:21:01.680 There's a kind of morbid fun you can imagine these people were having at our expense in how
00:21:07.040 they played one community against another in American society.
00:21:09.940 So let's focus on this third method.
00:21:14.520 And this was coming from something called the Internet Research Agency.
00:21:19.440 What, we'll call them the IRA as you do in your report.
00:21:22.860 What is the IRA and what were they doing to us?
00:21:26.480 So the IRA is, you can think of them a little bit as a social media marketing agency meets
00:21:33.700 intelligence agency.
00:21:36.140 So what they did, to a large extent, was they kind of built these pages, they built these
00:21:41.460 communities, they built these personas, and they pretended to be Americans, Americans of
00:21:46.320 all stripes.
00:21:47.120 So some were Southern Confederates, some were Texas secessionists, some were Black liberationists.
00:21:53.080 It really, they had all of these personas, they really ran the gamut.
00:21:55.800 What they were doing was they were creating pages to appeal to tribalism.
00:22:02.700 So a lot of the conversation about the IRA over the last two years has referred to this
00:22:07.320 idea that they were exploiting divisions in society.
00:22:10.080 And that's true.
00:22:10.900 But the data set that I had access to, which was provided by the tech platforms to the Senate
00:22:15.000 Intelligence Committee, was the first time that anybody saw the full scope, you know, through
00:22:20.420 the full two and a half years.
00:22:22.320 And what we saw there was not a media marketing, you know, meme shitposter type agency that
00:22:28.480 was just throwing out memes haphazardly and trying to exploit divisions.
00:22:33.500 What they were trying to do was grow tribes.
00:22:35.800 So a little bit, a little bit different.
00:22:38.060 The IRA originally started as an entity that was designed to propagandize to Russian citizens,
00:22:45.860 to Ukrainian citizens, to people who were in Russia's sphere of influence.
00:22:50.700 And the early stuff in the data set, Twitter provided the earliest possible information
00:22:55.320 of the material the companies gave us, was actually Russian language tweets talking about
00:23:00.700 the invasion of Crimea.
00:23:02.840 It was talking about, you know, it was creating conspiracy theories about the downing of the
00:23:06.800 Malaysia Airlines flight MH17.
00:23:08.520 So the early activities of the IRA were very much focused inward, focused domestically.
00:23:14.080 And then around 2015, they turned their energy to the United States in what the Mueller indictment and
00:23:20.260 some of the Eastern District court indictments have been referring to as Project Lakhta.
00:23:25.300 So Project Lakhta was when the effort to grow these American tribes really started.
00:23:31.140 This precedes the election, right?
00:23:32.640 So this precedes Trump's plausible candidacy.
00:23:35.280 And there was still this goal of amplifying tribalism in the U.S.
00:23:42.120 Yeah.
00:23:42.260 So the goal was to create these.
00:23:44.800 So this was a long game.
00:23:46.780 This was not a short-term social media operation to screw around with an election.
00:23:52.580 This was a long game to develop extended relationships, trusted relationships with Americans.
00:23:58.840 And what they did was they created these pages.
00:24:01.280 So an example would be Heart of Texas was a page that really amplified notions of Texas
00:24:09.200 pride.
00:24:10.220 Almost all of their pages, an LGBT page, pages targeting the Black community, pages targeting
00:24:15.900 Confederate aficionados, all of these pages were designed around the idea of pride and
00:24:22.100 pride in whatever particular tribe they were targeting.
00:24:24.580 So the vast majority of the content, particularly in 2015 in the early days, was, you know, we
00:24:31.560 are LGBT and proud.
00:24:32.940 We are Texans and proud.
00:24:35.060 We are proud descendants of Confederates.
00:24:37.400 And so this idea that you should have pride in your tribe was what they reinforced over and
00:24:43.340 over and over and over again.
00:24:45.280 And then you would see them periodically slide in content that was either political or divisive.
00:24:52.520 And sometimes that would be about othering another group.
00:24:56.620 So we are, you know, some of the content targeting the Black community in particular did this.
00:25:03.780 This country is not for us.
00:25:05.240 We're not really part of America.
00:25:07.060 We exist outside of America.
00:25:10.100 And so a lot of exploitation of real grievances tied to real news events.
00:25:15.380 So constant drumbeat of pride plus leveraging real harms to exploit feelings of alienation.
00:25:24.600 Sometimes you would see them do this with political content.
00:25:28.140 So as the primaries heated up, that was where you started to see them weaving in their support
00:25:33.540 for candidate Trump, weaving in their opposition to candidate Clinton.
00:25:38.220 I'm looking at your report now and I'm seeing this list of themes.
00:25:41.760 And I'll just tick off some of these because it's, again, rather diabolical and clever how
00:25:46.300 they were playing both sides of the board here.
00:25:48.160 So they would focus on, you know, the Black community and Black Lives Matter and issues
00:25:52.620 of police brutality.
00:25:53.820 But also they would amplify pro-police, Blue Lives Matter pages.
00:25:59.700 You had anti-refugee messages and, you know, immigration, border issues, Texas culture,
00:26:05.660 as you said, Southern culture, Confederate history, various separatist movements, Muslim
00:26:10.420 issues, LGBT issues, meme culture, red pill culture, gun rights and the Second Amendment,
00:26:18.600 pro-Trump and anti-Clinton, and more anti-Clinton in the form of pro-Bernie Sanders and Jill
00:26:24.720 Stein, Tea Party stuff, religious rights, Native American issues.
00:26:29.580 And all of this is just sowing divisiveness and conflict.
00:26:35.400 Although it really does seem there was, to a surprising degree, a focus on the Black community.
00:26:43.360 Do you have more information about or just an opinion about why that was such an emphasis
00:26:49.120 for them?
00:26:50.300 Yeah.
00:26:50.560 So there were about, there were 81 Facebook pages, 133 Instagram accounts.
00:26:55.680 Of the 81 Facebook pages, 30 focused on the Black community.
00:27:01.440 Now, there were, there were other pages that focused on other kind of traditionally left-leaning
00:27:05.480 groups, as you mentioned, Muslims, Native Americans, Latinos.
00:27:10.300 So there was, you know, there were other kind of non-Black lefty pages.
00:27:14.280 Before we go on, Renee, I just, so those numbers don't sound very large.
00:27:18.280 So 81 Facebook pages sounds like not even a drop in the ocean.
00:27:22.640 I think we should give some sense of the scale of what happened here.
00:27:27.600 Yes.
00:27:27.760 So there were 81 Facebook pages.
00:27:31.000 I think there were about 62,000 posts across them.
00:27:34.220 There were 133 Instagram accounts, 116,000 posts across them.
00:27:38.780 There were about 187 million engagements on the Instagram content and another 75 million
00:27:44.900 engagements on the Facebook content.
00:27:46.840 And an engagement is like a like or a share or a comment.
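As a rough back-of-the-envelope check on those figures, dividing engagements by posts gives the average engagement per post; the inputs are the rounded numbers cited above, so the results are approximate.

```python
# Approximate per-post engagement, using the rounded figures cited above.
instagram_engagements = 187_000_000
instagram_posts = 116_000
facebook_engagements = 75_000_000
facebook_posts = 62_000

print(round(instagram_engagements / instagram_posts))  # ~1612 engagements per Instagram post
print(round(facebook_engagements / facebook_posts))    # ~1210 engagements per Facebook post
```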
00:27:50.040 But the pages, to be totally, totally clear, they had what I would call like a long tail.
00:27:55.880 Like 20 of them were successful enough that they had, you know, in the hundreds of thousands
00:28:04.700 of followers.
00:28:06.280 And then a lot of the remainder, the long tail was just crap.
00:28:09.720 They were just failed pages.
00:28:10.540 And so one of the things that was actually interesting was you could see them in the
00:28:13.880 data set pivoting those pages.
00:28:15.840 So pivoting their failures, going in there and actually saying, like, OK, well, one
00:28:21.240 example is the Army of Jesus page.
00:28:23.020 A lot of people have seen some of the memes of like Hillary fighting Satan.
00:28:25.940 There were about 900 posts by that account before it found Jesus.
00:28:30.440 It started as a Kermit the Frog meme page, you know, memes of like Kermit sipping tea and
00:28:36.260 stuff.
00:28:36.800 And they didn't seem to get enough traction there.
00:28:39.140 They pivoted it to a Simpsons meme page.
00:28:41.820 And it was, you know, sharing these kind of ridiculous Homer Simpson memes again, just
00:28:45.620 like messing around with American culture, seeing what stuck.
00:28:48.680 When that didn't stick, all of a sudden it became a religious page devoted to Jesus.
00:28:53.600 They seemed to have then kind of, like, nailed it.
00:28:56.320 You start to see the memes doing things like "like for Jesus."
00:29:00.240 When you do something like, say, "like for Jesus, share for Jesus," they're getting
00:29:05.500 people to share their content organically.
00:29:07.520 So you actually see them kind of hitting their stride with standard kind of tactics of social
00:29:12.820 media audience growth with examples like this, this Army of Jesus account.
00:29:17.860 So it is absolutely true that many of their pages were complete failures that had no lift.
00:29:22.960 But then some of their pages were actually if you go and you look at the audience reach
00:29:28.020 using things like CrowdTangle and you look at their engagements versus the engagements
00:29:31.400 for other conservative pages or other black media, you do see them kind of popping up in
00:29:36.220 the, you know, top 20, top 50 in terms of engagement overall.
00:29:41.140 So, you know, am I saying these were, like, the best possible pages for this content,
00:29:48.140 for these audiences?
00:29:49.000 No, but what they did do was achieve substantial success with some of them, and
00:29:55.200 they used their successful pages to direct people to their other pages.
00:30:00.600 So with the Black community in particular, they did this, though I can't say
00:30:07.120 effectively, necessarily, because I can't see the conversion data.
00:30:09.780 I know that they showed people these other memes.
00:30:11.640 I don't know if people converted to the page for these other memes.
00:30:15.080 But what they were doing was they were saying, if you like this content from our page Blackstagram
00:30:20.380 that you're following, here's some other, you know, hey, look at this other group called
00:30:24.500 Williams and Calvin.
00:30:25.780 Now, of course, there's no disclosure that the Internet Research Agency is also running
00:30:28.800 Williams and Calvin.
00:30:30.020 And then they're saying, look at this other content from this page called Blacktivist.
00:30:32.860 Look at this other content from this page called Nefertiti's Community.
00:30:35.900 So a lot of this kind of cross-pollination of audiences in an attempt to push people so that
00:30:40.560 if they're following one of their accounts, one of their pages, they're inundated with
00:30:44.940 posts from the others.
00:30:47.320 Right.
00:30:47.400 And they're also amplifying legitimate pages that are highly polarized in their message.
00:30:53.800 So what's cagey here is that they're not only creating their own fake partisan accounts.
00:31:00.000 If you'd like to continue listening to this conversation, you'll need to subscribe at
00:31:06.040 SamHarris.org.
00:31:07.380 Once you do, you'll get access to all full-length episodes of the Making Sense podcast, along
00:31:11.940 with other subscriber-only content, including bonus episodes and AMAs and the conversations
00:31:17.440 I've been having on the Waking Up app.
00:31:19.560 The Making Sense podcast is ad-free and relies entirely on listener support.
00:31:23.640 And you can subscribe now at SamHarris.org.
00:31:30.000 Thank you.