The Joe Rogan Experience - March 12, 2019


Joe Rogan Experience #1263 - Renée DiResta


Episode Stats

Length

2 hours and 7 minutes

Words per Minute

158.66

Word Count

20,263

Sentence Count

1,347

Misogynist Sentences

17

Hate Speech Sentences

14


Summary

In this episode, Joe sits down with Renée DiResta, a researcher who studies online disinformation, to talk about how Russia's Internet Research Agency used American social media platforms to manipulate public opinion, and why the internet is vulnerable to this kind of manipulation by anyone. She describes how she got her start studying anti-vaccine activity in California and ISIS propaganda, how she analyzed the Internet Research Agency data the platforms turned over to the Senate, the tactics the trolls used to build fake tribes and push divisive narratives, and what we can do to stop it. This is an important episode for anyone who wants to understand what's going on with social media and how it affects us.


Transcript

00:00:00.000 People, though.
00:00:00.000 They really are.
00:00:01.000 It's just fucking hard business, especially when you didn't see it coming.
00:00:06.000 Two, one.
00:00:08.000 Hello, Renee.
00:00:10.000 Hello.
00:00:10.000 Thanks for doing this.
00:00:11.000 I really appreciate it.
00:00:12.000 Thanks for having me.
00:00:13.000 I listened to you on Sam Harris's podcast, and I was utterly stunned.
00:00:16.000 I had to listen to it twice, because I just couldn't believe it.
00:00:19.000 Let's get into this from the beginning.
00:00:22.000 How did this start out?
00:00:23.000 How did you start researching these online Russian trolls and bots and all this jazz?
00:00:30.000 Yeah, so a couple years back, in around 2015, I had had my first baby in 2013, and I was getting them on these preschool lists.
00:00:38.000 And what I decided to do was I started looking at anti-vaccine activity in California because I had a kid and I wanted to...
00:00:46.000 You know, put them on preschool lists where I was going to fit with the parents, basically, as someone who vaccinates.
00:00:51.000 And I started looking at the way that small groups were able to kind of disproportionately amplify messages on social channels.
00:01:00.000 And some of this was through very legitimate activity, and then some of it was through really kind of coordinated, deliberate attempts to kind of game ways that algorithms were amplifying content, amplifying particular types of narratives.
00:01:13.000 And I thought it was interesting, and I started writing about it.
00:01:16.000 And I wound up writing about ways in which hashtag gaming, ways in which people were kind of using automation to just be in a hashtag all the time.
00:01:26.000 So it was kind of a way to really gain control of shared voice and what that meant when very small groups of people could achieve this kind of phenomenal amplification and what the pros and cons of that were.
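To make the amplification dynamic she describes concrete, here is a minimal sketch of a share-of-voice check, where a small number of always-on accounts dominate a hashtag. The data layout and the top_n cutoff are illustrative assumptions, not anything from the episode or a real detection tool.

```python
from collections import Counter

def share_of_voice(posts, top_n=10):
    """Fraction of hashtag posts produced by the top_n most active accounts.

    `posts` is a list of (author, text) tuples; a value near 1.0 means a
    handful of accounts dominate the conversation, the pattern described above.
    """
    counts = Counter(author for author, _ in posts)
    total = sum(counts.values())
    top = sum(n for _, n in counts.most_common(top_n))
    return top / total if total else 0.0

# Three always-on automated accounts vs. fifty occasional real posters
posts = [(f"bot{i % 3}", "post") for i in range(300)]
posts += [(f"user{i}", "post") for i in range(50)]
print(round(share_of_voice(posts, top_n=3), 2))  # 0.86: heavy concentration
```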
00:01:36.000 And then this was 2015. So the way that this sort of...
00:01:59.000 To really kind of own a narrative, really push this brand, this digital caliphate, to kind of build it on all social platforms almost simultaneously.
00:02:07.000 And the ways in which information was hopping from one platform to another through kind of deliberate coordination and then also just ways in which information flows kind of contagion style.
00:02:18.000 And I wound up working on thinking about how the government was going to respond to the challenge of terrorist organizations using American social platforms to spread propaganda.
00:02:30.000 So what we came to realize was that there was just this information ecosystem and it had evolved in a certain way over a period of about eight years or so.
00:02:39.000 And the kind of unintended consequences of that.
00:02:42.000 And the way that Russia kind of came into the conversation was around October 2015, when we were thinking about what to do about ISIS, what to do about terrorism, and terrorist, you know, kind of proliferation on social platforms.
00:02:57.000 This was right around when Adrian Chen had written the article, The Agency for the New York Times.
00:03:02.000 And that was one of the first big exposés of the Internet Research Agency.
00:03:05.000 The first time an American journalist had gone over there and actually met the trolls, been in St. Petersburg, and began to write about what was happening over there and the ways that they had pages that were targeting certain facets of American culture.
00:03:18.000 So while we were in D.C. talking about what to do about terrorists using these platforms to spread propaganda...
00:03:25.000 There were beginning to be rumblings that Russian intelligence and, you know, Russian entities were doing the same thing.
00:03:31.000 And so the question became, can we think about ways in which the internet is vulnerable to this type of manipulation by anyone, and then come up with ways to stop it?
00:03:42.000 So that was how the Russia investigation began, was actually around 2015, a handful of people started looking for evidence of Russian bots and trolls on social platforms.
00:03:53.000 2015, if we think about social media and the birth of social media, essentially, it had only been alive for, I mean, was Twitter 2007, I believe?
00:04:02.000 Something like that.
00:04:04.000 So, eight years.
00:04:06.000 Like, eight years of social media, and then all of a sudden they figured out how to game this system, and then they figured out how to use this to make people argue against each other.
00:04:18.000 Yeah, so I think, so there was this, if you go back to like, remember like GeoCities?
00:04:24.000 Yes, sure.
00:04:25.000 Okay, AOL used that.
00:04:26.000 Yeah, of course.
00:04:27.000 So we're probably about the same age.
00:04:29.000 So there have always been, you know, kind of, the thing that was great about the internet, like internet 1.0 we can call it, right, was this idea that everybody was given a platform and you could use your platform, you could put up your blog, you could say whatever you wanted.
00:04:42.000 You didn't necessarily get attention, but you could say whatever you wanted.
00:04:45.000 And so there was this kind of consolidation as social platforms kind of came into existence.
00:04:51.000 Content creators were really excited about the fact that now they not only had this access to write their own stuff, but they also had access to this audience because as the network effects got more and more pronounced, more and more people came to be on social platforms.
00:05:06.000 And it originally wasn't even Facebook.
00:05:07.000 If you remember, it was like, you know, there's like Friendster and MySpace and social networks kind of evolved.
00:05:12.000 When I was in college, Facebook was still limited to like, you know, a handful of like Ivy League schools.
00:05:16.000 And so I wasn't even eligible.
00:05:18.000 And as you watch this consolidation happen, you start to have this information ecosystem really dominated by a handful of companies that grow very large.
00:05:29.000 Because they're providing a service that people really want.
00:05:32.000 But there's a kind of mass consolidation of audiences onto this handful of platforms.
00:05:36.000 So this becomes really interesting for regular people who just want to find their friends, reach people, spread their message, grow an audience.
00:05:45.000 It also becomes really interesting for propagandists and trolls and, in this case, terrorist organizations and state intelligence services.
00:05:52.000 Because instead of reaching the entire internet, they really just kind of have to concentrate their efforts on a handful of platforms.
00:05:58.000 So that consolidation is one of the things that kind of kicks off one of the reasons that we have these problems today.
00:06:05.000 Right.
00:06:06.000 So the fact that there's only a Facebook, a Twitter, an Instagram, and a couple other minor platforms other than YouTube.
00:06:13.000 I mean, anything that you can tell it's an actual person.
00:06:16.000 Like, YouTube is a problem, right?
00:06:18.000 Because you can see it's an actual person.
00:06:19.000 If you're narrating something, you know, if you're in front of the camera and explaining things, people are going to know that you're an actual human being.
00:06:29.000 Mm-hmm.
00:06:30.000 Whereas there's so many of these accounts that I'll go to. Like, I'll watch people get involved in these little online beefs with each other, and then I'll go to some of these accounts, like, this doesn't seem like a real person. And I'll go, and it's like, hashtag MAGA, there's an American eagle in front of a flag, and then you read their stuff, like, wow, this is probably a Russian troll account. And it's strange. You feel like you're not supposed to be seeing this, like you've seen the wiring under the board or something, and then you'll go through the timeline...
00:07:14.000 Yeah, so in 2016, there was a lot of that during the presidential campaign, right?
00:07:20.000 And there was so much that was written, you know, we can go back to the free speech thing we were kind of chatting about before.
00:07:26.000 There was so much that was written about harassment and trolling and negativity and these kind of hordes of accounts that would brigade people and harass them.
00:07:35.000 Of course, a lot of that is just real Americans, right?
00:07:37.000 There are plenty of people who are just assholes on the internet.
00:07:39.000 Sure.
00:07:40.000 But there were actually a fair number of these as we began to do the investigation into the Russian operation.
00:07:47.000 It started on Twitter in about 2014, actually.
00:07:51.000 So 2013-2014, the Internet Research Agency is targeting Russian people.
00:07:57.000 So they're tweeting in Russian at Russian and Ukrainian folks, people in their sphere of influence.
00:08:02.000 So they're already on there.
00:08:03.000 They're already trying this out.
00:08:04.000 And what they're doing is they're creating these accounts.
00:08:08.000 It's kind of wrong to call them bots because they are real people.
00:08:11.000 They're just not what they appear to be.
00:08:13.000 So I think the unfortunate term for it has become like cyborg, like semi-automated.
00:08:17.000 Sometimes it's automated.
00:08:19.000 Sometimes it's a real person.
00:08:20.000 But sock puppet is the other way that we can refer to it, a person pretending to be somebody else.
00:08:26.000 So you have these sock puppets, and they're out there, and they're tweeting in 2014 about the Russian annexation of Crimea, or about MH17, that plane that went down, which Russia, of course, had no idea what happened, and it wasn't their fault at all.
00:08:38.000 And gradually, as they begin to experience what I imagine they thought of was success, that's when you see some of these accounts pivot to targeting Americans.
00:08:49.000 And so in late 2014, early 2015, you start to see the...
00:08:55.000 This strategy that for a long time had been very inwardly focused, making their own people think a certain way or feel a certain way or have a certain experience on the internet, it begins to spread out.
00:09:05.000 It begins to look outwards.
00:09:07.000 And so you start to see these accounts communicating with Americans.
00:09:11.000 And as we were going through the datasets, which the Twitter dataset is public, anyone can go and look at it at this point, you do see some of the accounts that are kind of, you know, that were somewhat notorious for being really virulent, nasty trolls, anti-Semitic trolls going after journalists,
00:09:29.000 you know, some of these accounts.
00:09:32.000 Being revealed as actually being Russian trolls.
00:09:36.000 Now, it doesn't kind of exculpate the actual American trolls that were very much real and active and part of this and expressing their opinion.
00:09:44.000 But you do see that they're mimicking this.
00:09:46.000 They're using that same style of tactic, that harassment to get at real people.
00:09:51.000 And if they do get banned, if their account gets banned, they just simply make another account.
00:09:56.000 They use some sort of a, you know, what is it, a virtual server?
00:10:02.000 What is that called?
00:10:04.000 You mean VPNs?
00:10:45.000 Yeah, so what they're doing is they're operating in communities.
00:10:49.000 So one of the really common criticisms of, you know, people who, a lot of people think that this didn't have a huge impact, didn't, you know, did it swing the election?
00:10:57.000 We have no idea.
00:10:59.000 But what it does do in the communities that it targets is it can change that tone.
00:11:04.000 And that's where you see – I mean, I think everybody's probably had this experience.
00:11:10.000 You're part of a group and then a new person gets added to the group and the dynamic changes.
00:11:13.000 It's very much the same kind of thing, just that these are not real people who are joining the group.
00:11:18.000 And so there's this opportunity to kind of expand the bounds of tolerance just that little bit more, or try to normalize using particular ways of communicating that maybe a group wouldn't naturally gravitate to,
00:11:35.000 but then it does.
00:11:36.000 So there are definitely ways in which any type of troll doing this, doesn't have to be a Russian troll, has this ability to kind of shift the language, shift the community, shift the culture just a little bit.
00:11:48.000 Now, why did the agency do this?
00:11:52.000 And do we know?
00:11:53.000 Do we have someone who's ever left there or become a whistleblower who can give us some information about what the mandate was and how it was carried out?
00:12:03.000 There have been a couple of whistleblowers and actually some investigative journalism in Russia that's covered this.
00:12:09.000 They describe the employees of the Internet Research Agency.
00:12:13.000 So it's a little bit like a social media marketing agency, plus tactics that we would not expect a social media marketing agency to use.
00:12:48.000 She wrote an expose, I believe, on this.
00:12:50.000 And it's described as being much like you would expect if you were doing social media grunt work.
00:12:56.000 You have a certain number of posts per day.
00:12:58.000 You're trying to get a certain amount of engagement.
00:13:01.000 You've got to kind of hit your quotas.
00:13:04.000 Most people are young millennials, the people that work there.
00:13:07.000 They're well-versed in trolling culture.
00:13:09.000 They're well-versed in internet culture.
00:13:10.000 You know, they're up to speed on, like, popular memes and things like that.
00:13:14.000 And so you do see this.
00:13:18.000 And then the other thing that they do, they talk about it in the Mueller indictment, you see some really interesting descriptions of, like, the stand-ups that they have.
00:13:24.000 Stand-up is a thing you do at a tech company where everybody kind of stands up and talks about their goals and responsibilities and blockers and things.
00:13:30.000 And in these stand-ups, they would be sitting there saying things like, if you're targeting black LGBT people, make sure you don't use white people in your image, in your meme, because that's going to trigger them.
00:13:42.000 So trying to get at the very niche rules for communicating authentically in an American community.
00:13:51.000 Yeah.
00:14:09.000 The degree of granularity that they have to recognize that if you are running a black LGBT page and your meme is of white people, you're going to cause some tension and consternation.
00:14:20.000 And assuming that that's not necessarily what you want to be doing, you should go find the meme of black LGBT people to put as your meme for the day.
00:14:28.000 So there's a lot of sophistication.
00:14:31.000 There's a lot of understanding of American culture.
00:14:34.000 And then there's a lot of understanding of trolling culture, and so these things combine to be a rather effective, you know, very effective social media agency.
00:14:43.000 And is there an overwhelming sort of narrative that they're trying to pursue, that they're trying to push?
00:14:49.000 So what we saw, so I did some of the research for the Senate, and the Senate data came from the platforms.
00:14:57.000 So what I had was the attribution was made by the platforms.
00:15:00.000 It wasn't like Renee deciding this was IRA. It was the platforms giving it to our government.
00:15:09.000 Information in there, what it showed was that across all platforms, across Twitter, across Facebook, Instagram, YouTube, they were building up tribes.
00:15:20.000 So they were really working to create distinct communities of distinct types of Americans.
00:15:25.000 And that would be, for example, there's an LGBT page that is very much about LGBT pride.
00:15:31.000 And they created it.
00:15:34.000 And they've, they curate it, and they...
00:15:57.000 LGBT United was the name of the page.
00:15:58.000 It had a matching Instagram account, which you would also expect to see from a media property, right?
00:16:02.000 You would expect them to see in both places.
00:16:05.000 And this...
00:16:06.000 And what were they pushing?
00:16:07.000 It read like a young woman talking about crushes on actresses and things, actually.
00:16:14.000 It was really, besides the sometimes wonky English, virtually indistinguishable from what you would read on any kind of young, millennial-focused...
00:16:25.000 It wasn't, none of it was radical or divisive.
00:16:31.000 It wasn't like, the way that they got the division across was they built these tribes where they're reinforcing in-group dynamics.
00:16:40.000 So you have the LGBT page.
00:16:42.000 You have numerous pages targeting the black community.
00:16:45.000 That was where they spent most of their energy.
00:16:47.000 A lot of pages targeting far-right.
00:16:50.000 So both old far-right, meaning people who are very concerned about what does the future of America look like, and then young far-right, which was much more angry, much more like trolling culture.
00:17:02.000 So they recognize that there's a divide there, that the kinds of memes you're going to use to target younger right-wing audiences are not the same kinds of memes you're going to use to target older right-wing audiences.
00:17:11.000 So there's a tribe for older right-wing, younger right-wing.
00:17:15.000 In the black community, there's a Baptist tribe.
00:17:17.000 There's a black liberation tribe.
00:17:19.000 There's a black women tribe.
00:17:21.000 There's one for people who have incarcerated spouses.
00:17:24.000 There's a Brown Power, I believe was the name of it, page that was very much about Mexican and Chicano culture.
00:17:33.000 There was Native Americans united.
00:17:36.000 And all of these are fake?
00:17:37.000 All these are fake.
00:17:38.000 And what are they trying to do with all these?
00:17:40.000 So you build up this in-group dynamic, and they did this over years.
00:17:44.000 So this was not a short-term thing.
00:17:47.000 They started these pages in 2014-2015 time frame, most of them.
00:17:51.000 They started some other ones that were much more political later.
00:17:54.000 We can talk about the election if you want to.
00:17:56.000 But with this tribal thing, you're building up tribes.
00:18:00.000 So you're saying, like, as black women in America, this is, here's posts about things that we care about.
00:18:06.000 Here's posts about black hair.
00:18:08.000 Here's posts about child rearing.
00:18:10.000 Here's posts about fashion and culture.
00:18:13.000 And then every now and then, there would be a post that would reinforce, like, as black people, we don't do this.
00:18:20.000 And so, or as LGBT people, we don't like this.
00:18:23.000 And so you're building this rapport.
00:18:25.000 So like me and you, we're having a conversation.
00:18:27.000 We're developing a relationship on this page over time.
00:18:30.000 And then I say, like, as this kind of person, we don't believe this.
00:18:35.000 So it's a way to subtly influence by appealing to an in-group dynamic or appealing to, like, as members of this tribe, as LGBT people, of course we hate Mike Pence.
00:18:45.000 As black people, of course we're not going to vote because, you know, we hate Hillary Clinton because we hate her husband.
00:19:02.000 Yes.
00:19:07.000 Yes.
00:19:20.000 And then once they got everybody on board, how many followers do these pages have?
00:19:25.000 So there was kind of a long tail.
00:19:28.000 There were, I think, 88 pages on Facebook and 133 Instagram accounts.
00:19:33.000 And I would say maybe...
00:19:36.000 30 of the Facebook pages had over 1,000 followers, which is not very many.
00:19:40.000 And then maybe the top 10 had upwards of 500,000 followers.
00:19:45.000 So there's the same way you run any social campaign.
00:19:48.000 Sometimes you have hits.
00:19:49.000 Sometimes you have flops.
00:19:50.000 And what was interesting with the flops is you would see them repurpose them.
00:19:53.000 So they would decide, you know, the same way if you're running a social media agency, well, we've got this audience.
00:19:59.000 This page isn't doing so well.
00:20:01.000 Let's like rebrand it a little bit, change it up, try to make it appeal to somebody else.
00:20:06.000 So you do see this.
00:20:08.000 I got this data set and I was going through these Instagram memes and 133,000 of them.
00:20:17.000 And I was, there was a cluster of images of Kermit the Frog.
00:20:24.000 I was like, what the hell is Kermit the Frog doing in here?
00:20:27.000 And so I, so then I go, so this, the way the platforms provide the data is I got like a CSV of the posts and then I got a folder of the images.
00:20:35.000 And so in order to like connect the dots, I had to have the image up on one screen and the, this thing, the CSV up on the other screen.
00:20:43.000 CSV? It's like a spreadsheet.
00:20:45.000 And we turned it into a database that we could track things a little bit more easily across the platforms.
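The workflow she describes, a CSV of post metadata on one screen and a folder of images on the other, is essentially a join. A minimal sketch of how one might automate that linkage and load it into a database; the file names and column names below are hypothetical assumptions, since the actual platform export schema isn't specified in the episode.

```python
import os
import sqlite3
import pandas as pd

# Hypothetical layout, for illustration only: a CSV of post metadata plus a
# folder of image files. The column names below are assumptions, not the
# real export schema.
posts = pd.read_csv("instagram_posts.csv")   # one row per post
image_dir = "images"
available = set(os.listdir(image_dir))

# Attach the on-disk path to each post, so image and metadata sit together
posts["image_path"] = posts["image_file"].map(
    lambda f: os.path.join(image_dir, f) if f in available else None
)

# Load everything into one SQLite database to query across platforms
con = sqlite3.connect("ira_content.db")
posts.to_sql("instagram_posts", con, if_exists="replace", index=False)
con.close()
```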
00:20:53.000 So I have this cluster of Kermit the Frog memes and I go and I look and I realize that they're attributed to an account called Army of Jesus.
00:21:01.000 I thought, well, that's interesting.
00:21:03.000 Some of them were really raunchy.
00:21:05.000 It was like Kermit Miss Piggy.
00:21:08.000 It was just stupid crappy memes attached to Army of Jesus and what the hell is going on here?
00:21:16.000 I keep going through it.
00:21:17.000 Hundreds of Kermit memes.
00:21:19.000 And then I get to a post where they say, this page is owned by Homer Simpson now.
00:21:24.000 Kermit went to jail for being, I don't know, they made some joke.
00:21:29.000 It was stupid.
00:21:29.000 And all of a sudden, the data set turns into Homer Simpson memes.
00:21:34.000 So again, like this kind of raunchy Homer Simpson culture.
00:21:39.000 And again, it's attributed to Army of Jesus.
00:21:41.000 And then I go through all this and realize that they didn't get to actually making Army of Jesus a Jesus-focused page until like 900 posts in.
00:21:49.000 So they just renamed the account at some point.
00:21:52.000 It used to be called Nuts News.
00:21:55.000 Nuts News was what they called it when it was the Kermit the Frog meme page.
00:21:58.000 And then it gets repurposed when they realize Kermit's not doing it.
00:22:01.000 It's not getting the audience they want.
00:22:03.000 Homer Simpson's not getting the audience or engagement they want.
00:22:05.000 And then they pivot over to Jesus.
00:22:08.000 And then all of a sudden, the likes and things start pouring in.
00:22:12.000 So what they're doing is they're actually like either deliberately or they're just creating placeholders.
00:22:19.000 It's kind of a red flag when a brand new account that was created yesterday suddenly starts talking about some highly politically divisive thing or whatever.
00:22:27.000 But if you lay the groundwork and you do it over a period of two years, then somebody who goes and checks to see what the account was, where it came from, how old it is, is going to see something that was two years old.
00:22:38.000 So it's an opportunity to create almost like sleeper accounts where you create them now and then you activate them, you politicize them, you actually put them to use a couple of years in the future.
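The "sleeper account" pattern she outlines, an account created and aged long before it is politicized, suggests a simple heuristic: flag accounts with a long gap between creation and their first political activity. A toy sketch only; the field names and the one-year threshold are illustrative assumptions, not a real detection rule.

```python
from datetime import datetime, timedelta

def looks_like_sleeper(created_at, first_political_post,
                       dormancy=timedelta(days=365)):
    """Flag accounts with a long quiet stretch between creation and their
    first political post, the 'sleeper' pattern described above. The
    one-year threshold is an arbitrary, illustrative choice."""
    return (first_political_post - created_at) >= dormancy

# An account created in 2014 that pivots to politics in 2016 fits the pattern
print(looks_like_sleeper(datetime(2014, 6, 1), datetime(2016, 8, 1)))  # True
```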
00:22:49.000 So we saw all kinds of, we saw this over and over again.
00:22:53.000 There was a Black Guns Matter account that turned into an anonymous account at one point.
00:22:59.000 They were pretending to be anonymous, you know, the hacktivist.
00:23:02.000 So they repurposed this Black Guns Matter page, which just had, it was advocating that black people buy weapons and carry, and it's like a pro-Second Amendment page, but for the black community.
00:23:13.000 And they took that page, when it wasn't getting, I guess, a ton of engagement, and it became, it was called, oh gosh, I don't remember the exact name of the anonymous page, and I don't want to say it was something that's legit, but they pivoted into an anonymous page.
00:23:29.000 And when they do that, do they go back and repurpose the content of the earlier posts?
00:23:35.000 Do they change?
00:23:36.000 That was not clear.
00:23:38.000 That wasn't clear.
00:23:39.000 We didn't get that information from the platforms.
00:23:41.000 There was a lot of stuff that I would have loved to have more insight into.
00:23:46.000 We could see, again, you know, you'd think if you started following an Army of Jesus page and you had all this raunchy Kermit shit from like a year ago, that would raise some flags.
00:23:53.000 I would assume that they scrubbed it and restarted, but I don't know.
00:23:57.000 Your podcast with Sam changed how I look at a lot of the pages that I actually follow, because I follow some pages that have classic cars or something like that, and then I'll see them, and most of it is just photographs of cars, like beautiful old cars,
00:24:13.000 and they'll have a giant following, and then all of a sudden something will get political.
00:24:18.000 And I'll look at it and go, oh, wow.
00:24:21.000 Like, this is probably one of those weird accounts.
00:24:24.000 Like, they're getting people to get engaged with it because it represents something that they're interested in, like classic muscle cars.
00:24:32.000 And then they use it for activism and they use it to get this narrative across.
00:24:38.000 I think, I mean, I've seen it happen with some of mine, too.
00:24:42.000 I think one of the challenges is, like, you want people to be aware that this stuff exists, but you don't want them to be paranoid that it's everywhere.
00:24:51.000 I am paranoid.
00:24:52.000 I know.
00:24:53.000 That's a problem.
00:24:54.000 Everybody's a troll now.
00:24:56.000 I look at this all day long and sometimes I see things and I'm like, what are the odds?
00:25:03.000 And I try to not feel like you're in some Tom Clancy novel.
00:25:07.000 Yes.
00:25:10.000 It's this balance between when you make people aware of it, and I think people deserve to be aware of it, they deserve to understand how this plays out.
00:25:18.000 The flip side of that is you do wind up in these weird, you know, you see it happen on social media now or click into a Trump tweet and you'll see like, you're a Russian bot!
00:25:26.000 No, you're a Russian bot!
00:25:27.000 No, you're...
00:25:29.000 Like, they're probably not Russian bots.
00:25:30.000 You know, everybody you don't like on the internet is not a Russian bot.
00:25:35.000 Exactly.
00:25:36.000 And so that's where you get at the interesting conversations of, you know, in some ways, getting caught.
00:25:46.000 This is one of the challenges with running disinformation campaigns, right?
00:26:10.000 In your information environment, is this real?
00:26:12.000 Is this not?
00:26:26.000 Or you get caught and then there's a, you know, until there's some confidence in the ability of platforms to detect this stuff, there's real concern among everybody that you're encountering something fake.
00:26:41.000 Now, the overwhelming narrative is that the Russians were very much invested in having Trump win.
00:26:51.000 Right.
00:26:51.000 And if they were very much invested in having Trump win, was the reason why they focused so heavily on the African American community?
00:26:58.000 Because the African American community traditionally seems to vote Democrat.
00:27:02.000 So they were trying to do something to break that up or trying to do something to weaken the position of the incumbent or Hillary Clinton and maybe put some emphasis on Jill Stein or some alternative candidates.
00:27:17.000 Yeah, so the way that the political campaign, the political aspect of it played out, so they established, they started building these relationships in 2015. And, you know, they're doing this tribal thing, we've got our in-group, we're part of this community.
00:27:31.000 And then what you start to see them do is early, they're actually, there was a tiny, tiny cluster in the early primaries where they were supporting Rand Paul.
00:27:41.000 And then they pivot to Trump pretty quickly.
00:27:44.000 And probably Rand Paul just didn't poll well and they were like, there's no way to get any lift here.
00:27:49.000 But maybe Trump was getting, you know, some actual lift in the media.
00:27:54.000 And so you see them move into supporting Trump.
00:27:56.000 And then for the remainder of the data set from 2015 through the end, which was mid-2017 or so is when this thing ends, it's adamantly pro-Trump on the right.
00:28:07.000 And on the right, you see not only pro-Trump, but you see them really working to erode support for mainstream or traditional Republicans, traditional conservatives.
00:28:18.000 You see a lot of the memes about like, are you with the cuck-servatives or the conservatives?
00:28:23.000 And so the cuck-servatives, of course, they've got pictures of Lindsey Graham and John McCain.
00:28:28.000 They hate John McCain.
00:28:29.000 John McCain shows up a million times.
00:28:33.000 Is it clear why?
00:28:35.000 Well, I think that they, you know, one of the theories is, and I believe this is probably true, they really strongly disliked Hillary Clinton because there was concern that she would, you know, things that she was saying about increasing freedoms in Russia were very threatening.
00:28:50.000 They thought the best bet to get sanctions removed was Trump.
00:28:53.000 So they had specific outcomes that they were hoping for, and that was one of, you know, so there's always like a political motivation.
00:29:01.000 So there is this narrative around they just want to kind of like screw with American society, create divisions, amplify divisions.
00:29:07.000 When you look at the political content, the clear and sustained support for Trump, and even more than that, the clear disdain for Hillary Clinton, there is not, on Facebook and Instagram, there was not one single pro-Hillary post.
00:29:23.000 Mm-hmm.
00:29:44.000 Because then they're using Bernie Sanders as a way to say this was stolen from him by the evil Clintons or Jill Stein.
00:29:52.000 You know, here's a true independent, real liberal.
00:29:55.000 We should be voting for her if we want to support a woman.
00:29:57.000 So there are these feminism pages really pushing this narrative of Jill Stein.
00:30:01.000 So you have the left-leaning pages, totally anti-Clinton.
00:30:05.000 And then you have the right-leaning pages, staunchly pro-Trump and also strongly anti-Cruz, anti-Rubio, anti-Lindsey Graham, basically anti...
00:30:14.000 Every, what's now called, establishment Republican.
00:30:17.000 And there's this kind of pushing of people to opposite ends of the political spectrum.
00:30:24.000 So this is where you get at the conversation around facilitating polarization.
00:30:31.000 It wasn't enough to just support Donald Trump.
00:30:34.000 It was also necessary to strongly disparage the kind of traditional conservative moderate center right in the course of amplifying the Trump candidacy.
00:30:48.000 Yes it does.
00:31:09.000 Because they never had a king before.
00:31:11.000 Everyone who was running for president was at least mostly dignified.
00:31:17.000 I mean, basically, it's really difficult to go back in time and find someone who isn't.
00:31:23.000 There's no one who insults people like he does.
00:31:26.000 I mean, he insults people's appearances.
00:31:28.000 He calls them losers.
00:31:30.000 He called Stormy Daniels horse face.
00:31:33.000 I mean, he says some outrageous shit.
00:31:35.000 So part of it was me thinking like, wow, maybe he's just ignited and emboldened.
00:31:39.000 I actually had this conversation with my wife today.
00:31:41.000 She was like, it feels like racism is more prevalent.
00:31:45.000 Like, it's more accepted.
00:31:47.000 People feel more emboldened because in their mind they think he is a racist.
00:31:52.000 I can get away with more things.
00:31:54.000 Trump is president.
00:31:55.000 Like, there's actually videos of people saying racist shit and saying, hey, Trump's president now.
00:32:00.000 We can do this.
00:32:01.000 So, I was thinking that, well, maybe that's what it was.
00:32:04.000 It's just sort of like some rare flower that only blooms under the right conditions.
00:32:09.000 Poof!
00:32:10.000 It's back.
00:32:10.000 Right?
00:32:11.000 But...
00:32:33.000 It seems different.
00:32:35.000 Political discourse, discussions online, and social media, the way social media reacted.
00:32:42.000 There was a lot of people that were anti-Obama before either of his elections that he won.
00:32:48.000 But it seemed different.
00:32:50.000 It seemed different to me than this one.
00:32:52.000 This one seemed like We had moved into another level of hostility that I'd never experienced before, and another level of division between the right and the left that I'd never experienced before.
00:33:04.000 And a willingness to engage with really harsh...
00:33:12.000 Nasty comments, and just to dive into it, you would see it all day.
00:33:17.000 I mean, there were certain Twitter followers that I think are pretty much human beings, but I would follow them, and they would just be engaged with people all day long, just shitting on people and criticizing this and insulting that. And it seemed, it seemed dangerous.
00:33:35.000 It seemed like things had moved into a much more aggressive, much more hostile and confrontational sort of chapter in American history.
00:33:45.000 If this is all done at the same time that this is happening, how much of an influence do you think this IRA agency had on all this stuff?
00:33:58.000 That's the question that we would all like the answer to, and I unfortunately can't give it.
00:34:03.000 In your mind, though.
00:34:05.000 Yeah, let me kind of caveat that.
00:34:08.000 The thing that we don't have, that nobody who looks at this on the outside has, is we can't see what people said in response to this stuff.
00:34:17.000 So I've looked at now almost 200,000 of these posts, is what I spent most of last year doing, was this research.
00:34:26.000 And we can see that they have thousands of engagements, thousands of comments, thousands of shares.
00:34:33.000 We have no idea what happened afterwards, and that's the problem.
00:34:37.000 So once the stuff comes down, it's really hard to go back and piece it together.
00:34:42.000 So I can see that there are some, per your point, the really, really just fucking horrible troll accounts that they ran.
00:34:51.000 They didn't necessarily have a lot of followers, but you see them in there like adding people.
00:34:55.000 So they're, you know, putting an @ and then the name of a reporter, @ the name of a prominent person.
00:35:00.000 And so they're in there kind of like draft on the popularity of, you know, famous people basically.
00:35:05.000 And they're just saying like horrible shit.
00:35:09.000 And the tone is so spot on.
00:35:11.000 And one thing that was interesting with a couple of them is if you go and you look at their profile information, which was also made public, they would have a Gab account in their profile.
00:35:22.000 So it was a remarkable piece of the culture in which you see that they're actually sitting on Gab too.
00:35:32.000 And so they can also go and they can draw on.
00:35:34.000 They're in Reddit.
00:35:36.000 900 or something troll accounts were found on Reddit.
00:35:38.000 They're on Tumblr.
00:35:39.000 And so they're just picking the most divisive content, and they're pushing it out into communities.
00:35:46.000 And at the same time, we can see that they're doing it, but we can't see what people do in return.
00:35:51.000 We can't see, did they just block?
00:35:52.000 Did they fight back?
00:35:54.000 Was there a huge...
00:35:55.000 When this happens on a Facebook page, and they're doing something like telling black people not to vote, as black people, we shouldn't vote.
00:36:06.000 What do people say in response?
00:36:07.000 And that's the piece that we don't have.
00:36:09.000 So when we talk about impact, a lot of the impact conversation is really focused on, did this swing the election?
00:36:16.000 We don't have, nothing that I've seen has the answer to that question.
00:36:20.000 The other thing is, but the second question, the thing, when I think about impact, I think you and I agree on this, it also matters how does this change how people relate to each other.
00:36:32.000 And we have no real evidence of, no information on that either.
00:36:37.000 This is the kind of thing that lives in some, you know, Facebook has it.
00:36:40.000 The rest of us haven't seen it.
00:36:41.000 Now, are most of these people, is this mostly Facebook?
00:36:45.000 Is it mostly Twitter?
00:36:47.000 How does it break down?
00:36:48.000 Yeah, so there were, here are my little stats here because I don't want to give you the wrong data.
00:36:53.000 There were 10.5 million tweets, of which about 6 million were original content created by about 3,800 accounts.
00:37:02.000 There were about 133 Instagram accounts with about 116,000 posts and then 81 Facebook pages and 17 YouTube channels with about 1,100 videos.
00:37:18.000 And so they got about 200 million engagements on Instagram and about another 75 million or so on Facebook.
00:37:26.000 Engagements are like likes, shares, comments, reactions, you know.
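For scale, some back-of-envelope arithmetic on the figures she just quoted; the inputs are her numbers, and the per-account and per-post averages are simple division, not anything reported in the episode.

```python
# Figures as quoted above; the averages are simple division, nothing more.
original_tweets = 6_000_000
twitter_accounts = 3_800
instagram_posts = 116_000
instagram_engagements = 200_000_000

print(f"{original_tweets / twitter_accounts:,.0f} original tweets per account")          # ~1,579
print(f"{instagram_engagements / instagram_posts:,.0f} engagements per Instagram post")  # ~1,724
```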
00:37:32.000 So, it's hard to contextualize what we think happened.
00:37:36.000 You know, you can go and you can try to look at how well did this content perform relative to other real authentic media targeting these communities.
00:37:46.000 And what you see with the black community in particular is their Instagram game was really good.
00:37:53.000 So on their Instagram accounts, the top five, three of them targeted the black community and got tens to hundreds of millions of engagements.
00:38:03.000 So I would have to pull up the exact number.
00:38:05.000 Is it mostly memes?
00:38:07.000 Yeah, it's on Instagram, it's all memes.
00:38:11.000 And then, you know, so we have the memes and then we have the text.
00:38:14.000 On Instagram, you can't really share.
00:38:16.000 So it's amazing that they got the kind of engagement that they did, even without the sharing function.
00:38:22.000 One of the things you can do is, if you know the names of the accounts, and a lot of them are out there publicly now, you can actually see them in regram apps.
00:38:32.000 So people were regramming the content.
00:38:35.000 So Facebook says about 20 million people engaged with the Instagram content.
00:38:40.000 But what isn't included in that is all of the regrams of the content that were shared by other accounts.
00:38:47.000 So, the spread and the dispersion of this, it's an interesting thing to try to quantify.
00:38:55.000 Because we have engagement data, but we don't know did it change hearts and minds.
00:39:00.000 We don't know if it influenced people to go follow other accounts.
00:39:04.000 We don't know if it influenced people to not vote.
00:39:07.000 There's just so much more, I think, still to understand about how these operations work.
00:39:13.000 We can assume that it had some impact, right?
00:39:16.000 I mean, as you were saying earlier, when a new person enters into a conversation, it changes the tone of it.
00:39:22.000 How much of what they did was their own original post, and how much of it was commenting on other people's posts?
00:39:29.000 So, I thought you were actually going to ask a different thing there.
00:39:32.000 Well, what did you think I was going to ask?
00:39:33.000 How much of it was them repurposing our own posts, right?
00:39:36.000 Repurposing real American content.
00:39:39.000 Did they do that as well?
00:39:40.000 Yeah, tons of times.
00:39:43.000 So, they created a lot of their own stuff, particularly in the early days.
00:39:47.000 And so, you can actually read the data set.
00:39:49.000 And one of the things, when we started finding these posts, I was struck by...
00:39:54.000 How sometimes it read like ESL and then sometimes it read like perfect, flawless, professional English.
00:40:02.000 And then other times it read like normal English, vernacular, just the way that we would talk to each other.
00:40:06.000 And I started digging into what that was.
00:40:10.000 So when it was vernacular English, when it read like fluent American English, it was usually cribbed from somewhere else.
00:40:18.000 So they would go and they would find a local news story from some obscure local paper, and they would crib and paste that, and so the Facebook post would be that cribbed sentence from the article and then their meme.
00:40:30.000 Maybe they would add a sentence underneath it to give it some kind of context or angle.
00:40:35.000 When they would write their own stuff, you would see the sloppiness.
00:40:38.000 That's where you could see subject-verb agreements not quite there.
00:40:42.000 The ways in which Russian possessives are different than American possessives.
00:40:48.000 The slips there.
00:40:50.000 And then the other thing was the really funny stuff, which was a post that's supposedly written by a Texas secessionist.
00:40:58.000 So you can probably have an image of a Texas secessionist in your mind as I say this.
00:41:02.000 And it would be things like...
00:41:17.000 Right.
00:41:17.000 Right.
00:41:19.000 It is clear that.
00:41:20.000 And I'm like, it reads like, remember you're like in, you know, English in college or something and you've got to like write a formal essay.
00:41:25.000 I was like, okay, come on.
00:41:27.000 You bullshit your way through it.
00:41:28.000 Right.
00:41:28.000 So nobody actually talks like this, especially not, you know, your stereotypical Texas secessionist.
00:41:34.000 So it was funny seeing these incongruities.
00:41:37.000 And that's unfortunately one of the best ways to tell what you're dealing with is actually to kind of look for those incongruities now and see as you read communications online, like, Does this read like an American?
00:41:55.000 Does this read like a communication?
00:41:57.000 And what we started to see was, one way to not get caught for your lousy English or your cultural lack of kind of native language abilities is to just repurpose other people's stuff.
00:42:16.000 And so that's where you would see memes getting shared from on both the right and the left.
00:42:22.000 You know, you'd see a lot of these like Turning Point USA memes that they were repurposing and pushing out.
00:42:26.000 Or you would see Occupy Democrats or the other 98%.
00:42:30.000 So memes from real American pages, real American culture.
00:42:35.000 And they would just sometimes slap a new logo on and just repost it as if it was theirs.
00:42:40.000 So it does in those instances read just like, you know, authentic American content.
00:42:45.000 And in many ways it is authentic American content.
00:42:47.000 How many people are working for this agency?
00:42:50.000 Do we understand?
00:42:52.000 I don't remember.
00:42:53.000 Off the top of my head, it was somewhere between a couple hundred and a thousand, I think.
00:42:57.000 I don't know if it's bigger than that now.
00:42:58.000 They just moved offices.
00:43:02.000 It's a funny story, I guess.
00:43:03.000 They moved offices, and then people started calling in bomb threats to the office.
00:43:10.000 And it was just like every day a new bomb threat would get called in, so they couldn't work, basically.
00:43:14.000 I assume this is like some American intelligence agency just fucking with them.
00:43:18.000 So there's people calling in these bomb threats to try to keep them from working.
00:43:21.000 And I think there was an article that came out really recently that said that Army Cyber, one of our agencies, worked to just take them offline during the midterms, a couple days around the midterms.
00:43:35.000 I wonder if whoever's calling it in is doing it in bad Russian.
00:43:39.000 That'd be so ironic.
00:43:40.000 I thought it was really funny.
00:43:41.000 Like, move to this nice new office building and someone chucked a Molotov cocktail through the window at some point.
00:43:45.000 Oh, of course.
00:43:46.000 Yeah, of course.
00:43:47.000 It's spy versus spy.
00:43:49.000 Yeah.
00:43:51.000 I mean, it only makes sense that in this bizarre and unpredictable and really unprecedented environment that we find ourselves in, that something like this would come up and just sort of throw a monkey wrench into the gears of real conversation online.
00:44:07.000 I mean...
00:44:09.000 It's a really amazing time in that we're getting to see this kind of stuff happen in real time.
00:44:16.000 We're getting to see these sort of weird attempts at manipulating things.
00:44:22.000 And I think in a lot of ways successful, especially with less sophisticated people that don't really understand that they're being trolled and that someone is fucking with them.
00:44:32.000 And there's, it seems, I mean, I've, there's a bunch of accounts that I have bookmarked that I follow, but I don't follow.
00:44:39.000 So I don't follow them online because I don't want them to know I'm following them, but I just go to them.
00:44:43.000 And some of them are so strange.
00:44:46.000 A few of them are flat earth accounts.
00:44:48.000 This is something that I'm finding.
00:44:49.000 Yeah, yeah, yeah.
00:44:50.000 The conspiracy theorists.
00:44:51.000 Yes.
00:44:51.000 And some of them literally have almost no, it's all memes, and they don't say much, if anything, underneath the memes.
00:45:01.000 And I go to it, I'm like, what exactly are they doing here?
00:45:05.000 Like, what exactly are they trying to do with these?
00:45:08.000 Because they just, they're very weird.
00:45:12.000 There was one that I came across.
00:45:13.000 I was looking at the conversation around GMOs.
00:45:18.000 And because we have seen, one of the things that Russia does besides the social bots and the, you know, screwing with, like, Americans directly... so, the House.
00:45:30.000 So this was a Republican House committee, the House Science and Technology Committee, that about a year ago said that they were seeing evidence of both kind of overt propaganda and then ways of disseminating the propaganda.
00:45:44.000 So there's always the dissemination and then the accounts and then the content.
00:45:47.000 So it's like you look at three different things to try to get a handle on whether or not this is real or fake.
00:45:53.000 So when we talk about the accounts, we're looking at are they real people?
00:45:56.000 Are they, you know, automated?
00:45:58.000 Are they not automated?
00:45:59.000 When you're looking at the content, you're usually looking at the domains.
00:46:03.000 And that's kind of the last piece because you don't want to have any kind of bias get in there, but you're just trying to see is it being pushed through like overt Russian propaganda domains?
00:46:12.000 Like their think tanks and things.
00:46:14.000 And then the third is the dissemination pattern.
00:46:16.000 Is it being pushed out through automated accounts?
00:46:18.000 Is it spreading in ways that look anomalous versus how normal information would spread?
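Her three-signal rubric, accounts, content domains, and dissemination pattern, can be sketched as a toy scoring function. Every input and the equal weighting are illustrative assumptions; real attribution work, as she describes it, is far more involved.

```python
def suspicion_score(account_automated: bool,
                    domain_is_known_propaganda: bool,
                    spread_anomalous: bool) -> float:
    """Toy version of the three-part rubric described above: accounts,
    content (domains), and dissemination pattern each contribute one
    equally weighted signal."""
    signals = [account_automated, domain_is_known_propaganda, spread_anomalous]
    return sum(signals) / len(signals)  # 0.0 (no signals) to 1.0 (all three)

# An automated account pushing a known propaganda domain with anomalous spread
print(suspicion_score(True, True, True))    # 1.0
print(suspicion_score(False, True, False))  # 0.33...: one signal alone is weak
```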
00:46:24.000 So one of the things that the House Committee looked at was using that kind of rubric, Russian, you know, these dubious pieces of content and narratives around American strategic industries.
00:46:38.000 So the energy industry, oil and fracking, for example.
00:46:41.000 Or you see a lot of stuff with GMOs and agriculture.
00:46:48.000 This narrative of Putin and Russia being the land of organic plenty, and the United States serving its people toxic, poisoned vegetables, this sort of stuff.
00:46:58.000 And Meanwhile, at the same time, there's competition for who's going to get the, you know, large contract to provide rice to some part of the world.
00:47:08.000 So there's like an economic motivation underlying this kind of narrative.
00:47:12.000 And I was looking at one of these accounts, and it was tweeting an article about Hillary Clinton.
00:47:19.000 A vote for Hillary is a vote for Monsanto.
00:47:21.000 But it was tweeting this just like three months ago or something.
00:47:25.000 It was like mid-2018 or late 2018 when I was looking at this.
00:47:29.000 I'm like, well, that ship sailed a long time ago, guys.
00:47:32.000 Why are we tweeting about Hillary's votes?
00:47:34.000 It's because they're just there to...
00:47:36.000 It was written by a Russian think tank.
00:47:38.000 And so they just have these automated accounts retweeting, repurposing this content from forever ago.
00:47:45.000 And it doesn't even make sense.
00:47:46.000 It's just out there to amplify a particular point of view or bump up...
00:47:52.000 What were they trying to do with the anti-vaccine posts?
00:47:55.000 Yeah, that was an interesting thing.
00:47:57.000 So I would say not much, to be honest.
00:48:02.000 There were 900, maybe 800, I think, tweets about vaccines in the content.
00:48:11.000 And so Facebook and Twitter, you have this, sorry, Facebook and Instagram, you have this building up of tribes.
00:48:19.000 Twitter, you have instead, they're just talking about whatever's popular, right?
00:48:25.000 They're shitposting, they're talking about whatever's current and new, whatever scandal has just broken anywhere, you know.
00:48:31.000 So, Twitter is less about establishing relationships and more about joining the conversation and nudging it.
00:48:38.000 And so, most of the vaccine-related posts, it was not a big theme for them.
00:48:42.000 It wasn't something that was on, like, Facebook and Instagram, where that's where they're really leaning in, like, this is what we want Americans to think about.
00:48:48.000 So, no mention of vaccines on those platforms, not on YouTube.
00:48:51.000 On Twitter, you see it in 2015, funny enough, during the Disneyland measles outbreak.
00:48:56.000 Much like there's a whole lot of conversation around vaccines right now because of the outbreaks in Washington and New York, back in 2015 you saw the same thing.
00:49:04.000 Lots of conversations about measles because the Disneyland thing that happened down here.
00:49:08.000 And so they're in there and they're saying, Vaccinate your kids, don't vaccinate your kids.
00:49:13.000 They had a couple of conspiracy theorist accounts.
00:49:18.000 I am trying to remember the name.
00:49:21.000 It looked like a blonde woman.
00:49:22.000 I think its name was Amy.
00:49:25.000 And Amy was a conspiracy theorist.
00:49:28.000 And Amy was a fake person.
00:49:29.000 Amy was a fake person, yeah.
00:49:31.000 I wish I could remember.
00:49:33.000 It was a Twitter page?
00:49:33.000 Yeah, it was a Twitter account.
00:49:35.000 God, what the hell was her name?
00:49:37.000 Amy Black?
00:49:39.000 Certain of their personas actually got a lot of lift.
00:49:43.000 There was one called Woke Louisa that was a black woman.
00:49:45.000 There was, yeah, I mean, they nail it, right?
00:49:48.000 They're not dumb.
00:49:50.000 There was 10GOP, the fake Tennessee GOP page.
00:49:54.000 How much autonomy do you think these people have that are creating these things?
00:49:57.000 I mean, are they creative?
00:49:59.000 It sounds like some of them are actually pretty funny.
00:50:02.000 Yeah, they're funny.
00:50:03.000 They're funny.
00:50:03.000 That's why they work.
00:50:04.000 Everybody thinks it's just like, you know, incompetent shit.
00:50:07.000 It's not.
00:50:08.000 It's actually really good.
00:50:09.000 That's where, that's I think the thing that, you know, even with whatever, you know, political proclivities you may have, I think you can at least recognize humor, even if it's laughing at your side.
00:50:23.000 And I will say that some of the stuff, especially targeting the right-wing, you know, the right-wing like youth kind of pages, they were funny.
00:50:31.000 Yeah.
00:50:31.000 They really were.
00:51:21.000 And it's like, nice page you've got.
00:51:23.000 Be a shame if anything happened to it.
00:51:25.000 And that's the meme that they're putting out there when they're complaining that their fake page got taken down.
00:51:30.000 There was tons and tons of these memes also about the Russians did it.
00:51:37.000 Mocking the idea that the Russians did it.
00:51:40.000 As the story is beginning to come out, before we've had the tech hearings, before we've had the Mueller indictments, before we've had the investigation, you see these memes where it's like, oh, my speedometer was broken, it must have been the Russians, or a picture of Hillary Clinton,
00:51:57.000 and it's in a little golden book kind of thing, and it's like the whiny child's guide to blaming Russia for your failures.
00:52:03.000 And again, it's funny.
00:52:04.000 The stuff is funny.
00:52:05.000 And they're meta-trolling.
00:52:07.000 And you imagine them sitting there.
00:52:08.000 They have a picture of some buff guy carrying a gun.
00:52:11.000 And they're like, I'm not a Russian troll, man.
00:52:13.000 I'm an American.
00:52:14.000 And I'm like, okay.
00:52:17.000 So you're looking at this and you're like, it's just so spot on.
00:52:21.000 And again, I can't see what the people commented under it.
00:52:24.000 If they were like, right on.
00:52:25.000 Or if they were like, ah, this is bullshit.
00:52:27.000 So that's where you get at the...
00:52:31.000 You know, people think like, oh, I'm too smart to fall for it, or oh, this is targeting those other people.
00:52:37.000 No, it isn't.
00:52:37.000 That's the problem.
00:52:38.000 It's just, it's going to target you with the thing that you're most likely to be receptive to just because of psychological bias and tribal affiliation.
00:52:47.000 And you're not sitting there thinking, how is this person who is purportedly just like me screwing with me?
00:52:54.000 And that's why it does manage to attract a following and get retweeted, get reshared.
00:53:01.000 It's good.
00:53:06.000 Yeah.
00:53:23.000 Evil warlord.
00:53:24.000 Crimea.
00:53:25.000 He invade.
00:53:26.000 We have a four-year-old's understanding.
00:53:30.000 If you just grabbed a person, a random person on the street, a college-educated person, and asked them to describe what's so bad about Russia: wow, it's, like, communist over there or something.
00:53:40.000 They hate us.
00:53:42.000 First of all, they have bombs.
00:53:44.000 There's so little understanding of their culture, but yet they know so much about ours.
00:54:09.000 You know, it's crazy.
00:54:10.000 It's weird.
00:54:12.000 And these people must have, like, a deep education on American culture, American politics.
00:54:19.000 Do you think they're training these kids?
00:54:22.000 Yeah.
00:54:22.000 Yeah, absolutely.
00:54:23.000 So they did a couple things that came out in the Mueller indictment.
00:54:26.000 First of all, a couple people actually came here and did a road trip around America.
00:54:30.000 Oh, wow.
00:54:31.000 Just to learn?
00:54:31.000 Went to Texas.
00:54:32.000 Yeah, yeah, yeah.
00:54:33.000 Basically.
00:54:34.000 Hello, Texas.
00:54:34.000 Tell me where you keep the beef jerky.
00:54:37.000 But that was in, I think, the Mueller indictment from February 2018. There have been three kind of big documents that have come out, two from the Eastern District and one from Mueller, on how it all worked.
00:54:52.000 I think another misconception is this notion of $100,000 in ads.
00:54:57.000 They spent 18 million dollars in 2017, I believe, was the stat that came out during another one of the Mueller indictments.
00:55:04.000 So the money is not just going for the salaries and the ad buys; the money is also going toward, they were talking about using, kind of, consultants.
00:55:14.000 And this is where you get at this thing that comes out during the stand-up where they're like, black people who are LGBT don't want to see white LGBT memes.
00:55:21.000 And this degree of granularity, the degree of sophistication, but then also what you see them doing is engaging one-on-one.
00:55:29.000 And that's where it crosses the line from social media operation to this is much more like spying.
00:55:37.000 Did you watch The Americans?
00:55:38.000 No, I didn't.
00:55:39.000 Oh, I love that show.
00:55:40.000 You should definitely watch it.
00:55:41.000 I heard it's great.
00:55:41.000 It's great.
00:55:41.000 There's too many things to watch.
00:55:43.000 Totally.
00:55:44.000 But what's interesting is the—it does paint a pretty interesting picture of this couple under deep cover that are engaging with and pretending to be Americans and forming relationships with people.
00:55:56.000 And apparently it's based loosely on real people.
00:55:58.000 Yeah, that's what I've read also.
00:56:00.000 But what you see in the Mueller indictment is the text messages, the Facebook Messenger messages, where they're going back and forth with real activists— And they're saying things like, you know, hey, my ad account got shut down.
00:56:15.000 Can you run some ads for me?
00:56:18.000 Or the, hey, I want to help your protest.
00:56:23.000 We're fellow Black Lives Matter activists and we see you're running a protest up in, I think it was, Ithaca or something, Binghamton.
00:56:30.000 How can we help you?
00:56:31.000 We can give you some money for posters.
00:56:33.000 And they're sending money for posters.
00:56:35.000 Or they reach out to a Trump supporter and they say, like, we think it'd be really funny to have a Hillary Clinton impersonator sitting in a truck, flatbed truck, that's made up to look like a jail.
00:56:46.000 Let us, you know, if we give you some money, will you find the Hillary Clinton impersonator and put her in jail and do this Hillary for prison thing?
00:56:54.000 And so this is where another thing that they did was using Facebook events to create real-world protests.
00:57:02.000 So they're not limiting it to shitposting online and making people feel tension online.
00:57:07.000 They're actually sending people out into the real world to create in-the-street conflict.
00:57:12.000 And so one of the things that they did was they coordinated a Facebook event, one for the Texas secessionist page and one for a pro-Muslim page called United Muslims.
00:57:23.000 And on the same day, at the same time, in Texas, they have a rally to support Texas culture and resist the Islamization of Texas across the street from a rally to defend Muslim culture.
00:57:38.000 And so they, like, there's literally no, you know, they just create these Facebook events on these pages, and then they promote them with ad dollars and other things.
00:57:47.000 And you literally, if you go and you look at the Texas reporting from that day, I don't remember if it was dozens or hundreds, but a sufficient number of people showed up that you had, on opposite sides of barricades, with police officers in the center, two groups screaming at each other, because one group is there to resist the Islamization of Texas and the other group is there to defend Muslim culture.
00:58:09.000 So you get two, you know, two opposite sides of the spectrum in the same place at the same time and you literally incite like a mini riot.
00:58:17.000 So there were about 81 of these events where they were holding...
00:58:22.000 Black Lives Matter-style rallies for victims of violence, police violence, memorials for people who were killed by police officers, you know, things that real Americans would do, but this wasn't being done by real Americans.
00:58:35.000 And that's the insidious thing, right, is how does Facebook detect that?
00:58:41.000 How do you, when you see this come-defend-Texas-culture event and you're a diehard, proud Texan, you're not thinking somebody in St. Petersburg is organizing this.
00:58:54.000 And that's the, I mean, I think the idea that this was just some memes doesn't respect the significance of what they were trying to do and how effective they were.
00:59:11.000 With these other things, or even if they're just trying it out just a little bit, just working to see what works, they're always experimenting, they're always trying to find ways to create that tension.
00:59:21.000 And that's the thing that I think is so interesting about this, right?
00:59:25.000 This evolving idea of an information war, where these tactics evolve, and you are really at a disadvantage when it comes to actually detecting them.
00:59:35.000 Yeah, and on the outside, if you're looking at that, you'd say, well, okay, what is their objective?
00:59:39.000 Why would they have this Texas secessionist rally across the street from a pro-Islam rally?
00:59:46.000 Why would they do that?
00:59:47.000 You know, if you're on the outside, you think about the amount of effort that's involved in doing something like this, and they're also doing this with no leaders, right?
00:59:55.000 There's no one there that's running it when they get there.
00:59:57.000 So all the pro-Texas people go, here we are!
01:00:04.000 I think a couple times there were comments on some of the archived pages and things where you could see the screenshots of people being like, dude, you had us all come out there and nobody showed up.
01:00:19.000 Right.
01:00:20.000 Who is in charge?
01:00:22.000 They're probably throwing a lot of things against the wall, hoping that they stick.
01:00:26.000 Well, you see this on the, there was a page called Black Matters.
01:00:31.000 And Black Matters was interesting because they went and made a whole website.
01:00:34.000 So they made a website, blackmattersus.com, which I think is still active.
01:00:40.000 It's dormant.
01:00:40.000 They're not updating it, but I believe you can go and read it.
01:00:44.000 And it was designed to call attention to police brutality type things.
01:00:48.000 And so they have this Black Matters US page.
01:00:50.000 And then there's the Black Matters Facebook page, the Twitter account, the Instagram page, the YouTube channel, the SoundCloud podcast, the Tumblr, the Facebook stickers.
01:01:01.000 They had Facebook stickers that looked like little black panthers, like little cats.
01:01:05.000 Yeah, little black cats.
01:01:06.000 They were actually really cute, very well done.
01:01:09.000 So you have this entire fake media ecosystem that they've just created out of whole cloth, all theirs.
01:01:15.000 And then what they start to do is they start to put up job ads.
01:01:18.000 And so it's come be a designer for us.
01:01:22.000 Come write for us.
01:01:23.000 Come photograph our protests.
01:01:27.000 They have a black guy dressed in a cool outfit, hipster, holding a sign, join Black Matters.
01:01:34.000 You see them go through a couple different logos the same way you would if you were starting a media brand.
01:01:41.000 Yeah.
01:01:58.000 They had a physical fitness thing.
01:01:59.000 It was called Black Fist.
01:02:00.000 And the idea was that it was kind of vaguely militant-esque in that it was supposed to teach black people how to handle themselves at protests should there be police violence, how to fight back.
01:02:15.000 And they actually went and found a guy, a physical fitness, you know, a martial arts guy, and they were paying him via PayPal.
01:02:22.000 So he was running classes for the black community under this Black Fist brand, and they would, like, text him or call him.
01:02:33.000 He played some of the voicemails on TV, actually, I heard them.
01:02:36.000 After my report came out, I think they tracked him down, and he just talks about how they, yeah, they just PayPal'd him, you know, a couple hundred bucks every time he ran a fitness class.
01:02:46.000 What were the voicemails like?
01:02:48.000 It was, um...
01:02:49.000 Hello, we are fellow black men concerned about police.
01:02:52.000 You'd be surprised.
01:02:55.000 They actually had a YouTube channel with two black men named Williams and Calvin, and there was this channel, Williams and Calvin, and...
01:03:02.000 They were actual black men?
01:03:04.000 Actual, yeah, yeah.
01:03:04.000 So they hired these gentlemen?
01:03:06.000 They hired these guys to be a fake YouTube channel.
01:03:10.000 And it was called A Word of Truth, I think was the name of it.
01:03:15.000 And so Williams and Calvin, these two guys, would give their word of truth.
01:03:20.000 And their word of truth was usually about how fucked up America is, which, I mean, there are very real grievances underlying all of this, and that's the problem, right?
01:03:28.000 They have things to exploit.
01:03:31.000 Were they writing these things for these gentlemen?
01:03:33.000 I imagine.
01:03:35.000 I mean, I imagine they were.
01:03:36.000 But they were definitely paying them, and they organized the channel.
01:03:38.000 Seems likely.
01:03:39.000 The channel's organized, yeah.
01:03:40.000 So these guys...
01:03:41.000 Well, that particular one, I think that they were actually...
01:03:43.000 They were in on it.
01:03:44.000 They knew what they were doing.
01:03:45.000 Oh, really?
01:03:46.000 They knew they were working for the Russians?
01:03:48.000 One of the guys who was in that channel...
01:03:51.000 popped up again in 2018, right before the midterms, like maybe even the day before.
01:03:56.000 I'm trying to remember the timeline here.
01:03:58.000 And he made a different video saying he wanted to...
01:04:02.000 This was amazing.
01:04:03.000 Saying he wanted to leave the Internet Research Agency, kind of.
01:04:05.000 So he was saying...
01:04:08.000 Basically, I'm tired of doing this work.
01:04:11.000 I want to do a leak.
01:04:13.000 I want to show you all of the things that the Internet Research Agency has done.
01:04:16.000 And so they actually put out this...
01:04:18.000 So this guy who had been in the Williams and Calvin videos, so people recognized his face, in the 2018 midterms goes and says he wants to leave and he's going to leak all this information and...
01:04:31.000 Sorry, this damn cough.
01:04:32.000 And he wants to confess.
01:04:35.000 I don't remember all the specifics because it was right before my thing came out and I was so busy working.
01:04:41.000 But yeah, he pops up again and he's saying he wants to expose the truth.
01:04:46.000 And I think most people didn't cover it, didn't pay attention.
01:04:51.000 YouTube shut down the channel and deleted the video immediately.
01:04:54.000 Why did they do that?
01:04:55.000 I think that it was seen as another influence operation.
01:04:58.000 You don't trust the intentions.
01:05:00.000 So even him saying that he's going to expose it was probably just another level.
01:05:06.000 Well, what wound up coming out, this is so convoluted.
01:05:11.000 I'm sorry.
01:05:11.000 I know it's hard to explain without visuals.
01:05:14.000 What they wound up doing was they did drop a bunch of docs.
01:05:17.000 So they did release a pile of documents in which they claimed they actually hacked the Mueller investigation and Mueller had nothing.
01:05:25.000 And so this is, again, another kind of convoluted piece of this where they do release information.
01:05:33.000 And so in this particular example, they release information that we believe they actually got through legal discovery.
01:05:39.000 So the documents that the investigation provided to one of the indicted Russians were the documents that they then leaked claiming they had hacked the Mueller investigation.
01:05:48.000 So they're constantly doing these things to generate press, generate attention, create just that degree of people don't know what's real.
01:05:57.000 Or they read the headlines that are then released by the more overt Russian propaganda outlets, and they think that that is the true story, that the Russians hacked the Mueller investigation.
01:06:11.000 So there's always this...
01:06:14.000 How do we create fear, uncertainty, and doubt?
01:06:16.000 How do we throw people off?
01:06:35.000 I do still regularly get surprised by the sheer kind of ballsyness and ingenuity of some of the stuff as it comes to light.
01:06:46.000 Well, it's really fascinating that they went so far as to hire people to make a fake account on YouTube, and hired these black guys to pretend they're doing it on their own when they're really being hired by the Russians. And then when the guys leave,
01:07:03.000 you don't know if they really did leave.
01:07:05.000 You don't know if this is just more bullshit.
01:07:07.000 It's like, like you were saying earlier, if they get you and you buy into it hook, line, and sinker, they win.
01:07:13.000 If they get you to think, well, how much else is bullshit, they still win.
01:07:18.000 Because you're looking at everything with sort of this tainted lens now.
01:07:23.000 And in a sense, that's probably the ultimate goal, is to disrupt our social media environment and to sort of hijack the natural conversations that are taking place.
01:07:37.000 Yeah, and I think it's, I mean, it's effective.
01:07:39.000 There's certain, you know, I was in Estonia last year, and they've been targeted by this stuff for decades now.
01:07:46.000 You know, they have a 25% Russian-speaking population.
01:07:48.000 Most of the news that they get is from Russian media, right, you know, right on the border.
01:07:53.000 They talk a lot about the extreme commitment to educating their citizens, to make them realize that this kind of thing does happen.
01:08:04.000 This is what it usually looks like.
01:08:07.000 Don't share it.
01:08:08.000 Just ignore it.
01:08:09.000 Let it go by.
01:08:11.000 And I don't think we are quite there yet.
01:08:14.000 I think that there are still plenty of people in the country who don't believe it happened, or who for some reason are completely incapable of separating "the Russian social media influence campaign happened" from "it means Donald Trump's election is illegitimate" or "it means Donald Trump colluded."
01:08:35.000 Those are very different statements.
01:08:37.000 You don't have to collude in order for someone somewhere to go and support your candidacy, unsolicited.
01:08:44.000 So you can believe two things simultaneously.
01:08:47.000 One, that Trump did not collude and that his election is perfectly legitimate and that this had no impact.
01:08:52.000 And two, that it still happened.
01:08:54.000 And that, I think, is...
01:08:55.000 I am consistently amazed when I read my social media mentions at...
01:09:04.000 Right.
01:09:09.000 Right.
01:09:12.000 Right.
01:09:27.000 So that's where it plays out very differently depending on which part of the political spectrum you sit on.
01:09:35.000 Well, it falls right into the issue that we have with cognitive dissonance.
01:09:40.000 If we believe in someone or if we want someone to win, especially if it's our team or our person or on our side.
01:09:47.000 You know, I saw a lot of this when Donna Brazile released her book detailing how the DNC sort of rigged the primaries against Bernie Sanders and for Clinton.
01:10:00.000 There were so many people that were Clinton supporters that just didn't want to believe it.
01:10:03.000 I was like, well, why wouldn't you believe this woman?
01:10:05.000 Like, you believed her before when she was supporting Clinton.
01:10:07.000 And then when she leaves, now you won't believe her.
01:10:10.000 It's because it's inconvenient.
01:10:11.000 And we're real weird in our binary view.
01:10:15.000 We want things to be good or bad.
01:10:18.000 One or zero.
01:10:19.000 This is it.
01:10:20.000 And this is a super complex issue.
01:10:23.000 It seems like they've been doing this for a long time, and they've gotten really sophisticated at it.
01:10:29.000 And I think there's a lot of people that have been sucked into it that have no idea that it's actually influenced the way they've formed their own opinions.
01:10:35.000 This is where it gets really strange.
01:10:37.000 People are so malleable.
01:10:38.000 And they're so easily manipulated, many people are, that something like this, a real good, solid, concentrated effort to target these groups that have these very specific interests, can really dig in, form roots, and then spread out.
01:10:57.000 I mean, it's so sophisticated.
01:10:59.000 Their approach leaves you, on one hand, horrified, and on the other hand, deeply impressed.
01:11:04.000 Yep.
01:11:05.000 Me too.
01:11:07.000 Now, was this freaking you out when you had to go over all these memes and you were actually laughing at them?
01:11:13.000 And you're like, God damn it.
01:11:16.000 Well, you know, there's that tweet that goes around every now and then.
01:11:20.000 You don't have to hand it to them.
01:11:21.000 And I'm always like, how do I properly convey recognition for the...
01:11:32.000 I don't think we do ourselves any favors by pretending it all sucked and didn't matter and they're incompetent.
01:11:37.000 I think that you have to acknowledge that you have a sophisticated adversary that is very capable, that is very determined, that is constantly evolving, and to treat that with the degree of respect it deserves.
01:11:47.000 I think that that's just common sense, actually.
01:11:54.000 I read media on both sides of the aisle, and I try to stay current on what memes are percolating in lots of different spaces, in part just because I am always curious about what's organic versus what seems to be disproportionately amplified, or what new communities are popping up.
01:12:12.000 I just think it's an, I think the spread of information among people is just a very interesting thing.
01:12:18.000 It's something that interests me a lot.
01:12:19.000 I think crowd psychology is really interesting.
01:12:22.000 I think ways that crowd psychology has transformed as the internet has kind of come into being, particularly with things like the mass consolidation, the ease with which we can target people.
01:12:34.000 We didn't even really talk about that, but the...
01:12:37.000 One of the things with...
01:12:38.000 Even in the decentralized internet, there's always been propaganda, there's always been crazy conspiracy theories, all this stuff.
01:12:45.000 But it's that you can reach the people who are likely to be receptive to it now.
01:12:49.000 And as people self-select into tribes, particularly in this country right now, one of the things that's remarkable is the way in which once you've...
01:12:58.000 self-selected into that tribe, and this is the media in your ecosystem, and you share it with your friends, and Facebook ensures that the people who see it are the people who are most likely to be receptive to it.
01:13:07.000 Or if you run the ad targeting, you directly send it into the feeds of people most likely to be receptive to it.
01:13:14.000 We have this interesting phenomenon where consolidation, targeting, and then these gameable algorithms mean that it's just...
01:13:23.000 This kind of information goes way farther, way faster than it ever could have in the past, regardless of whether it's Russia pushing it or Iran; we've seen a network of Iranian pages go down recently.
01:13:37.000 We see this globally now.
01:13:40.000 We see countries targeting their own people with it.
01:13:42.000 And it's just, this is the information ecosystem.
01:13:47.000 This is like the new infrastructure for speech.
01:13:50.000 And it, sorry, privileges this kind of sensationalist content.
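To make the gameable-algorithms point concrete, here is a minimal toy sketch, in Python, of how a feed that ranks purely on observed engagement drifts toward whatever provokes the most reactions. Everything in it is invented for illustration (the post labels, the engagement probabilities, and the epsilon-greedy ranking stand-in); it is not any platform's actual ranking code.

```python
# Toy model: an engagement-ranked feed with a little exploration.
# All posts and probabilities are invented purely for illustration.
import random

random.seed(42)

posts = [
    # (label, probability that a viewer engages with the post)
    ("measured policy explainer", 0.02),
    ("cute pet photo", 0.05),
    ("outrage-bait meme", 0.15),
    ("us-vs-them conspiracy meme", 0.20),
]

impressions = {label: 1 for label, _ in posts}  # seed each post once
engagements = {label: 0 for label, _ in posts}

for _ in range(10_000):
    if random.random() < 0.1:
        # Occasionally show something at random (exploration).
        label, p = random.choice(posts)
    else:
        # Otherwise show whatever has the best engagement rate so far,
        # a crude stand-in for "predicted engagement" ranking.
        label, p = max(posts, key=lambda x: engagements[x[0]] / impressions[x[0]])
    impressions[label] += 1
    if random.random() < p:
        engagements[label] += 1

for label, _ in posts:
    print(f"{label:27s} impressions={impressions[label]:6d}")
```

Run it and the high-arousal posts soak up nearly all of the impressions, not because anyone chose them, but because the ranking loop optimizes for reactions; that is the sense in which the infrastructure privileges sensationalist content.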
01:13:56.000 Yeah, do you have one?
01:13:56.000 That'd be great.
01:14:00.000 I have some.
01:14:02.000 Colds are going around.
01:14:04.000 Don't feel bad.
01:14:05.000 I'll get you one.
01:14:06.000 Hold on a second.
01:14:07.000 Who's seen all this stuff?
01:14:10.000 Is this stuff...
01:14:11.000 Obviously, Facebook has checked this out.
01:14:15.000 I'm sure Twitter's aware.
01:14:17.000 What has the reaction been?
01:14:19.000 And is there any sort of a concerted effort to mitigate some of the impact that these sites have?
01:14:25.000 Yeah, lots of it, actually.
01:14:26.000 So, I think in...
01:14:35.000 In 2017 was when we started to find the content, we being independent researchers, I guess, people on the outside of the companies, academics. Investigative journalists would identify the name of a page,
01:14:52.000 and then me and people like me would go and we would scour the internet looking for evidence of what was on that page.
01:14:57.000 So I found a bunch of the stuff on Pinterest, for example, wrote about it.
01:15:01.000 A guy by the name of Jonathan Albright found a CrowdTangle data cache.
01:15:06.000 And with that, we got the names of a bunch more pages, a bunch more posts, and we had some really interesting stuff to work with.
01:15:11.000 Originally, the platforms were very resistant to the idea that this had happened.
01:15:16.000 And so as a result of that, they were...
01:15:23.000 There was, you know, the first thing: in 2016, when Trump gets elected, Twitter goes crazy that night, with people who work at Twitter saying, oh my God, were we responsible for this, which is a very Silicon Valley thing to say.
01:15:40.000 But what I think they meant by that was, their platform had been implicated as hosting Russian bots and fake news and harassment mobs and a number of other things.
01:15:49.000 And there was always the sense that it didn't have an impact and it didn't matter.
01:15:51.000 And so this was the first time that they started to ask the question, did it matter?
01:15:56.000 And then Zuck made that statement.
01:15:59.000 Fake news is a very small percentage of the information on Facebook.
01:16:04.000 And the idea that it could have swung an election was ludicrous.
01:16:08.000 So you have the platforms, kind of the leaders at the platforms, digging in and saying it's inconceivable that this, you know, could have happened.
01:16:19.000 And as the research and the discovery begins to take place over the next nine months or so, you get to when the tech hearings happen.
01:16:30.000 So I worked with a guy by the name of Tristan Harris.
01:16:34.000 He's the one who introduced me to Sam...
01:16:37.000 And he and I started going to D.C. with a third fellow, Roger McNamee, and saying, hey, there's this body of evidence coming out here, and we need to have a hearing.
01:16:51.000 We need to have Congress ask the tech companies to account for what happened.
01:16:56.000 To tell the American people what happened.
01:16:59.000 Because what we're seeing here as outside researchers, what investigative journalists are writing, the things that we're finding just don't line up with the statements that nothing happened and this was all no big deal.
01:17:11.000 And so we start asking for these hearings.
01:17:15.000 And actually, myself and a couple of others then began asking them, in the course of these hearings, can you get them to give you the data?
01:17:23.000 Because the platforms hadn't given the data.
01:17:26.000 So it was that lobbying by concerned citizens and journalists and researchers saying, we have to have some accountability here.
01:17:33.000 We have to have the platforms account for what happened.
01:17:36.000 They have to tell people, because this had become such a politically divisive issue, did it even happen?
01:17:42.000 And we felt like having them actually sit there in front of Congress and account for it would be the first step towards moving forward in a way, but also towards changing the minds of the public and making them realize that what happened on social platforms matters.
01:18:03.000 And it was really interesting to be part of that as it played out.
01:18:10.000 Because one of the things that Senator Blumenthal, one of the senators, did was actually say, Facebook and Twitter have to notify people who engaged with this content.
01:18:19.000 And so there was this idea that...
01:18:23.000 If you are engaging with propagandist content, you should have the right to know.
01:18:28.000 And so they started to push messages.
01:18:30.000 Twitter sent out these emails to all these people saying, you engaged with this Russian troll.
01:18:37.000 And Facebook created a little field, a little page that told people if they had liked or followed a troll page.
01:18:45.000 So it was really trying to get at making the platforms accountable.
01:18:49.000 But they did it outside the platform through email, huh?
01:18:52.000 Which is interesting because I would never read an email that Twitter sends me.
01:18:56.000 Right?
01:18:56.000 You're like, this has just got to be nonsense.
01:18:58.000 I didn't get one, so I maybe...
01:19:00.000 I guess I just got lucky, but...
01:19:04.000 I might have had a multiple day back and forth with some Russian troll.
01:19:09.000 But that was, I think, one of the first steps towards saying, like, how do we make the platforms accountable?
01:19:14.000 Because the idea that platforms should be accountable was not a thing that everybody agreed on in 2015 when we were having this conversation about ISIS. And that's where there's the through line here, which is, and it does connect into some of the speech issues too, which is,
01:19:30.000 what kind of monitoring and moderation do you want the platforms to do?
01:19:43.000 That we're really concerned that if we moderated ISIS trolls on Twitter, now not the beheading videos, there was sort of universal agreement that the beheading videos should come down.
01:19:55.000 But if we took out what were called the ISIS fanboys, which were like 30,000 to 40,000 accounts at their peak... yeah, there's a document called the ISIS Twitter Census, for anyone who wants to actually see the research done on understanding that Twitter network in 2015. There was a sense that,
01:20:11.000 like, one man's terrorist was another man's freedom fighter.
01:20:14.000 And if we took down ISIS fanboys, were we stifling their freedom of speech, freedom of expression, and like, goodness, what would come next?
01:20:22.000 And when you look at that fundamental swing that has happened now in 2018, 2019, there's that same narrative, because originally no moderation was taking place, and now there's a feeling that it's kind of swung too far in the other direction.
01:20:41.000 But the original conversations were really...
01:20:45.000 How do we make Twitter take responsibility for this?
01:20:49.000 And legally, they aren't responsible for it, right?
01:20:54.000 They are legally indemnified against the...
01:20:57.000 They're not responsible for any of the content on their platforms.
01:21:00.000 None of the platforms are.
01:21:01.000 There's a law called Communications Decency Act Section 230, and that says that they're not responsible.
01:21:07.000 They have the right to moderate, but not the obligation to moderate.
01:21:11.000 Because they are indemnified from responsibility.
01:21:14.000 So the question becomes, now that we know that these platforms are used for these kinds of harms and they are used for this kind of interference, where is that balance?
01:21:23.000 What do we want them responsible for monitoring and moderating?
01:21:28.000 And how do we recognize that that is occasionally going to lead to incorrect attributions, people losing accounts and things like that?
01:21:39.000 So...
01:21:41.000 Yeah, they're in a weird conundrum right now where they're trying to keep everything safe and they want to encourage people to communicate on the platform so they want to keep people from harassing folks.
01:21:53.000 But because of that, they've also got these algorithms, and they tend to miss very often, like this whole learn-to-code fiasco, where people are getting banned for life for saying learn to code, which is about as preposterous as it gets.
01:22:10.000 I think the learn-to-code fiasco is going to be the tipping point, the thing a lot of people in the future will point back to when they ask when the heavy-handedness became overreach: learn to code.
01:22:22.000 Because, I mean, Jesus Christ, I mean, if you can't say learn to code...
01:22:25.000 I mean, I look at my mentions, I mean, on any given day, especially like yesterday, I had a vaccine proponent.
01:22:32.000 Yeah, I watched it.
01:22:33.000 Peter Hotez.
01:22:34.000 He's a great doctor.
01:22:34.000 Yeah, Peter's great.
01:22:35.000 And, you know, and...
01:22:36.000 It seemed like the vast majority of the comments were about vaccines and so few about these unchecked diseases that are running rampant in poor communities, and that was the most disturbing aspect of the conversation to me.
01:22:51.000 That there are diseases that rob you of your intellectual capacity, that are extremely common, that as many as 10% of people in these poor neighborhoods have.
01:23:00.000 Almost no discussion.
01:23:01.000 It was all just insults and, you know, you fucking shill, and this and that.
01:23:07.000 I know.
01:23:08.000 My mentions are going to be interesting.
01:23:09.000 Well, they're going to be a disaster today.
01:23:11.000 I know.
01:23:12.000 I know.
01:23:13.000 Well, let me...
01:23:16.000 I think that one of the challenges for the platforms is a lot of things start out, like learn to code.
01:23:23.000 I remember I watched that play out.
01:23:25.000 Covington Catholic was another thing that, I mean, God.
01:23:30.000 With learn to code, there was some of the people who were trolling and just saying learn to code and, you know, whatever.
01:23:35.000 You don't have a right to not be offended.
01:23:37.000 Right.
01:23:37.000 But then there was the other accounts that kind of took it that step further and began to throw in like the ovens and the other stuff with Learn2Code, right?
01:23:46.000 And that's one of the challenges with the platform, which is if you're trying to assess the...
01:23:56.000 Just the content itself.
01:23:58.000 If you start doing keyword bans, you're going to catch a lot of shit that you don't want to catch.
01:24:03.000 But the flip side is, this is the challenge of moderating at scale, which is what side do you come down on?
01:24:13.000 Do you come down on saying 75% of people with hashtag learn to code are just...
01:24:26.000 Yeah.
01:24:37.000 I don't know that there's an easy answer.
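As a concrete illustration of why keyword bans over-match, here is a minimal sketch. The posts and the naive_filter function are invented for illustration, not any platform's real moderation code; a string match sees three identical phrases, and only one is part of a pile-on.

```python
# Naive keyword moderation: flag any post containing a banned phrase.
# Invented example; real systems are more elaborate but hit the same limit.
BANNED_PHRASES = ["learn to code"]

def naive_filter(post: str) -> bool:
    """Return True if the post contains any banned phrase, ignoring case."""
    text = post.lower()
    return any(phrase in text for phrase in BANNED_PHRASES)

posts = [
    "learn to code, lol",                              # pile-on aimed at a laid-off journalist
    "I quit my job to learn to code this year.",       # sincere life update
    "What are the best free sites to learn to code?",  # benign question
]

for p in posts:
    print(naive_filter(p), "--", p)  # all three are flagged
```

All three come back True. The harassing intent lives in context (who the post is aimed at, the reply thread, coordinated timing), which is exactly the information a keyword can't see, and why choosing which side to err on is so fraught.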
01:24:39.000 I think that we are, you know, even today, what was the latest kerfuffle?
01:24:44.000 Elizabeth Warren got an ad taken down on Facebook and then there was a whole conversation about was Facebook censoring Elizabeth Warren.
01:24:51.000 I personally didn't think that it read like censorship.
01:24:53.000 What was the ad about?
01:24:54.000 It was an ad about, funny enough, her platform to break up Facebook.
01:25:00.000 Whoa, so Facebook took that down?
01:25:02.000 Like, yeah, listen, Hooker.
01:25:05.000 It sort of read more like a cell phone.
01:25:07.000 Like, she had a picture of Facebook's logo in the image, and that violates the ad's terms of service.
01:25:15.000 And the reason behind that is actually because Facebook doesn't want people putting up ads that have the Facebook logo in it, because that's how you scam people, right?
01:25:23.000 That's a great way to rip people off.
01:25:25.000 And so, probably just like an automated...
01:25:28.000 You know, an automated takedown, where it halts the ad.
01:25:31.000 You have to go and make some changes and then you can push the ad back out again.
01:25:34.000 But it just happens at a time when there's so little assumption of good faith, such extreme anger and polarization, and, you know, an assumption that the platforms are censoring with every little kind of moderation snafu, that it makes it,
01:25:54.000 I think...
01:26:00.000 I think...
01:26:17.000 I don't have any good answers.
01:26:18.000 No one does.
01:26:19.000 That's part of the issue.
01:26:20.000 And Vijaya discussed that pretty much in depth when she was saying this is about moderating at scale, when you're talking about millions and millions and millions of posts and a couple thousand people working for the organization.
01:26:35.000 And then algorithms and machine learning are trying to keep up, and that's where things like learn to code happen. People are so outraged and pissed off because, when they do get banned, they feel like they've been targeted, but you really just ran into some code. And then it's really hard to get someone to pay attention to your appeal, because there aren't enough people looking at these appeals, and there are probably millions of appeals every day.
01:26:59.000 It's almost impossible.
01:27:01.000 Yeah, and there's, you know, depending on which side you're on, you also hear, like, this person is harassing me and I'm demanding moderation and nobody's doing anything about it.
01:27:11.000 Right, yes.
01:27:11.000 You hear that.
01:27:19.000 It's interesting to look back at 2016 and wonder how much of where we are now is in part because not a whole lot happened in 2016. In 2015 in particular, there was very light, like almost no moderation,
01:27:34.000 just kind of let it all hang out there.
01:27:36.000 And I look at it...
01:27:39.000 I look at it now, particularly as it evolves into this conversation about free speech, public squares, and what the new kind of infrastructure for speech, what rights we should expect on it.
01:27:53.000 It's a really tough...
01:28:00.000 You know, I think some of it is almost like the people who hear the words free speech just assume that it's people asking for a carte blanche right to harass, and then saying, you know, how do we balance that?
01:28:13.000 I think Jack and Vijaya were saying this on your show.
01:28:17.000 How do we maximize the number of people who are involved and make sure that all voices do get heard without being unnecessarily heavy-handed and moderating a thought or content and instead moderate behavior?
01:28:31.000 And instead moderate particular types of signatures, things that are inauthentic or coordinated. And looking at this, again, gets to disinformation too: rather than trying to police disinformation by looking at content, you look instead at actions and behavior and account authenticity and dissemination patterns, because a lot of the worst trolls are just using these throwaway accounts and then they disappear.
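A minimal sketch of what moderating behavior rather than content could look like. The signals, weights, and thresholds below are invented for illustration (this is not any platform's real detection pipeline); the point is that an account can be scored for inauthenticity without reading a word it posts.

```python
# Score accounts on dissemination-pattern signals, not on what they say.
# All signals, weights, and thresholds are invented for illustration.
from dataclasses import dataclass

@dataclass
class Account:
    name: str
    age_days: int           # throwaway accounts tend to be very young
    posts_per_day: float    # automation posts at inhuman rates
    duplicate_ratio: float  # share of posts identical to other accounts' posts
    burst_sync: float       # 0..1: how tightly activity clusters with a known network

def inauthenticity_score(a: Account) -> float:
    score = 0.0
    if a.age_days < 30:
        score += 1.0                  # brand-new account
    if a.posts_per_day > 50:
        score += 1.0                  # inhuman posting rate
    score += 2.0 * a.duplicate_ratio  # copy-paste across accounts
    score += 2.0 * a.burst_sync       # coordinated timing
    return score

accounts = [
    Account("longtime_user", age_days=2400, posts_per_day=3.0,
            duplicate_ratio=0.01, burst_sync=0.05),
    Account("fresh_troll", age_days=5, posts_per_day=90.0,
            duplicate_ratio=0.80, burst_sync=0.90),
]

for a in accounts:
    s = inauthenticity_score(a)
    print(f"{a.name:14s} score={s:.2f} -> {'review' if s > 2.5 else 'ok'}")
```

Nothing in the score depends on the content or the politics of the posts, which is what makes this kind of signal harder to game than a keyword list and safer with respect to speech.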
01:28:57.000 Well, I have the impression myself that when we're talking about censorship, we're talking about moderating content, that really we're talking about this current era, and that what's coming is essentially, we're putting up a small twig fence against a herd of stampeding buffaloes, in terms of the more invasive, the
01:29:28.000 more potent levels of technology that are on the way. I just feel like everything is moving in a very specific direction.
01:29:37.000 And that very specific direction is less boundaries between people and information.
01:29:42.000 And that includes communication.
01:29:44.000 And it's going to be insanely difficult or nearly impossible to moderate in 10 years.
01:29:52.000 I just don't think it's going to be in the wheelhouse.
01:29:57.000 I think we're entering into some weird place where we're either going to have to stay off of social media because it's just too toxic or grow a thick skin and just be able to deal with anything.
01:30:08.000 And then if that's the case, how are we going to be able to differentiate between things that are particularly designed to manipulate us, specifically designed to manipulate us and change our opinions by foreign entities like this Russian troll farm?
01:30:24.000 I do think the...
01:30:25.000 You know, when I think about like...
01:30:29.000 So if we believe that disinformation is in part facilitated by gameable algorithms, consolidation, and then the targeting, the kind of things we've talked about through this conversation, then I think that the algorithms piece,
01:30:45.000 the manipulatable algorithms, that's really squarely the responsibility of the platforms.
01:30:49.000 I don't think that there's any regulation or any kind of framework that's going to come from Congress that's going to address that.
01:30:57.000 Well, that's pretty clear from the Facebook hearings, right?
01:30:59.000 I mean, they barely understood the difference between an Android phone and an iPhone.
01:31:03.000 They really don't know what's going on.
01:31:05.000 Tim Apple.
01:31:06.000 Yeah.
01:31:07.000 That's the king of the world.
01:31:09.000 Right.
01:31:11.000 He says he did that on purpose, which is even more hilarious.
01:31:15.000 Just say you fucked up, man.
01:31:17.000 I mean, to claim you did it on purpose when you knew Tim Cook's real name wasn't Tim Apple...
01:31:22.000 It is funny calling him Tim Apple, though.
01:31:25.000 I appreciated every CEO in Silicon Valley changing their Twitter handle afterwards.
01:31:30.000 Yeah, that was funny.
01:31:31.000 But I think the...
01:31:32.000 I just lost my train of thought on that, too.
01:31:34.000 Sorry.
01:31:35.000 No, it's okay.
01:31:37.000 You were talking about them...
01:31:39.000 Oh, yeah, regulating.
01:31:40.000 ...gaming algorithms.
01:31:40.000 Regulating.
01:31:41.000 Yeah.
01:31:41.000 So I think that ultimately the algorithmic piece does remain squarely in the purview of the platforms, and that's because...
01:31:50.000 It's an arms race, right?
01:31:51.000 As they change their algorithm a little bit, tweak it for the product function, which they just do in their role as business, there is no regulation that's going to come down fast enough to catch that.
01:32:01.000 I think actually finance is an interesting parallel here.
01:32:05.000 Because in the financial markets, there are these multi-tiered levels of regulation and oversight so that there's always some entity responsible, whether it's the exchange or a self-regulatory organization or the government and the SEC, looking to see if information integrity in the markets is being maintained.
01:32:23.000 There's no shitty algorithm coming in to manipulate people.
01:32:27.000 It's just making sure that we have that level of trust.
01:32:30.000 So I think that right now the tech ecosystem is lacking regulation in all of its forms.
01:32:35.000 So that will likely change.
01:32:37.000 But the argument for decentralization is, I don't know how you execute it.
01:32:42.000 The antitrust thing in particular, as it comes up so much more now.
01:32:47.000 Excuse me. I don't know under what economic grounds you make that claim; that's way outside of my wheelhouse and my area. But there is something to be said for this, you know, return to decentralization in some way.
01:33:02.000 I feel like it lets people have what they want.
01:33:06.000 And it lets you, you know, Reddit's a great example.
01:33:08.000 You have these, it's almost like federalism.
01:33:11.000 You have this central platform, but then you have these little communities under it.
01:33:15.000 And each community has its own norms.
01:33:17.000 Each community has its own rules.
01:33:20.000 Nobody who violates the moderator rules in a subreddit and screams censorship is really taken seriously.
01:33:26.000 Right.
01:33:26.000 This is the rules of the community, you're in the community, there you go.
01:33:29.000 And this was how, in the olden days of the internet, like Usenet and things, you would have, this is the community that you've chosen to be a part of.
01:33:36.000 If you don't like the moderation standards, you go to this other community.
01:33:39.000 I think the concern with consolidation is that people who do get moderated away feel like there's nowhere for them to go.
01:33:48.000 That they've lost access to the entire world.
01:33:52.000 Right.
01:33:53.000 So I think if you have that decentralization, in some ways it stops being quite so much of a freedom of speech issue. If there are 50 different platforms and you fall foul of some sort of norms or standards or community membership in this one,
01:34:16.000 you can go over here to this other one.
01:34:18.000 Then the idea that somebody has moderated you away or deplatformed you or something is much less potent maybe.
01:34:26.000 Well, on Reddit, though, if you dox someone or something along those lines, it has to be very egregious.
01:34:31.000 Again, that federalism thing: you have the light moderation at the lower levels versus the kind of top level, where you're summarily booted off.
01:34:40.000 Yeah, that seems like the best approach, right?
01:34:42.000 It seems like the best approach is to sort of let these communities sort of establish themselves.
01:34:47.000 But even inside those communities, then you have people that gain power...
01:34:51.000 Through moderation, and they start abusing it, and then it becomes some sort of a weird hierarchy.
01:34:59.000 The decentralization in general is probably the right move for all this stuff, but how does that happen with something like Facebook or Twitter without government intervention?
01:35:11.000 And, you know, that's one of the things that Tim Pool was bringing up.
01:35:14.000 Like, if you guys don't do this, if you don't handle this, it's entirely possible somewhere down the line you're going to be regulated by the government.
01:35:22.000 Do you really want that?
01:35:23.000 I think that that's an inevitability.
01:35:25.000 You think so?
01:35:25.000 Yeah, at this point, yeah.
01:35:27.000 What do you think is going to happen?
01:35:27.000 I think it just becomes...
01:35:30.000 Well, you know, honestly, I say that and then I think back to the reality, which is in this Congress with this executive, I don't know how we get any regulation through.
01:35:43.000 I think we've seen some examples, like the Honest Ads Act, which was introduced right before the first tech hearing, if I'm remembering the timeline correctly.
01:35:55.000 So that would have been like Late 2017. And what they said was, you can no longer have a free-for-all with ads on social platforms where nobody knows who paid for it or where it's coming from or anything like that.
01:36:14.000 So Senator Klobuchar and Senator Warner, and I think Senator McCain also was part of this, introduce this bill saying that the platforms have to follow the rules that TV and radio already follow.
01:36:26.000 And this is an example of recognizing the role that, you know, these are no longer startups that can't meet these obligations.
01:36:36.000 It used to be that Facebook was exempt from these disclosure requirements because their ads used to be those, remember those tiny postage stamp-sized things that were on the right side of the page?
01:36:47.000 So they had a finding.
01:36:50.000 They were given the same exemption that campaigns get for skywriting and postage and pencils, where it's like literally the form factor of the content makes it such that you can't put the ad disclaimer on there.
01:37:04.000 And it used to be that all of the advertising on Facebook was regulated using that same finding that these postage stamp size things are too small to put the disclosures on.
01:37:15.000 And then, of course, as we know, that evolved into the ads looking much like an organic post, and so now they do have these little things that pop up where you can see why you got targeted and what it is.
01:37:25.000 I think that that's an example of the credible threat of regulation and the public opinion moving the platform to take an action that it wouldn't have necessarily done on its own.
01:37:35.000 So it's not regulation, but it's...
01:37:40.000 It's a nudge through public opinion and the credible threat of future regulation.
01:37:46.000 We've seen California go after the platforms also recently.
01:37:50.000 There was that California GDPR thing from last year.
01:37:54.000 California state legislature saying we're going to pass a privacy requirement.
01:37:59.000 And they did it.
01:38:02.000 They got it done.
01:38:03.000 What is the privacy requirement?
01:38:05.000 Oh boy.
01:38:07.000 I feel like I'm probably not the best person to explain this because I don't know the specifics, but the GDPR was the law in the UK and in Europe that protects the data.
01:38:18.000 It creates particular protections like you have to re-opt in for targeting.
01:38:24.000 There are certain kinds of targeting that they can't do, certain types of data that you can request they delete.
01:38:28.000 So this is a provision that took effect in Europe last year.
01:38:32.000 We don't have that same law here in the U.S. We don't have the same data protections as the Europeans.
01:38:37.000 And so California GDPR was the California state legislature passing a law that basically mimicked a lot of the provisions of what the Europeans were given under GDPR. But what that did was create a law that applied to the people of California.
01:38:55.000 And so Facebook and Twitter and the others don't want to be in a position of having to have this, you know, kind of balkanization of legal requirements.
01:39:05.000 And so they in turn have now, I believe, gone to Congress suggesting that we're going to need to have a federal solution that applies to all of the U.S., a federal-level privacy regulation, because they don't want to have to adhere to the privacy regulations of each individual U.S. state.
01:39:22.000 One solution that's been tossed up was that people would have to somehow or another confirm their identity.
01:39:30.000 That instead of it being an anonymous post, that it would have to say, Renee DiResta.
01:39:35.000 Like, I know who you are.
01:39:37.000 You have to have a photograph, have some sort of a government ID that shows that it's you.
01:39:42.000 That we would somehow or another minimize trolling, minimize disinformation if your account was connected to a social security number or whatever it was.
01:39:53.000 The problem, of course, is that these damn things get hacked all the time.
01:39:56.000 And if your Twitter account gets hacked, now they have your social security number, they have your address, they have your information that you use to sign up.
01:40:06.000 Unless there's some way, and there isn't, to absolutely lock down all that data and make sure that it's inaccessible to some sort of third party, that doesn't seem like a likely course of action either.
01:40:19.000 And I think...
01:40:22.000 The question of identity, I think most social science research that I've read has suggested that that's not necessarily the be-all and end-all.
01:40:32.000 I think it depends on, per your point, what you're trying to do.
01:40:37.000 I'm thinking right now of the, there is this request for...
01:40:59.000 You know those crappy data brokers, those horrible things where your name and your address is up there and no matter how hard you try, you can't get it down.
01:41:07.000 So scraping those to grab names and addresses and then email addresses and leaving comments pretending to be those people.
01:41:17.000 It's hard.
01:41:18.000 Most people don't want to enter their social security number into some form or validation.
01:41:24.000 A lot of people will point to things like, well, America has a strong commitment to anonymous speech.
01:41:33.000 So there's that cultural thing.
01:41:35.000 People will point to Federalist Papers and so on and so forth.
01:41:39.000 And whistleblowers.
01:41:41.000 And whistleblowers, yeah.
01:41:42.000 I think I've seen...
01:41:44.000 I remember when Facebook did make it a requirement to...
01:41:47.000 You have to validate your actual name and address.
01:41:50.000 They send a postcard to your house if you want to run political ads.
01:41:54.000 And then I remember people complaining that people who didn't have...
01:41:59.000 That this was going to...
01:42:01.000 What was it?
01:42:02.000 God, this was during the DACA arguments.
01:42:06.000 DACA? Yeah, during the...
01:42:09.000 This was during some of the illegal immigration debates.
01:42:11.000 Right as this was happening, people began complaining that immigration activists who were undocumented would not be able to run Facebook ads because they didn't have identification to verify with.
01:42:22.000 So no matter what people put out, there's going to be somebody who has a complaint about it.
01:42:29.000 Sure.
01:42:29.000 So we're in this gridlock.
01:42:34.000 Everybody recognizes that the situation sucks and that social media is a disaster on a myriad number of fronts.
01:42:43.000 And there's not much in the way of plausible solutions.
01:42:51.000 I think...
01:42:52.000 For disinformation in particular, just to stay in my wheelhouse, we're trying to push towards multi-stakeholderism, which is just to say, can we create the back channels of communication that have come up for election integrity and things over the last few years,
01:43:09.000 last year and a half?
01:43:11.000 Can we standardize that in some way?
01:43:13.000 Can we create an oversight body, maybe the FTC, that is at least responsible for having some oversight to make sure the platforms are doing enough But this is, I think, this is going to be the theme of 2019. Does it go the antitrust route?
01:43:28.000 Does it go the privacy route?
01:43:29.000 Like, does it do a kind of hybrid combination of multiple, you know, tackling multiple problems at once?
01:43:37.000 I'm really curious to see how we shake this out because it just seems like no, you know, even agreeing on what the problem is, we're not quite there yet.
01:43:47.000 Yeah, and you are seeing calls, particularly from Elizabeth Warren, for breaking up a lot of these larger institutions, not just even social media, but even Amazon.
01:43:59.000 She's talking about breaking up a lot of these bigger companies.
01:44:03.000 It's...
01:44:05.000 The problem with that is like, to what?
01:44:07.000 And make them what?
01:44:08.000 And then what happens?
01:44:10.000 And then, you know, what if one of those things that you broke up that becomes Twitter becomes more popular than the other Twitter, and it has much more attendance, then what do you do?
01:44:20.000 Yeah, you should get one of those people on to just talk about that, Lina Khan or somebody who really knows this space in and out, like Matt Stoller, because I just don't.
01:44:32.000 The...
01:44:35.000 I, you know, personally, I feel like a lot of people are moving into smaller communities.
01:44:42.000 A lot of people are moving into groups or moving into WhatsApp chats.
01:44:46.000 They're recognizing that the system as it is right now has this toxicity and are withdrawing a bit.
01:44:56.000 I don't know if you've seen this in your friends or community, but...
01:44:58.000 Well, Jamie and I were actually talking about it yesterday in terms of the use of Twitter.
01:45:04.000 The use of Twitter has dropped.
01:45:06.000 One thing that I notice is that my follower numbers don't move very much on Twitter as opposed to Instagram.
01:45:14.000 I don't really use Facebook, but Instagram, there's a giant difference in how many followers I get per day on either platform.
01:45:22.000 And it seems to me that the people that are using Twitter, they've kind of like locked in, they've found their little communities, and it's mostly toxic.
01:45:31.000 I mean, I'm sure I'm generalizing.
01:45:36.000 I am, for sure.
01:45:37.000 It's probably not even 10% toxic, but it seems toxic.
01:45:41.000 You know, when you look at those kind of comments and anytime something happens, it seems like the reaction to it is very rarely is it some sort of objective, rational discourse.
01:45:51.000 It's most likely just insults and, you know, swears.
01:45:56.000 It's weird.
01:45:57.000 It's just people are communicating in a way online that if they communicated in real life, there would be blood in the streets.
01:46:08.000 I think about that a lot.
01:46:10.000 Do you?
01:46:32.000 A national public square.
01:46:33.000 There's no such thing in the history of America as a national public square.
01:46:38.000 There are regional public squares or town squares, state squares, you know, where there is, again, this kind of federalism.
01:46:46.000 People who have self-selected to live in a particular community, there's norms in that community.
01:46:50.000 But if I were to go up to you in a public square and start screaming in your face or...
01:46:55.000 You know, being an asshole and trying to get a whole mob together to go after you, like probably somebody would intervene, either a bystander or the police.
01:47:03.000 And we have notions of like nuisance and things like that.
01:47:05.000 We have notions of like, there's more of an intuitive sense of the balance between speech, which is to be protected, and then the kind of fighting words and that sort of thing.
01:47:16.000 There's no clear lines on that.
01:47:18.000 There's not much in the way of norms.
01:47:20.000 And when you're online, there is nobody who's going to come and step in and intervene in a way that would play out in real life.
01:47:27.000 So I think that we just haven't quite ported those norms of basic good behavior in the real world into this online space.
01:47:27.000 I think there's actually a carryover to real life from social media that you can find in these protests at universities when conservative speakers come.
01:47:56.000 And then Antifa wants to shut them down, and then you have people like the Proud Boys fight with Antifa.
01:48:02.000 I don't remember that before, ever.
01:48:05.000 I think this is a byproduct of the type of communication that's the norm on social media.
01:48:11.000 I really think that's what's happening here.
01:48:13.000 I really think that instead of...
01:48:15.000 Social media mimicking real life.
01:48:18.000 Real life is starting to mimic the kind of interactions that people have on social media and with violent repercussions.
01:48:24.000 It's certainly, if not violent, very aggressive and angry in a way that...
01:48:29.000 Go back before social media.
01:48:31.000 When was the last...
01:48:32.000 I mean, in 2000, in the 1990s, how often were there these incredibly volatile...
01:48:41.000 protests at universities where you have conservatives and liberals screaming at each other, and you have these people that are being deplatformed, and they won't let them speak at these colleges, and then they're gathering up this online mob to try to bolster support, and then people come to meet them.
01:48:59.000 We're going to stop them at all costs.
01:49:01.000 It's kind of flavoring real life versus real life being represented in social media.
01:49:10.000 Yeah, I don't remember it from when I was in college either.
01:49:12.000 No, it didn't exist.
01:49:14.000 I'm not the expert on college protests.
01:49:18.000 Maybe people would point to the 60s or something.
01:49:20.000 You've got to go back to Kent State.
01:49:21.000 That's what I was going to say.
01:49:22.000 You've got to go back to war protests.
01:49:23.000 But they were protesting something very specific, an unjust war that nobody wanted to be a part of.
01:49:30.000 This is a weird time.
01:49:31.000 Yeah, it is.
01:49:34.000 I know it feels unstable.
01:49:37.000 Yes.
01:49:38.000 Yes, that's the best way to put it.
01:49:39.000 But it's also awesome.
01:49:42.000 I'm enjoying the shit out of it.
01:49:44.000 Why is that?
01:49:45.000 Well, because I love the fact that for the first time in my life, it does not seem like the government has a fucking handle on how people are behaving and thinking at all.
01:49:56.000 They don't know what's going on.
01:49:58.000 It's like you've got people that are...
01:50:01.000 You know, people that are trying out socialist tropes and socialist ideas for the first time in the mainstream, and they're getting a big groundswell of support behind it.
01:50:11.000 And you have a lot of people that are like pro-nationalists and pro-America for the first time in a long time, and that's getting a lot of support.
01:50:19.000 There's more discourse now, even if it's toxic.
01:50:22.000 And I think a lot of it isn't.
01:50:24.000 I think, like I said, if 10% is toxic, it seems like it's all toxic.
01:50:28.000 If 1 out of 10 people calls you a piece of shit, you're like, God, I gotta get out of this fucking forum.
01:50:32.000 Right?
01:50:33.000 I mean, that's really how it feels.
01:50:35.000 I think that's true.
01:50:35.000 I think that's true.
01:50:35.000 There's definitely a...
01:50:36.000 I've noticed that, too.
01:50:38.000 The actual numbers of horrible trolls are...
01:50:44.000 Maybe this afternoon will be different.
01:50:46.000 I don't know.
01:50:47.000 They'll try to prove you wrong.
01:50:51.000 I'm on Twitter a lot, and it's the platform I use more than any other.
01:50:56.000 So for all the complaints, I do feel like there's some real value there for me personally.
01:51:05.000 And I like...
01:51:11.000 I like the serendipity of the unexpected being pushed into my feed occasionally.
01:51:17.000 Sometimes it's, you know, sometimes of course I get angry or, you know, feel annoyed or think, why the hell, you know, why this?
01:51:26.000 But I think ultimately there is a lot of value to the platform.
01:51:31.000 I think, unfortunately... I really do believe that so much of the polarization in the conversation around speech is people who got burned during the laissez-faire days of 2015-2016 mentally linking up the idea of free speech with the idea of being harassed online.
01:51:51.000 And I think when you look at—this is purely an opinion.
01:51:56.000 I have absolutely no data to bear this out.
01:51:57.000 But when you look at those Pew studies that show that younger people are more likely to want safe spaces or less offensive opinions...
01:52:12.000 I do sometimes wonder if that's an effect of coming of age at a point when random assholes were screaming at you on social platforms 24-7 versus like, I didn't have that experience growing up.
01:52:24.000 I didn't have that experience until I was like 25, you know?
01:52:27.000 So maybe there is something to that, per your point that there is not much of a difference.
01:52:37.000 People spend so much time online.
01:52:39.000 This is where you're having your social engagements.
01:52:42.000 This is where you're having your conversations.
01:52:43.000 So, it does shape the way people think about it.
01:52:48.000 You know, their experience of what it means to have a conversation and what it means to speak freely.
01:52:53.000 That's one of the interesting things. It doesn't help that free speech has, in many cases, become a fig leaf for, I want carte blanche to say all kinds of mean shit to people all day long with no consequences.
01:53:09.000 Yeah, they think they should be able to do that because that falls under the blanket of free speech.
01:53:21.000 Yeah, me too.
01:53:38.000 But think about those interactions.
01:53:39.000 That's a shocking number of interactions with people that you're not even in direct physical presence of.
01:53:46.000 You're not looking at them.
01:53:48.000 You're not waiting for them to talk.
01:53:50.000 You're not considering what they're saying.
01:53:52.000 You're not reading social cues.
01:53:54.000 All the things that make us human, those are all thrown out the window.
01:53:56.000 You're just looking at text, and it might be coming from Russia.
01:54:01.000 The text that you're getting upset at and responding to, I mean, per your research, really, there's a direct possibility it's not even a fucking person.
01:54:10.000 Or a person, but not really representing their actual thoughts, just trying to push your buttons.
01:54:15.000 Yeah, no, that's true.
01:54:16.000 It's not even Russian trolls.
01:54:19.000 It's just the amount of...
01:54:22.000 I think it really is...
01:54:25.000 We're in a unique time where it's hard to know who you're engaging with.
01:54:30.000 It's hard to gauge whether it's good faith.
01:54:33.000 I mean, I react sometimes where I'll see a response and go click into the person's feed to try to decide if I should take this as a seriously good faith inquiry or if it's like a kind of vaguely cloaked fuck you, you know?
01:54:46.000 Right, right, right.
01:54:47.000 You don't have to do that when you're in person.
01:54:49.000 It's a very different experience.
01:54:51.000 Exactly.
01:54:51.000 Yeah, I have a friend of mine who's a young single male, and he was going through his direct messages.
01:54:58.000 And these girls were sending him all these naked photos and videos.
01:55:03.000 And he's like, look at this, man.
01:55:04.000 I go, let me see your phone.
01:55:05.000 And I said, let me click on that link.
01:55:07.000 I go, she has one picture on her page.
01:55:09.000 You fucking dummy.
01:55:11.000 This is probably not even a person.
01:55:13.000 Like, who knows what?
01:55:14.000 This is someone from Nigeria that's trying to get your credit card information.
01:55:17.000 Yeah, catfishing.
01:55:18.000 Yeah.
01:55:19.000 Yeah, I mean, they're trying to get you.
01:55:20.000 And he's like, oh yeah.
01:55:21.000 I go, how did you not go to her page?
01:55:23.000 Because I just, you know, I thought it was a girl sending me naked pictures.
01:55:28.000 You know, they just do that sometimes.
01:55:29.000 Well, they definitely do do that sometimes.
01:55:32.000 I mean, people are weird.
01:55:33.000 They do all kinds of weird things.
01:55:34.000 But, you know, there's a lot of these fake accounts, and I don't know what they're trying to do.
01:55:39.000 They're trying to get money from people?
01:55:40.000 Yeah.
01:55:41.000 That does come up.
01:55:42.000 I've seen every now and then, if you follow reporters on Twitter, particularly ones who have open DMs, they'll periodically post the insane catfishing stuff that they get where it's all about money.
01:55:54.000 Yes.
01:55:55.000 I don't know.
01:55:56.000 God bless people who have open DMs.
01:55:58.000 I don't know how they do it.
01:56:01.000 It makes me glad I'm not a journalist.
01:56:03.000 Yeah, I don't get it.
01:56:05.000 So overall...
01:56:07.000 Are you happy that you got into all this?
01:56:10.000 Does this change your perceptions of online communication?
01:56:16.000 I feel like there's...
01:56:18.000 I have tried over the years, whether it's conspiracy theorist communities or terrorists or Russia, Iran, the state-sponsored actors, the domestic ideologues, I have tried to always say,
01:56:33.000 here is the specific...
01:56:36.000 Kind of forensic analysis of this particular operation and then here is what we can maybe take from it and make changes.
01:56:45.000 We've seen some of that begin to take shape and so I feel grateful to have had the opportunity to work towards connecting those dots and work towards having this conversation.
01:56:59.000 Meaning helping people understand what's going on.
01:57:02.000 I think I am not... I am most concerned about the...
01:57:12.000 As this gets increasingly easy to do, through things like chatbots... You've seen the website thispersondoesnotexist.com?
01:57:25.000 No.
01:57:25.000 So there's a machine learning technique called Generative Adversarial Networks, and in this particular application they're working to create pictures of people, faces of people.
01:57:39.000 And so this website is...
01:57:40.000 When you go to it, it pulls up a...
01:57:42.000 Yeah, there you go.
01:57:43.000 So this person does not actually exist.
01:57:44.000 This is not a real photo.
01:57:45.000 That's a fake human?
01:57:46.000 Yeah, and so these are all...
01:57:47.000 Whoa.
01:57:48.000 These are computer-generated images?
01:57:49.000 These are computer-generated, yep.
01:57:51.000 Oh, God.
01:57:51.000 So you can see it was created by a GAN. It says it down at the bottom there.
01:57:54.000 So these are not real people.
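[For the curious, "adversarial" here means two networks trained against each other: a generator that fakes samples from random noise, and a discriminator that tries to tell fake from real. Below is a minimal sketch of that training loop in PyTorch, shrunk to toy 2-D data instead of faces; all layer sizes and hyperparameters are illustrative assumptions, not anything from a production face generator.]

```python
# Minimal GAN sketch (PyTorch) on toy 2-D data. Illustrative only.
import torch
import torch.nn as nn

latent_dim, data_dim, batch = 16, 2, 64

# Generator: random noise in, fake samples out.
G = nn.Sequential(nn.Linear(latent_dim, 32), nn.ReLU(), nn.Linear(32, data_dim))
# Discriminator: sample in, probability-of-real out.
D = nn.Sequential(nn.Linear(data_dim, 32), nn.ReLU(), nn.Linear(32, 1), nn.Sigmoid())

g_opt = torch.optim.Adam(G.parameters(), lr=1e-3)
d_opt = torch.optim.Adam(D.parameters(), lr=1e-3)
loss = nn.BCELoss()
ones, zeros = torch.ones(batch, 1), torch.zeros(batch, 1)

for step in range(2000):
    real = torch.randn(batch, data_dim) * 0.5 + 2.0  # stand-in "real" data
    fake = G(torch.randn(batch, latent_dim))

    # Train D to label real samples 1 and generated samples 0.
    d_loss = loss(D(real), ones) + loss(D(fake.detach()), zeros)
    d_opt.zero_grad()
    d_loss.backward()
    d_opt.step()

    # Train G to make D call its fakes real.
    g_loss = loss(D(fake), ones)
    g_opt.zero_grad()
    g_loss.backward()
    g_opt.step()
```

[Scaled up enormously, the same adversarial loop is what produces photorealistic faces like those on thispersondoesnotexist.com.]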
01:57:55.000 And so we have increasingly sophisticated chat technology.
01:57:59.000 We have increasingly sophisticated image generation, so you're not going to detect that image somewhere else.
01:58:03.000 That old trick of, like, right-clicking an image and searching for it to see if you're talking to someone with a stock photo.
01:58:07.000 That goes right out the window as stuff like this gets easier and easier to do.
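[For context, the "old trick" amounts to near-duplicate image matching, roughly as in this sketch. The imagehash library is real; the function name and distance threshold are illustrative assumptions. A GAN face defeats it because each face is sampled fresh from noise, so there is no original photo anywhere to match.]

```python
# Rough sketch of the reverse-image-search idea: perceptually hash a
# suspect profile photo and compare it against known stock photos.
from PIL import Image
import imagehash

def looks_like_stock_photo(suspect_path, stock_paths, max_distance=5):
    """Return True if the suspect image is a near-duplicate of any known one."""
    suspect = imagehash.phash(Image.open(suspect_path))
    # Subtracting two phashes gives a Hamming distance; small = near-duplicate.
    return any(suspect - imagehash.phash(Image.open(p)) <= max_distance
               for p in stock_paths)

# A freshly GAN-generated face has no source photo, so this returns False.
```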
01:58:11.000 Well then deep faking, right?
01:58:12.000 Yeah, the deep fakes on the video front.
01:58:14.000 I think that it does change.
01:58:17.000 I think we haven't quite adapted to...
01:58:20.000 What is it like to live in a world where so much of the internet is fake?
01:58:24.000 And I do think, per your point about identity, that there will be groups of people that self-select into communities where identity is mandatory, you know, where this is who you are and you have some sort of verification versus people who choose to live in the world of, you know,
01:58:40.000 drink from the fire hose, take it all in and try to filter it out yourself.
01:58:45.000 So we look at these evolving technologies and I don't necessarily feel, you know, particularly optimistic in the short term.
01:58:55.000 I think that ultimately it does, like we change as a society to a large extent in response to this.
01:59:01.000 We think about, you know, there are going to be some fixes that the platforms are going to be able to undertake.
01:59:08.000 We're going to get better at detecting this stuff.
01:59:11.000 Maybe, you know, the adversary will evolve.
01:59:12.000 Hopefully we get better at detecting it as it evolves.
01:59:15.000 But I think we fundamentally ultimately change.
01:59:21.000 People become more aware that this is a thing.
01:59:22.000 They are more skeptical.
01:59:24.000 That does change our ways of interacting with each other.
01:59:28.000 But I feel like that is going to be the direction that this goes.
01:59:34.000 The thing that keeps me up at night would be more the...
01:59:40.000 The ease of turning this from a social media problem into a real-world war problem.
01:59:46.000 Meaning, as an example, back in 2014, one of the first things the Internet Research Agency did, September 11, 2014, they created a hoax saying that ISIS had attacked a chemical plant down in Louisiana.
02:00:01.000 It's called the Columbian Chemicals plant hoax.
02:00:05.000 There's a Wikipedia article about it now.
02:00:08.000 But what happened was they created a collection of websites.
02:00:12.000 They created fake CNN mockups, Twitter accounts, text messages that went to local people, radio station call-ins, you name it, everything, to create the impression that a chemical factory had just exploded in Louisiana and there was some attribution to ISIS. And this was done on September 11th.
02:00:30.000 So this is the kind of thing where this actually did go viral.
02:00:33.000 Like, I remember this happening not as a social media researcher.
02:00:35.000 I just remember it actually being pushed into my social media feed.
02:00:39.000 So you have these, and we didn't know that it was the Internet Research Agency for a year and a half after that.
02:00:45.000 But this is the kind of thing where you look at parts of the world that aren't the US, like the recent drama between India and Pakistan, and you can see how these kinds of things can go horribly, horribly wrong if the wrong person is convinced that something has happened or if this leads to a riot or if this leads to real-world action.
02:01:07.000 I think that's one of the main fears: as this gets better and better, the video fakes get better, the people fakes get better, what do you do then?
02:01:20.000 Yeah, what do you do then?
02:01:22.000 When you see those images, those fake images, those are stunning.
02:01:26.000 They're so good.
02:01:28.000 I mean, it just makes you wonder.
02:01:29.000 I mean, we're going to get to a point where if someone's not in front of you talking, you're going to look at a video, you're not going to have any idea.
02:01:36.000 You know, they're doing those deep fakes with famous actresses' faces and they put them in porn films.
02:01:42.000 And it's stunningly good.
02:01:44.000 I mean, it's amazing.
02:01:45.000 And they're also...
02:01:48.000 With someone like me, I'm fucking doomed because there's thousands of hours of me talking.
02:01:53.000 I've said everything.
02:01:54.000 So you could take this new...
02:01:57.000 There's these new programs that are editing audio and you could splice together audio and video now.
02:02:03.000 They don't even splice it.
02:02:04.000 The computer will generate it.
02:02:06.000 It's insane.
02:02:07.000 It generates your lip movements, everything.
02:02:09.000 I mean, it's really stunning.
02:02:11.000 It's really stunning, and it's only going to get crazier and crazier.
02:02:14.000 And unless something is actually happening right in front of your face, it's going to be very difficult to differentiate.
02:02:21.000 And then I'm more worried about augmented reality and virtual reality and this stuff making its way into it.
02:02:27.000 I mean, we're going to dive willingly with a big smile on our face into the Matrix.
02:02:31.000 Yeah.
02:02:32.000 Have you watched that recently?
02:02:33.000 No, I haven't.
02:02:34.000 I watched that on a plane like a month ago or something.
02:02:37.000 It holds up so well.
02:02:38.000 Does it?
02:02:38.000 No, it holds up.
02:02:39.000 Really?
02:02:39.000 It holds up.
02:02:40.000 Like, it's insane.
02:02:41.000 Except for the phone booths.
02:02:43.000 It's the one thing where, like, there's no phone booths anymore.
02:02:47.000 But everything else is...
02:02:48.000 Wow.
02:02:49.000 Yeah, it's...
02:02:50.000 I don't know.
02:02:51.000 I guess you've seen it recently.
02:02:52.000 I just bought it on 4K because, like, I forgot.
02:02:55.000 In 99, there wasn't...
02:02:57.000 There was barely HD back then.
02:02:58.000 I just wanted to see what it was like.
02:02:59.000 Rewatched it.
02:03:00.000 Forgot how good it was.
02:03:02.000 Two and a half hours flew by.
02:03:04.000 The whole movie just went by.
02:03:05.000 It's amazing how that seemed preposterous in 99 or whatever it was.
02:03:13.000 Like, oh, this is just sci-fi.
02:03:14.000 And now you're like, hey, this is a little closer.
02:03:19.000 The idea of...
02:03:21.000 Have you messed around at all with HTC Vive or Oculus?
02:03:25.000 No.
02:03:25.000 I tried... I was a VC briefly, back five, six years ago now.
02:03:32.000 And I tried one from, I think it was USC, right?
02:03:35.000 Southern California has a bunch of really good labs down here.
02:03:38.000 And I tried one where it was like a zombie holodeck simulator.
02:03:41.000 Mm-hmm.
02:03:42.000 And it wasn't just the VR. It was immersive, so they had a backpack on me.
02:03:48.000 And it was actually scary as hell.
02:03:51.000 I was like, this is really good.
02:03:54.000 I love first-person shooters.
02:03:55.000 I think they're so much fun.
02:03:56.000 But this was just the first time where in the game you have a bat or something, and you're trying to beat back zombies with a bat, and they're all up in your face.
02:04:06.000 I don't know if that thing ever came to market, but damn, was it good.
02:04:08.000 Yeah.
02:04:09.000 They have a company called...
02:04:11.000 I think it's called The Void now.
02:04:12.000 And they have this Wreck-It Ralph one.
02:04:15.000 And I did it with my kids recently.
02:04:17.000 And you put on a haptic feedback vest.
02:04:19.000 And you go through this environment.
02:04:22.000 And it's crazy.
02:04:23.000 I mean, it's very clear that you're in a video game.
02:04:26.000 It doesn't seem real.
02:04:27.000 But it is so much better than anything that existed five years ago.
02:04:31.000 And you go, okay, well, what is...
02:04:33.000 With the exponential increase in power of technology, what is this going to be like in ten years?
02:04:37.000 What's it going to be like in 15?
02:04:39.000 It's going to be impossible to differentiate because now it's a vest.
02:04:42.000 It's just a vest.
02:04:43.000 You're not strapped into a chair.
02:04:46.000 You can move around.
02:04:47.000 So you're going through this whole warehouse that they have set up for these games.
02:04:50.000 You're even picking up physical objects and they look different in your hands than they do when you look at them without the headgear on.
02:04:57.000 Yeah, there's centers for that.
02:04:58.000 I know in Vegas they have them.
02:05:00.000 They have one outside of Disneyland.
02:05:02.000 It's in downtown Disney.
02:05:05.000 It's called The Void.
02:05:06.000 And then you go into these...
02:05:08.000 There's one Star Wars one.
02:05:09.000 There's a Wreck-It Ralph one.
02:05:11.000 Now they have them in malls, small ones.
02:05:14.000 You sit in these little eggs and you go on rollercoaster rides and you fight off zombies and go into a haunted house.
02:05:20.000 It's getting weird.
02:05:24.000 I was a kid when video games were these ridiculous Atari things where you stuck a cartridge in and you're playing Pong.
02:05:32.000 And now we're looking at something that you're looking at these images of people.
02:05:36.000 You see pores.
02:05:38.000 You see the glistening of their lips.
02:05:40.000 You see their eyes.
02:05:42.000 It's very strange.
02:05:44.000 It's very strange to think that those are not really people and that we are probably going to look at...
02:05:48.000 All you have to do is create something so large, and propaganda so terrifying, that it causes you to act.
02:05:59.000 Without double-checking, triple-checking, and verifying that something's really happened.
02:06:03.000 And then it sets into motion some physical act in the real world that you can't pull back.
02:06:09.000 And this is not necessarily related, but it's what happened with Hawaii when they got that false warning that nuclear missiles were headed their way.
02:06:20.000 Yeah, that was a...
02:06:21.000 Can you imagine?
02:06:22.000 It was crazy!
02:06:23.000 And it was just someone hit the wrong button.
02:06:25.000 I mean, if we come to some sort of a point in time where someone does something like that on purpose and shows you video that you really think New York City just got nuked and, you know, you have to head to the hills and there's a giant traffic jam on the highway and people start shooting each other.
02:06:44.000 I mean...
02:06:45.000 If Russia really wants to fuck with us, what they're doing now with just the Internet Research Agency and all of these different trolls that they've got set up, sort of trying to get people to be in conflict with each other,
02:07:01.000 this is with primitive, crude text and memes.
02:07:08.000 What could be done in the future?
02:07:10.000 It's terrifying.
02:07:13.000 We live in a weird world.
02:07:14.000 Yep.
02:07:17.000 We're in agreement.
02:07:18.000 Yep.
02:07:19.000 Let's end it there.
02:07:20.000 Alright.
02:07:20.000 Thank you so much.
02:07:21.000 I really appreciate it.
02:07:22.000 Thanks for coming down.
02:07:23.000 Thanks for having me.
02:07:24.000 And thank you for all your work and exposing all this stuff.
02:07:26.000 It's really, really interesting and terrifying.
02:07:28.000 Tell people how they can get a hold of you so they can troll you on Twitter.
02:07:32.000 My handle's @noupside.
02:07:34.000 Noupside?
02:07:35.000 Yep.
02:07:35.000 And you don't necessarily have an Instagram?
02:07:37.000 It's more like family.
02:07:39.000 My Instagram's my kids.
02:07:40.000 Okay.
02:07:40.000 Beautiful.
02:07:41.000 All right.
02:07:41.000 Thank you, Renee.
02:07:42.000 I really appreciate it.
02:07:42.000 Thank you.
02:07:43.000 Bye, everybody.