#378 — Digital Delusions
Episode Stats
Words per Minute
177.12321
Summary
In this episode of the Making Sense Podcast, Sam Harris opens with some thoughts on the news: Trump's appearance before the National Association of Black Journalists, Kamala Harris's response to it, and Sam's recent Substack piece arguing that Harris needs to pivot to the center of the Democratic Party ahead of the 2024 election. He then speaks with Renee DiResta, a researcher who studies influence and propaganda and who, until June of this year, was the technical research manager at the Stanford Internet Observatory, where her work examined rumors, state-sponsored influence campaigns, voting-related misinformation, health misinformation, and conspiracy theories. She has been widely published in The Atlantic, Wired, Foreign Affairs, The New York Times, The Washington Post, and Politico, and is the author of the new book "Invisible Rulers: The People Who Turn Lies Into Reality." They discuss the state of our information landscape, the difference between influence and propaganda, shifts in communication technology, influencers and closed communities, the asymmetry of passion online and the illusion of consensus, the unwillingness to criticize one's own side, audience capture, what we should have learned from the COVID pandemic, what is unique about vaccines, Renee's work at the Stanford Internet Observatory, her experience of being smeared by Michael Shellenberger and Matt Taibbi, Elon Musk and the Twitter Files, the false analogy of social media as a digital public square, the imagined "censorship industrial complex," the 2024 presidential election, and other topics.
Transcript
00:00:00.000
Welcome to the Making Sense Podcast. This is Sam Harris. Just a note to say that if
00:00:11.640
you're hearing this, you're not currently on our subscriber feed, and you'll only be
00:00:15.580
hearing the first part of this conversation. In order to access full episodes of the Making
00:00:19.840
Sense Podcast, you'll need to subscribe at samharris.org. There you'll also find our
00:00:24.960
scholarship program, where we offer free accounts to anyone who can't afford one.
00:00:28.340
We don't run ads on the podcast, and therefore it's made possible entirely through the support
00:00:32.860
of our subscribers. So if you enjoy what we're doing here, please consider becoming one.
00:00:45.320
Well, did you see Trump's appearance at the National Association of Black Journalists?
00:00:52.540
That was spectacular. As you probably know, it went off the rails at the first
00:00:58.080
question, which in Trump's defense, it was a very hard-hitting question. I didn't catch
00:01:05.240
the journalist's name. She was from ABC News, but she was great. And Trump performed like a robot
00:01:13.860
that had had its racial software updated somewhere around 1972, about the time that Archie Bunker
00:01:22.440
was the most famous character. His clueless and probably actual racism was just leaking out of
00:01:32.900
his pores in that context. And it was fascinating to watch. I would point out, however, that the man
00:01:39.500
spoke with reasonable fluidity, despite the absolutely bizarre hand gestures. He was a very different
00:01:47.100
near-octogenarian than President Biden. We should be very grateful that Biden is no longer in the
00:01:54.620
race. And from what I've seen, Vice President Harris has responded, well, I'll remind you, this was an
00:02:02.820
event where, in front of, I assume, an exclusively Black audience, Trump questioned whether the Vice
00:02:10.040
President was actually Black, and in fact claimed that she had only just turned Black, having previously
00:02:16.400
identified as an Indian her entire life. Of course, none of that's true, but true or not, it was an
00:02:23.300
amazing thing to allege in that context. Anyway, it seems like Harris has responded well to this by
00:02:29.960
just letting surrogates respond. She has just laughed it off, which I think is the right move.
00:02:37.580
I published a piece on Substack yesterday, talking about how I think Harris should pivot to the
00:02:45.400
center. I really do think this is necessary. She's just trailing so much video and audio where she,
00:02:53.380
in the 2020 campaign, played connect-the-dots with bits of woke sanctimony and delusion. She has to
00:03:01.600
perform an exorcism on that stuff. If in an interview or debate she gets led back onto that terrain,
00:03:07.560
and is asked about, you know, defunding the police or the new gender identity law in California,
00:03:15.520
what she thinks about the epidemic of teenage girls who apparently want double mastectomies
00:03:20.360
so that they can transition, unless she can show that she has her head screwed on straight amid those
00:03:27.680
kinds of topics, there is just a nuclear bomb waiting to detonate for her at the center of Democratic
00:03:35.040
politics. And I just don't think she's going to be able to ignore it. It'd be great if she could
00:03:39.980
just talk about Trump's corruption and reproductive rights and gun control and uniting the country.
00:03:48.760
But unless she finds a path through the minefield that was patiently laid by progressive fanatics on the
00:03:58.220
far left of the Democratic Party that is sane and appears honest, it is just a disaster waiting to
00:04:05.620
happen. So anyway, in this piece on Substack, I argue that it would be very easy to pivot here,
00:04:11.780
and there's not much to explain, right? It does not need to seem like hypocrisy.
00:04:17.660
And I even scripted how I think she could do that. For better or worse?
00:04:21.580
Okay. And now for today's podcast. Today I'm speaking with Renee DiResta. Renee was the technical
00:04:30.880
research manager at the Stanford Internet Observatory, and she generally focuses on the
00:04:35.560
abuse of information technologies. Her work examines rumors and propaganda, and she's analyzed
00:04:42.180
geopolitical campaigns created by foreign powers, such as Russia, China, and Iran. She worries about
00:04:48.920
voting-related rumors and the integrity of our elections, health misinformation, and conspiracy
00:04:55.060
theories. And she has been widely published in The Atlantic, Wired, Foreign Affairs, The New York
00:05:01.500
Times, The Washington Post, Politico, and elsewhere. And she also has a new book, Invisible Rulers,
00:05:09.820
The People Who Turn Lies Into Reality. And we talk about the book. We discuss the state of our
00:05:16.080
information landscape, the difference between influence and propaganda, shifts in communication
00:05:22.100
technology, influencers and closed communities, the asymmetry of passion we see online, and the
00:05:29.060
illusion of consensus, the troublesome unwillingness to criticize one's own side, audience capture,
00:05:36.140
what we should have learned from the COVID pandemic, what is unique about vaccines,
00:05:40.900
Renee's work at the Stanford Internet Observatory, her experience of being smeared by Michael
00:05:46.960
Shellenberger and Matt Taibbi, Elon and the Twitter Files, the false analogy of social media as a digital
00:05:54.520
public square, the imagined censorship industrial complex, the 2024 presidential election, and other
00:06:02.840
topics. And now I bring you Renee DiResta. I am here with Renee DiResta. Renee, thanks for joining me
00:06:15.660
again. Thanks for having me. So you've written a new book, which speaks to the insanity of the moment,
00:06:22.660
which is obviously quite important. That book is Invisible Rulers, The People Who Turn Lies Into Reality.
00:06:30.820
There are more than a few of these people, but we'll discuss a few of them. And maybe you've been
00:06:36.480
on the podcast before and we've touched some of these topics. I think the most recently was about
00:06:42.280
a year, year and a half ago, something like that. Yep. But maybe remind people of your areas of focus
00:06:49.900
that have culminated in this book. Yeah, I study influence and propaganda. And for about five years,
00:06:56.160
up until June of this year, I was the technical research manager at Stanford Internet Observatory,
00:07:00.820
where we study adversarial abuse online. Right, right. And we'll talk about the controversy around
00:07:07.000
the Internet Observatory. But let's just focus on the big picture to start here. One thing we think
00:07:15.620
about and talk about under various guises now is propaganda. But propaganda is a bad word for a certain
00:07:24.140
kind of influence and persuasion. Obviously, there are benign and even beneficial forms of influence
00:07:32.320
and persuasion. How do you differentiate the problem from the unavoidable and even good variants of just
00:07:40.380
the spread of information online? Yeah, I think propaganda didn't used to be pejorative, right? So
00:07:45.960
prior to the 1940s, it was just a word that meant the kind of desire to propagate information or the
00:07:52.840
need, rather, the need to propagate information comes from a word used by the Catholic Church.
00:07:57.600
After World War II, it becomes that kind of bad information that those people over there do,
00:08:03.000
right? So it becomes the sort of information that your adversary puts out into the ether to manipulate
00:08:07.080
people. And it becomes, you know, it takes on that particular tone. So I think roughly speaking,
00:08:13.080
you could define it as the systematic and deliberate effort to influence the attitudes and beliefs and
00:08:20.560
behaviors of a target audience, but in a way that often involves biased or misleading info to promote a
00:08:26.140
particular agenda. So information with an agenda, and oftentimes that agenda is unclear. So it's a lot of the
00:08:32.760
time it's differentiated from persuasion in that persuasion is seen as making an emotional appeal,
00:08:38.600
but doing it more ethically. Persuasion kind of respects the autonomy of the audience. It doesn't
00:08:44.000
necessarily aim to manipulate them. It isn't using fakery. It isn't selectively leaving out
00:08:51.400
significant pieces of information. It's always been, I think, a fuzzy term and one that people kind of
00:08:59.320
quibble around. So in the book, I really tried to differentiate it in part by this very active,
00:09:04.780
systemic effort to shape public opinion as opposed to something that is more organic.
00:09:11.960
And how have our online lives and the various platforms and tools changed this problem? I mean,
00:09:19.780
in your book, you go back to various cases, you know, a hundred years ago and beyond. People have
00:09:27.760
drawn analogies to the printing press and you talk about the alarming case of Father Coughlin,
00:09:35.040
which is now almost a hundred years old. What has changed for us?
00:09:40.380
So in any new media ecosystem, anytime there's a shift in technology, communication technology,
00:09:47.320
you have new means by which you can reach large audiences. Social media was pretty distinct in that
00:09:53.440
you could reach very, very targeted audiences. So it really enabled propaganda to go from something
00:09:58.920
that was seen as a function of mass media, right? Manipulating large numbers of people or creating
00:10:05.320
a national narrative to something that became very niche. So I think that's particularly different.
00:10:11.720
You could point back to maybe the era of the printing press and the pamphleteering wars where there were
00:10:16.200
niches then, but there was a significant trend over time towards the mass, the mass media,
00:10:21.460
the mass narrative. And now we've reverted again to the niche.
00:10:24.680
Just to focus on that for a second. So what is the significance of it being a niche is that
00:10:30.060
one, your targeting can be that much more effective and, you know, bespoke, but there's also this
00:10:37.500
feature that the rest of the world can't really see what the niche is seeing, right?
00:10:43.480
Right, exactly. So the messages, the memes, the things that resonate really come up
00:10:51.460
not only in media, but in closed groups. So a lot of the time, one of the things I talk about in the
00:10:57.380
book is that there's always been this perception that media messages reach the public and that is
00:11:04.000
how public opinion is shaped. And that's actually not really true, right? For a very, very long time
00:11:09.040
since the 1940s, we've had these research studies that show that the media reaches certain groups of
00:11:14.940
people. And then those people, they're called opinion leaders, really sort of socialize and
00:11:21.240
communicate with people who are like them in trusted communities. And that's how opinion is
00:11:26.560
formed. So it's sort of moderated through these opinion leaders. This is called the two-step flow
00:11:32.460
theory of communication, right? And sort of communication theory. And so there's that piece
00:11:37.820
where it's not just that you've seen the thing somewhere. It's that your community accepts it.
00:11:43.280
Your community is talking about it. You are talking about it with them. So the interesting piece about
00:11:47.660
social media niches is that you have both the fragmentation of media, but you also have these
00:11:53.320
closed communities that are global yet simultaneously niche, right? So there are people from all over the
00:11:58.740
world in them. It's not people you know in real life. You are brought together because you share some
00:12:03.420
sort of alignment. And then in those closed communities, you talk a lot about the media that
00:12:07.740
you see in your ecosystem. So, you know, almost like a double niche, if you will, right? You have
00:12:12.400
the niche communicators, the niche influencers, the niche content producers, and you also have out of
00:12:19.160
the field of view or out of the broader, you know, zeitgeist, if you will, you're kind of discussing
00:12:25.740
the things that you see in the niche within the niche. And so that structure of social media is
00:12:31.100
interesting, both from a content perspective, but also from like how we form opinions and who we
00:12:35.280
talk about things with. And then there's this effect where the niche suddenly becomes a perceived
00:12:41.540
majority, even when in fact it remains a minority. You have this, this is something you discuss in the
00:12:46.700
book in various places, this asymmetry of passion, which masquerades as consensus. And maybe you can
00:12:55.380
say more about that. Yeah. So one of the things that I guess another, another thing that's
00:13:00.180
fundamentally different now is that you have a very participatory environment, right? So we all
00:13:07.400
can go out there and shape public opinion through our own posts and our own means of contribution.
00:13:14.360
And one of the things that you see, and it really starts, it starts to become very visible in 2015
00:13:19.080
on Twitter in particular, is that small groups of people who become very, very activated all decide,
00:13:25.260
okay, on this day, at this time, we're all going to talk about this thing, right? It's very,
00:13:28.980
very coordinated in the early days. There were even apps that were made to help you do this.
00:13:33.340
There was one called Thunderclap, where you could register your social media accounts and one kind
00:13:38.700
of activist organization could essentially co-opt your social media account to send out the same
00:13:43.880
message at the same time in hopes of triggering a trending algorithm, which makes it look like a
00:13:48.920
whole lot of people are talking about something all at the same time. That's not necessarily actually
00:13:54.160
a large number of people. It's just a large number of accounts. And one person can control
00:13:58.720
thousands of accounts potentially. So you have this interesting phenomenon that happens where the
00:14:03.240
perception of a majority opinion or a significant area of interest also becomes something that is a
00:14:10.340
function of, you know, algorithmic curation, surfacing content to people that makes it look like
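To make the coordination dynamic described here concrete, here is a minimal sketch (hypothetical numbers and a deliberately naive trend score, not any platform's actual algorithm) of how a Thunderclap-style burst, a few operators firing many registered accounts in the same minute, can outrank a much larger but unsynchronized conversation:

```python
# A naive "trending" score: count posts per topic in a short time window.
# A synchronized burst from a few operators controlling many accounts
# beats a far larger but diffuse organic conversation.
from collections import Counter

posts = []

# 50,000 real people discussing school funding, spread across the day
for i in range(50_000):
    posts.append({"account": f"person_{i}", "topic": "school_funding",
                  "minute": (i * 17) % 1440})

# 3 operators, each controlling 2,000 accounts, all firing at minute 600
for op in range(3):
    for j in range(2_000):
        posts.append({"account": f"op{op}_acct_{j}", "topic": "outrage_hashtag",
                      "minute": 600})

def trending(posts, minute, window=10):
    """Posts per topic in the last `window` minutes, most frequent first."""
    recent = [p for p in posts if minute - window < p["minute"] <= minute]
    return Counter(p["topic"] for p in recent).most_common()

# At minute 600 the synchronized hashtag (6,000 posts from 3 people's accounts)
# outranks the organic topic (a few hundred posts from 50,000 people).
print(trending(posts, minute=600))
```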
00:14:19.600
Yeah. And I guess there are variants of that where it doesn't have to be enabled by any
00:14:24.540
gaming of the technology. It's just the fact that the loudest people online are the voices that
00:14:31.580
you hear. And especially when they're unpleasant and they use kind of trolling tactics that
00:14:38.680
really not merely become more salient, but actually just actively silence and block the participation
00:14:45.920
of more moderate voices. So you just get this sense that everyone agrees. I mean, I would have to be a
00:14:51.740
moral monster to even have a divergent opinion on this topic because, you know, the opinions I'm seeing
00:14:58.000
online are so strident and the reputational damage done to anyone who traduces one of these,
00:15:05.880
these, you know, any part of this new orthodoxy, is so complete, you know, that it's just
00:15:12.060
you completely lose sight of the fact that most people on earth are, you're, you're, you're not
00:15:17.100
hearing from them at that moment online or anywhere else.
00:15:20.180
Right. And, and it, it becomes a sort of a reinforcing phenomenon. There's always been an element
00:15:26.700
of, you know, demonizing your opponents in propaganda, right? This is a, you know, anytime you see a
00:15:30.960
conflict, of course, the, uh, you know, referencing World War II, as we did earlier, the, uh, those
00:15:37.000
people over there, right? The Nazis, the, you know, pick your kind of adversary. There is that
00:15:42.180
phenomenon of demonization that becomes very effective, you know, sometimes, sometimes justified,
00:15:47.640
sometimes not. And what you see though, on, on social platforms is when you go through that
00:15:53.120
demonization process, you can essentially push people out of the conversation entirely. Because as you
00:15:59.400
note, you can make it seem like it's a social liability to have an opinion that aligns with
00:16:06.940
something that is, you know, that belongs to them over there. And so you see that the group gradually
00:16:13.480
becomes more and more insular, more and more strident. And people who are seen as deviating in some way
00:16:20.500
are, uh, either sort of, you know, they, they self-censor or they are pushed out. And so you do see
00:16:25.800
those groups become more homogenous. You see them become often more combative and that, you know,
00:16:31.900
that, that comes to create the phenomenon that we see today of like social media as the gladiatorial
00:16:37.500
arena, right? It's not the public square at all. You're not there to debate and dialogue and be in
00:16:42.120
communion with your neighbors. You're there to, you know, own and destroy your enemies. And you're
00:16:46.100
going to do that, you know, using some, some very particular types of tactics to intimidate them out
00:16:50.640
of the conversation or harass them out of the conversation or, you know, or, or just create an
00:16:55.080
environment where nobody in their right mind wants to participate when the cost of participation is,
00:17:02.040
Yeah. One other aspect of this, which drives me fairly crazy is the unwillingness to criticize
00:17:08.320
one's own side, right? I mean, that is just a clear filtering function, which increasingly
00:17:13.900
radicalizes the various sides of, of any issue. And it is the, you know, among perhaps a few other
00:17:20.380
variables that in my, in my mind, it is the thing that more or less guarantees a race to the
00:17:25.420
bottom. I mean, it's just, you get less and less honest and more and more strident and you're willing
00:17:32.120
to defame the other side by any means necessary. And it's just very quickly, no one is having an honest
00:17:39.240
conversation about real facts. Right. And I think what you see there is there's actually
00:17:45.400
incentives that drive people to do that at this point. And one of the things I tried to do with
00:17:50.460
the book was, you know, was, was talk about it in terms of like, what are the incentives that lead us
00:17:55.260
to this place? Like why there's, of course there's social psychology, there's crowd psychology, there's
00:17:59.920
human nature reasons why this happens. And I try to go into that, you know, Cass Sunstein's had a whole
00:18:05.540
body of literature on how crowds trend towards the most extreme over time. But one of the things
00:18:10.140
that happens on social media is the most extreme voices are rewarded, right? And they're rewarded
00:18:15.780
in terms of clout within the group, which has always been the case. That's the sort of social
00:18:19.480
component. But there's also an interesting financial component that's also very, very different today,
00:18:23.780
which is, not only is propaganda participatory, but it can be quite profitable. And so you have that
00:18:29.880
phenomenon where the influencer has to appeal to the niche and there's a finite amount of money that's
00:18:36.200
going to be spent in the form of, you know, patronage or sub stack subs or, you know, attention,
00:18:41.520
various forms of attention. And, you know, in terms of Twitter, it's rev share, you know,
00:18:46.480
depending if your tweet is seen by a lot of people, there's that potential for rev share.
00:18:50.340
If you're on YouTube, it's who gets the sponsorships, right? So that accruing some attention
00:18:54.940
becomes like a flywheel effect for getting more and becoming, you know, developing more profit from
00:19:00.060
it as well. And so when you're catering to a niche, this is the phenomenon you see with audience
00:19:04.480
capture, where the influencer becomes, you know, it sort of feeds the crowd, right? The crowd gets
00:19:10.300
more extreme, so does the influencer. And that phenomenon is happening in tandem. And so it does
00:19:15.760
become very much a race to the bottom, in part, you know, motivated by the ideological and social
00:19:20.620
reasons. But also there's a, you know, a profit component to it that I think is important to
00:19:24.740
highlight because it's really unique and different in this particular media environment. The idea,
00:19:29.780
the figure of the influencer really comes out of social media in a very distinct way.
00:19:37.020
Yeah, this is a phrase which I think Eric Weinstein coined, which many of us use now,
00:19:43.100
the phenomenon is, you know, of audience capture, where, you know, a creator or influencer begins to
00:19:49.540
notice signals in their audience that they cater to. And there's this effect of, you know,
00:19:55.840
where you see people, I mean, we will probably talk about some of them. You see people who've just
00:20:00.240
grown radicalized by the act of pandering to their own audience.
00:20:05.260
Well, I think one thing that happens is if you are not doing that pandering, somebody else will step
00:20:10.000
in and do it for you, right? And so if you are not selling those subs or, you know, doing those,
00:20:16.620
you know, capturing that attention in that moment, somebody is going to be there to do it. And so
00:20:21.120
you are going to lose in a sense. And there's, you know, kind of ego components to that. There's
00:20:27.040
financial components to that. You know, you do see a pretty short lifespan for people who become
00:20:33.720
influential in only one thing. And then when that thing ceases to become the thing of the moment,
00:20:38.500
they have to find ways to make themselves relevant. We saw this with, you know, people who became
00:20:43.860
highly influential during COVID as COVID has waned, right? As COVID is not the kind of all-encompassing
00:20:51.340
attention grabber that it was in 2020 or even 2021, you see them kind of pivoting and going off into
00:20:57.880
other adjacent kind of culture war areas where they can continue to monetize their audience,
00:21:04.360
engage with their audience, and remain relevant to the public conversation.
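The audience-capture loop described above can be caricatured in a few lines. This is only a toy model under assumed incentives (engagement, and therefore revenue share, rising with how extreme a take is within the niche), not a model from the book: a creator who simply keeps whichever of two takes performs better drifts toward the extreme.

```python
# Toy model of audience capture: the creator A/B tests a slightly more
# extreme take each week and keeps whichever earns more niche engagement.
import random

def engagement(extremity: float) -> float:
    """Assumed niche response: engagement rises with extremity, plus noise."""
    return min(extremity, 1.0) ** 2 + random.uniform(-0.02, 0.02)

extremity = 0.2                                # starts fairly moderate
for week in range(30):
    spicier = min(1.0, extremity + 0.05)       # a slightly more extreme take
    if engagement(spicier) > engagement(extremity):
        extremity = spicier                    # the spicier take "worked"

print(f"extremity after 30 weeks: {extremity:.2f}")   # typically near 1.0
```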
00:21:06.940
Yeah. Well, I want to get to your personal story and just, you know, all the drama that has been
00:21:14.160
kicked off in your life of late. Perhaps COVID is the bridge. And we can talk about COVID and perhaps
00:21:20.740
what you've learned from dealing with the anti-vax community, you know, long before COVID, because
00:21:26.100
that's really where you got inducted into this network phenomenon. As a student of computer science,
00:21:31.960
you had the tools to respond to what you were seeing there, but it was the first part of,
00:21:37.260
you know, conspiratorial culture you got blindsided by was just as a parent dealing with, you know,
00:21:44.080
vaccine requirements or lack thereof in your kid's school. But to take COVID for a second,
00:21:49.080
what do you think happened during COVID? What did we get wrong? I mean, you know, when the history of that
00:21:55.740
moment is written by sober people who have not been deranged by their own bad incentives,
00:22:03.580
what do you think the verdict will be of what happened to us and what should we have learned?
00:22:09.480
What should we not do again when you're talking about the attempt to influence societies, you know,
00:22:17.440
in good ways in response to a moving target of a global pandemic that we, you know, did not
00:22:24.880
understand in the beginning and did not understand in the middle, but understood differently and,
00:22:29.920
you know, understood differently again in the end. And, you know, the evolving scientific
00:22:35.040
conversation was at every point entangled with the political needs of the moment and just a wilderness
00:22:42.980
filled with bad actors and grifters and, you know, actual lunatics. What happened and what should
00:22:50.540
we have learned? You know, it's such a complicated question. I'm trying to think of how to break it
00:22:56.860
down. I think first, for me, the very first inkling we had that there was going to be a very,
00:23:05.040
very serious outbreak of something was in December of 2019 or so. And I was paying attention to it,
00:23:12.580
actually, because I was looking at state media narratives at the time. I was doing a bunch of
00:23:15.760
work on Chinese propaganda and Chinese influence operations over the years, and they began talking
00:23:20.380
about it. And their state media began focusing very heavily on their response, right? The incredible
00:23:26.820
propaganda operation that began out of Chinese state media about their response. And then,
00:23:31.140
interestingly, ways in which they had these sort of online influencers, they're sometimes called the
00:23:36.200
wolf warriors, these Twitter accounts that began to reach for conspiracy extremely early on,
00:23:41.980
right? Well, yes, people are saying that this came from China, but what if the U.S. military brought
00:23:46.600
it during the World Military Games in Wuhan a couple months earlier? Some people on the internet
00:23:50.920
are saying that this is actually a bioweapon that came out of Fort Detrick, right? And so I thought,
00:23:56.060
okay, this is going to be a thing, right? This is going to be a narrative battle. And this was before it,
00:24:01.060
you know, before it reached American shores and before it became politicized through the unique lens of
00:24:05.660
the American culture war. What was very interesting to me was that the anti-vaxxers were on it,
00:24:09.800
right? They were on it. They were like, this is, you know, they started making videos. This is
00:24:14.960
fantastic for us because if it does come to the U.S., they're going to rush out a vaccine and then
00:24:19.280
people are going to see how shoddy vaccine science really is, right? They're going to come and we're
00:24:25.400
going to convert them. And they really saw this as an incredible opportunity because they also didn't
00:24:30.220
believe that it was real, right? So it was simultaneously not real, but also a thing that was real that might
00:24:36.300
get a vaccine and then the world would realize how corrupt vaccines were.
00:24:39.800
One of the things you notice is, you know, like in quantum mechanics, right? Two completely
00:24:44.940
conflicting states can be true simultaneously until you have the observation. I think about
00:24:48.720
that a lot when I watch kind of conspiratorial narratives evolve. But what happens with COVID
00:24:52.860
is you have the anti-vaxxers and the people who are well-networked and very well, you know,
00:24:57.400
kind of well-connected early on that are preparing to mount a response before it even gets here.
00:25:01.700
And then you have, meanwhile, the public health authorities who I talk about in the book,
00:25:07.980
my dealings with them back in 2015, 2016, during some of the measles outbreaks, they do not
00:25:12.600
understand modern communication. They, you know, there's this phrase that I've never forgotten.
00:25:18.320
These are just some people online, right? And that was something that was, you know, sort of,
00:25:22.460
it sounds very patronizing what they meant by it. And it is very patronizing, of course.
00:25:26.760
But what they meant by it was that, yes, there were anti-vaxxers. Yes, they had large followings.
00:25:31.500
But ultimately, people would vaccinate because they trusted their doctors. They trusted their
00:25:35.780
experts. And it was, you know, in the toss-up between experts and just some people online,
00:25:41.320
they thought the experts would continue to win. And I did not, you know. And I thought, okay,
00:25:48.940
somebody at some point is going to be responsible for modernizing communication within this institution
00:25:53.700
or any other institution. And that turned out not to be the case because there was nothing that was
00:25:58.080
really urgent, right, that really would galvanize them into recognizing what they had to do until all
00:26:04.520
of a sudden it was in front of them and live and they could not cope. And one of the ways that you
00:26:11.060
saw this play out very early on was in the conversation about masks, where you had influential
00:26:17.220
people with large followings on social media. You know, Balaji comes to mind saying, hey, this is a big
00:26:22.880
deal. People should be wearing masks. People shouldn't be traveling, even as you don't see
00:26:27.780
the health institutions kind of coming down on that side. They're reticent. They're not communicating.
00:26:32.720
So you have a situation where there is an information void and it's being filled by people
00:26:37.780
who, in this particular case, turn out, you know, to be, we thought, correct. Now the anti-maskers are
00:26:43.520
arguing that they were never correct. But you see this incomplete information and nobody knows what is
00:26:49.440
true. But in the meantime, the health officials are not speaking. The other people are. And so when
00:26:55.120
they finally do come out and say, yes, you should be wearing masks, they appear to be leading from
00:26:59.220
behind. So they take kind of a credibility hit. And one of the things that you see is scientists who
00:27:04.420
are waiting until they're absolutely sure of something to come out with commentary. Even as the
00:27:09.520
conversation is moving, the public is forming opinions, the influencers are driving the narratives,
00:27:15.080
and the health officials are still very much sitting on the sidelines. So that's one phenomenon.
00:27:20.540
But then the other thing that you see is it quickly becomes politicized, right? This is an election
00:27:25.320
year after all. But it's also, it, you know, the anti-vaccine movement did move from being kind of
00:27:32.960
crunchy lefty crazies in the, you know, Jenny McCarthy greener vaccines era to being much more of the,
00:27:39.240
you know, the sort of right-wing conspiracy theorist. That shift starts to happen around 2017.
00:27:43.440
And so it becomes an identity. And once it becomes an identity, you have influencers who politicize the
00:27:51.660
vaccine, who politicize the treatments, and everything becomes adversarial. You have to be
00:27:57.300
communicating about how evil and terrible the other side is. And that becomes the, the sort of
00:28:03.620
dominant means of engaging. There is always some form of aggrievement. There is always some complaint.
00:28:09.560
And so you have both real institutional problems, real institutional shortfalls, and then engineered
00:28:17.280
and exacerbated anger and aggrievement at institutions because it is profitable and attention,
00:28:24.680
you know, provides attention to the people who become the contrarians who are offering an alternative
00:28:30.720
point of view. They begin to develop large followings. And then they double down by constantly
00:28:36.300
implying that anything that comes out of an institutional communication is somehow compromised.
00:28:41.680
And moreover, any effort to surface good information and downrank bad information is some sort of horrific
00:28:47.700
act of censorship. So that becomes part of the, part of the discourse around that time as well.
00:28:52.460
Yeah, that's a, just a larger point we may come back to, but I mean, perhaps we should just touch it
00:28:57.700
briefly now. We're going to talk a lot about the reaction to perceived acts of censorship, but
00:29:05.500
one, one almost never hears the people who are most exercised about this issue entertain the question,
00:29:13.720
should the government or should institutions try to amplify any message preferentially under any
00:29:23.740
conditions, right? I mean, here you have the condition of a global pandemic in which it is believed that
00:29:29.940
millions will die, very likely, if we don't get our act together, or many more millions will die than
00:29:37.160
will, in fact, will die unnecessarily if we don't get our act together. And, you know, the tacit
00:29:43.900
assumption of all of these people for whom the Twitter Files is the biggest story of the decade
00:29:49.280
is that any attempt to steer the conversation, any attempt to flag misinformation, any attempt to
00:30:00.660
amplify genuine signal and deprecate actual noise is sinister, right? No one can be trusted to do that.
00:30:10.740
And that is, I mean, I think if you take 30 seconds to think about that, you know, that as an algorithm
00:30:16.220
for dealing with any possible global emergency, that is just about the stupidest thing anyone has ever
00:30:22.920
thought, right? So, so then what are we arguing about? We're arguing about specific cases of influence
00:30:29.120
and whether or not they were ethically deployed, whether they were in fact, you know, based on
00:30:34.240
facts or misunderstandings. But it's a little bit like the claim to free speech absolutism online of
00:30:41.160
which, you know, no, no one with a straight face can actually defend it when you look at the fact that,
00:30:45.900
you know, even a place like 4chan has to have a moderation policy.
00:30:49.360
Right. Alex Jones, yes, has one of the best. Yeah.
00:30:52.740
Yeah. You point that out in your book, Infowars has, I think you, you, you quote their terms of
00:30:56.660
service, which are, you know, as, you know, seemingly normal as any other terms of service.
00:31:00.700
Right. Well, I mean, it's, you know, you, there has to be some kind of, you know, guardrails and,
00:31:06.980
and one of the ways that that manifests, sometimes it's about, you know, harassment and that sort of
00:31:10.880
thing. There were, one of the things that was interesting about COVID was the rapidly evolving
00:31:16.980
policies. And this is where you do see the platforms recognizing that, hey, this stuff is
00:31:22.700
all going to be going viral on our site and we're going to have to, we're going to have to think
00:31:27.660
about that. And it is treated in some ways, I think, by people who hadn't been following the
00:31:33.640
conversation. It's treated as like a novel thing that just emerges with COVID, but it's actually not.
00:31:39.140
And one of the reasons why in the book, I try to draw the through line is that there had been,
00:31:44.020
for example, a measles outbreak in Samoa; there had been a measles outbreak in Brooklyn.
00:31:49.820
The one in Samoa killed about 85 kids. The one in Brooklyn hospitalized several hundred kids.
00:31:55.200
And what you see is the platforms beginning to come up with policies for trying to up-level good
00:32:00.940
health information very early on, right? It's not something that comes up during COVID. They build on
00:32:05.340
the policies that they've pulled together for these other outbreaks and these other situations.
00:32:09.140
And what they try to do is they try to amplify good information. And one of the things that's
00:32:13.140
interesting about that, and I talk about in the book, having conversations with them about this,
00:32:17.900
this was where the freedom of speech, not freedom of reach kind of model of thinking comes into play,
00:32:22.960
which is, you know, you allow these groups to stay on the platform. They're not taken down.
00:32:27.240
But what you see the platforms do around these other outbreaks is they stop accepting ad dollars,
00:32:32.160
right? They stop putting anti-vaccine targeting in their, you know, sort of list of targetable
00:32:38.320
criteria. They no longer proactively recommend these groups. That was something that happened
00:32:44.080
to me in, you know, 2015. I'm sorry, 20, gosh, 14. I had had a baby and all of a sudden these anti-vaccine
00:32:50.380
groups were being recommended to me, not because I searched for them, but because they were being
00:32:54.040
proactively pushed. And so you see the platforms begin to think about, hey, ethically,
00:32:58.060
what should we proactively amplify? Maybe this is not something we have to proactively amplify.
00:33:03.760
If people want to search and go find it, they can, but we're not going to proactively boost it.
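A minimal sketch of the "freedom of speech, not freedom of reach" distinction being described here (hypothetical flags and function names, not any platform's real code): flagged groups stay up and remain searchable, but are dropped from proactive recommendations and from the list of ad-targetable criteria.

```python
# Flagged groups are not removed; they are simply not amplified.
from dataclasses import dataclass

@dataclass
class Group:
    name: str
    policy_flag: str = "none"      # e.g. "none" or "health_misinfo"

def search(groups, query):
    """Search still returns flagged groups: nothing is taken down."""
    return [g.name for g in groups if query.lower() in g.name.lower()]

def recommend(groups):
    """Proactive recommendations skip flagged groups entirely."""
    return [g.name for g in groups if g.policy_flag == "none"]

def ad_targetable(groups):
    """Flagged groups are also removed from the targetable ad criteria."""
    return [g.name for g in groups if g.policy_flag == "none"]

groups = [Group("New Parents Support"),
          Group("Vaccine Injury Truth", policy_flag="health_misinfo")]

print(search(groups, "vaccine"))   # still findable if you go looking
print(recommend(groups))           # but not proactively pushed
print(ad_targetable(groups))       # and not sold as an ad audience
```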
00:33:08.080
So these are the sorts of frameworks and questions they've already been asking for
00:33:11.040
four or five years prior to when COVID happens. But one thing that they constantly emphasize
00:33:16.400
in conversations with, you know, researchers like me is, unfortunately, the institutional health
00:33:22.460
authorities produce really boring content. Nobody is picking it up and sharing it, right?
00:33:28.060
Nobody is like, hey, this is a really great, you know, PDF by the CDC. Let me go and boost it.
00:33:33.100
That's not happening. So what you see is physicians groups who know this, right? Like people who are
00:33:39.780
like normal people on the internet who are extremely online know that nobody is sharing that stuff. And
00:33:44.260
people who are engaging with patients all day long actually begin to also say, hey, I'm a doctor.
00:33:49.760
I have something of a following, not very big, but I understand what's happening.
00:33:53.300
How can I get my, you know, my experience as a frontline worker during COVID kind of out there
00:34:00.000
into the world so people understand what's happening? Or when the vaccine rolls out, how can I explain,
00:34:05.200
like how can I contextualize a side effect? So what you start to see is people who have never worked
00:34:11.860
to be influential on the internet, they unfortunately don't have very big followings, all of a sudden
00:34:16.460
realizing that the institutional communicators are not doing that great a job. The government is putting
00:34:22.320
out messages, but the government is distrusted by, you know, half the people at any given point in
00:34:26.320
time in the U.S. these days. So can they try to counterspeak? Can they try to put out content?
00:34:32.540
And they do when they are, but they're not getting any lift. They're not getting any amplification.
00:34:37.220
So this becomes a question of how should platforms uplevel good content and good speakers and
00:34:42.800
accurate information to the, you know, best possible extent that we understand what's accurate at a given
00:34:47.740
point in time. And it becomes very much a, you know, they make mistakes as they're doing it. You
00:34:52.680
do see policies come into play, like the decision to throttle the lab leak hypothesis, right? Which is
00:34:59.500
a weird one. It's kind of an outlier if you look at all the other policies that they pulled together,
00:35:03.800
because most of the others relate to some form of harm, right? A false cure can actually hurt you.
00:35:10.380
Misinformation about a vaccine that increases hesitancy can actually hurt you. But the lab leak,
00:35:15.320
like that one's sort of a, you know. No, this fell into the woke bucket of, you know,
00:35:21.260
this quasi-racist, right? Yes, that was how it was justified. But even so, it was one of these
00:35:26.920
things where, you know, it was a perplexing choice. And unfortunately, then it became a cudgel to
00:35:32.760
sort of undermine or to complain about every policy that they tried to implement, many of which did have,
00:35:39.180
as you know, like very real reasons for existing. Yeah, yeah. Is there something,
00:35:45.260
unique in your mind about vaccines here? Because I mean, I just noticed that there's, I mean,
00:35:51.100
maybe there are, you know, activist groups around other categories of medical error that I'm just,
00:35:57.520
or perceived medical error that I'm not aware of. But I don't see people get, I mean, this really does
00:36:05.060
have a cultic, you know, slash religious quality to it. This level of activism, advocacy, and
00:36:13.420
immunity to any kind of, you know, fact-based correction. And so, like, I mean, to take a,
00:36:21.680
like a personal example, like, you know, like, I think I've said this before on the podcast, like,
00:36:25.360
you know, you know, I tried to take statins at one point and got side effects and then discovered
00:36:29.460
that something like 5% of people just can't take statins because they get, you know, muscle aches and
00:36:34.060
they can, you know, get torn Achilles tendons. And, you know, it's just, statins are great for millions and
00:36:38.920
millions of people, but they really suck for, you know, about 5% of people who try them. And I'm in
00:36:43.680
that group. But so having had that experience, it would never have occurred to me to have become an
00:36:49.780
anti-statin activist, right? Or to find a group that could kind of ramify my personal concerns about
00:36:56.840
statins, you know, was I harmed? And, you know, was this experience, you know, even if I had torn an
00:37:02.580
Achilles tendon, which I happily didn't, it would never have occurred to me to organize my life around
00:37:07.960
the dangers of statins based on my personal, you know, statin injury. It occurs to me now I have
00:37:14.220
another example of this. I had, you know, some hearing loss that just came out of nowhere about
00:37:18.980
20 years ago. You know, there are all kinds of drugs that are ototoxic, right? I mean, perhaps I took
00:37:25.280
a course of antibiotics or, you know, some other drug that destroyed, you know, some of the hair cells in my
00:37:31.520
cochlea, right? I don't know. But just again, it would never have occurred to me to then make
00:37:38.340
kind of go on a quasi-spiritual quest to figure out the connection here. And yet vaccines, I mean,
00:37:44.980
I guess I'm in the process of starting to answer my own question here that I guess because they relate
00:37:50.280
to, you know, something we're putting into the bodies of healthy children. But even there, there are
00:37:56.080
all kinds of interventions and drugs that children get exposed to that I just don't think draw this kind
00:38:00.860
of energy. I mean, what, do you have thoughts about this? So in the, so it is, I think, you know,
00:38:06.180
kids is a huge piece of it, right? You know, everybody, particularly after you've had a baby,
00:38:10.120
you know, you have a, first of all, you're deluged with information about how you should take care of
00:38:16.420
the baby, how you should deliver the baby. You know, the, the anti-epidural crew is, is very much,
00:38:21.680
you want to talk about people who make it their life's work to scream at you on the internet,
00:38:25.600
talk to anybody who's been in a, you know, mom board and is debating whether to have medication
00:38:31.140
when they deliver or not. But the, the thing that's interesting is that the first, so anti-vaccine
00:38:39.860
narratives are very old. They go back to the 1800s, you know, the first, the advent of variolation for
00:38:45.860
smallpox, right, is, is seen as something akin to witchcraft, right? You're taking, because it comes,
00:38:51.980
you know, it comes from like, you're taking material from a cow, right? You know, and you're sort of
00:38:55.160
swabbing it and it seems kind of gross, right? Yeah. Um, so there's a lot of, a lot of things
00:38:59.640
that, that, that, that triggers that make people uncomfortable. There's a liberty component. Again,
00:39:04.240
the idea that you should be compelled to do something for the public good. Uh, this is
00:39:10.260
something that, at various moments in history, has been seen as, um, you know, something pro-social
00:39:15.180
that we do to help our neighbor versus something that the, you know, authoritarian tyrants demand of
00:39:19.500
us. There's some actually terrible stats coming out now about how Republicans feel about, um,
00:39:25.160
about school vaccines and just the precipitous decline in support among people who have the
00:39:30.860
strongest degree of support for Donald Trump. So very heavily correlated to that. And these sorts of
00:39:35.260
stats are just beginning to come out now as, uh, as, as it's become part of political identity.
00:39:40.140
But one of the things that happens, and I, I call it asymmetry of passion in the book, is that
00:39:44.960
when you have people who sincerely believe that vaccines cause autism and, you know, it is something
00:39:51.760
that parents are very afraid of. So a lot of the narratives around vaccines connect it to autism
00:39:57.780
or to SIDS, the other big one, sudden infant death syndrome. And so it creates a degree of
00:40:04.500
hesitancy because these are not small side effects. These are life altering, you know, potentially fatal
00:40:10.320
in the case of SIDS risks that, that, that the anti-vaccine propagandists are telling you, you are
00:40:15.980
taking when you don't need to, the argument is your baby is healthy. Why would you do this?
00:40:21.560
So the cost of, you know, the, the cost is what makes people very afraid, I think. And you have
00:40:29.220
most people who go ahead and, you know, go vaccinate, nothing happens. And they don't talk about the
00:40:34.020
many, many, many positive experiences or the- How fun it is not to get measles. Yeah.
00:40:38.200
Right. Exactly. And so what you hear instead is only the people who either do have a legitimate
00:40:44.840
documented vaccine reaction, and that is, you know, you have a higher chance of being struck
00:40:48.560
by lightning or things that they attribute to a vaccine themselves, all evidence to the contrary,
00:40:55.520
like autism and that, that narrative, even though it's been, you know, debunked over and over and over
00:41:02.100
again, people have to trust the debunkers, which means they have to trust the scientists or trust the
00:41:07.860
health authorities. And as distrust in government and health authorities has increased, you're going
00:41:13.500
to see, and we're already seeing a rise of distrust in childhood immunizations as well, that's not
00:41:20.180
rooted so much in the actual facts and evidence, but just in what is the information you're seeing
00:41:24.160
and who have you decided is, is a trustworthy source. Yeah. There's an asymmetry here between committing
00:41:30.880
some harm and, and the, that triggering, you know, loss aversion or harm aversion and balancing that
00:41:38.220
against this hypothetical risk that one is avoiding, but one will never really be confirmed to have,
00:41:44.200
have avoided it. Right. So the idea that you could do your child some lasting harm or even kill them
00:41:52.040
based on an effort to avoid something that is in fact abstract, you know, that's just the worst
00:41:58.560
possible emotional optics. All right. So we've kind of set the context for your own personal
00:42:03.900
adventure or misadventure here. What's the right place to enter here? I mean, actually one, so the
00:42:10.640
last time you were on the podcast, you were on with Barry Weiss and Michael Schellenberger. Barry has
00:42:18.340
kind of dropped out of the, you know, this conversation and this controversy. So I don't know that we need to
00:42:24.400
bring her into it, but Michael, I, in the, in the aftermath of that podcast, I just stumbled upon
00:42:31.620
an amazing clip of Shellenberger talking to Bret Weinstein about your appearance on my podcast and
00:42:39.480
your appearance on the Joe Rogan podcast and how nefarious all that, that appeared. It was very,
00:42:46.140
very strange that you had appeared on my podcast next to Shellenberger. And it was the, you know,
00:42:51.880
it was even more deeply strange that you had somehow gotten onto Joe Rogan's podcast. And all of this
00:42:57.280
was quite amusing from my point of view, because I know exactly how you got onto both of those
00:43:01.720
podcasts because we, you know, one of the podcasts was mine and I just happened to invite you and
00:43:05.500
Shellenberger because I wanted, you know, I wanted the two of you to, to have a conversation,
00:43:09.500
um, along with Barry. And I also happen to know that you, you got on Rogan's podcast because I
00:43:15.580
texted him and he said, he should invite you on the podcast. So, you know, I'm, I was the, uh,
00:43:21.380
the evil, um, nexus here, but they were spinning up a conspiracy about you that you are, you know,
00:43:27.580
kind of a deep state CIA plant that has, um, been weaponized to mislead the masses about many things,
00:43:35.940
but, you know, government censorship and free speech, you know, perhaps, you know, first among
00:43:40.420
them. And then you were, they, along with, um, a few other people, I guess Matt Taibbi is prominent
00:43:46.880
here in unveiling the treasure trove of, uh, the Twitter Files, uh, for the world to see,
00:43:54.640
they really, uh, went after you by name. And this has, you know, that, you know, you feel free to
00:44:00.760
pick up the story here, but this has really been extraordinarily disruptive to your life. I mean,
00:44:06.220
you and I haven't talked about this, but I mean, just, just viewing it from the outside,
00:44:09.980
it certainly seems to have been. So tell us what happened here.
00:44:14.020
Yeah. I, I think about it as, um, you know, it's, it's, it's opportunism and a fairly classic
00:44:19.440
smear campaign, right? Well, one thing that's interesting about my job is, uh, I've seen this
00:44:23.820
happen to so many people that when it happened to me, it was neither surprising nor novel.
00:44:28.620
It was more, and we can talk about this, the frustration of how to respond to it because I
00:44:32.340
had opinions and Stanford had opinions and they were not aligned. But now that you mentioned Stanford,
00:44:36.460
I mentioned it earlier, but perhaps you should just say what you were doing at
00:44:42.640
Yeah. So Stanford Internet Observatory, I joined in 2019. I was the first research director and
00:44:49.120
we were a center for the study of adversarial abuse online. And that took, you know, several,
00:44:55.980
several different types of research. There was trust and safety research, where the adversarial harms
00:45:00.700
we were looking at were things like spam and scams, pro-anorexia content, brigading, harassment, you know,
00:45:06.640
bad experiences that people have on the internet, sort of human nature side of, uh, there's a phrase
00:45:12.080
sometimes that trust and safety employees use, like the problem with social media is people. So
00:45:15.660
it looks at, you know, how online conflict happens and things like that. But another area of work that
00:45:20.040
we did looked at propaganda and influence and, uh, state actor disinformation campaigns and things
00:45:25.360
like this. So I did a lot of work on Russia and China and Iran, sometimes the U.S. Pentagon, right?
00:45:29.980
It was the adversary running, uh, running influence operations. And, uh, and I did a bunch of work
00:45:34.840
studying those dynamics, including, and this is, I think what put me on Taibbi's radar, uh, in the
00:45:39.620
context of Russia, right? And so, uh, I think the first time I came on your pod actually was before I
00:45:44.460
even started at Stanford. It was because I was one of the outside researchers who did the investigation
00:45:48.940
into the Russia dataset that the social media platforms turned over to the Senate Intelligence
00:45:53.820
Committee related to the Internet Research Agency's propaganda operations from 2015 to 2018. So that
00:45:59.940
included their election interference. And then also the GRU Russian military intelligence and the sort
00:46:04.700
of hack and leak operations that they put together. And the work I did there was very honestly mundane
00:46:10.640
candidly, right? It was, um, how do we describe the stuff in this dataset? How do we understand the
00:46:15.820
strategy and tactics of a, of a modern Russian propaganda campaign carried out on social media? I never said
00:46:21.440
it swung the election. I never said anything about collusion, none of that stuff. But one of the
00:46:25.800
things that happens is when people want to smear you, they, they find ways to tie you into existing
00:46:31.460
hated concepts or groups. And so all of a sudden they tried to turn me into this Russiagate hoaxer is
00:46:37.420
the phrase that they use, um, alleging that I said that this somehow swung the election, which I had never
00:46:43.580
said. Uh, but again, one of the things that you learn very quickly is that when the allegation is made,
00:46:48.380
the audience is not going to go and dig up everything I've said or done or written over the last eight
00:46:53.600
years. They're just going to take the word of their trusted influencer. So one of the things that SIO did
00:46:59.860
going back to SIO is that in 2020, we ran a very big project called the Election Integrity Partnership in
00:47:06.040
conjunction with the University of Washington, uh, this group called Graphika and the Digital Forensics Research
00:47:11.600
Lab at the Atlantic Council, who also periodically gets tagged as being one of these, you know,
00:47:15.500
imperialist Russia hoaxers. And so the work that we did in the 2020 election just sought to study
00:47:21.840
narratives about voting. So not Hunter Biden's laptop; that was completely out of scope for the
00:47:28.220
project. We were only looking at things related to allegations about procedures having to do with
00:47:33.940
voting. So for example, tweets that said vote on Wednesday, not on Tuesday. So procedural interference
00:47:38.740
things or narratives that attempted to delegitimize the election preemptively. And we laid out this scope
00:47:47.020
very clearly publicly. And we had a whole public website, a rapid response, you know, tweets, blog
00:47:53.200
posts, you name it. We worked in full public view. This was not funded by the government. And what we did
00:47:58.320
was we studied the 2020 election from August until November. And we just kind of documented and chronicled
00:48:05.440
the most wild viral narratives, these sort of election rumors and propaganda. At the time, we
00:48:10.740
used the word misinformation a lot too, you know, that were spinning up about dead people voting and
00:48:15.240
voting machine fraud. And, you know, of course, unfortunately, we started the project thinking it
00:48:19.500
was going to be a lot of state actors, right? Thinking it was going to be Russia and China and Iran. And
00:48:24.040
they were in there a bit, but the person trying to undermine the 2020 election was the president of the United
00:48:29.080
States. So this turns into, you know, we are, again, this is an academic enterprise. We have
00:48:36.420
about 120 students who are working on this project. And what they're doing is they're creating JIRA
00:48:40.880
tickets. So JIRA is just a kind of online, sorry, not online. JIRA is like a tech kind of customer
00:48:47.240
service ticket queue. If you've ever filed a bug report for a, you know, app, it's gone into a JIRA
00:48:52.780
somewhere. And people just kind of trace the tickets. They just follow it through. So we were using
00:48:57.300
that technology to trace election narratives [a rough sketch of this kind of ticket workflow appears after this answer]. And periodically, we would tag social media platforms,
00:49:04.540
meaning we would let them know, hey, you've got a viral thing here. It appears to violate your
00:49:08.520
policies. You know, you should have a look at it. Or we would periodically tag state and local
00:49:13.980
election officials. So for example, hey, this narrative about Sharpie markers in Arizona isn't
00:49:19.660
going anywhere. It's getting bigger. It's actually probably worth a response. We're not going to tell you
00:49:23.680
how to respond. But, you know, this is a thing that is worth paying attention to. And they in turn
00:49:28.440
could also reach out to us. And they had the ability to reach out to platforms directly. They
00:49:34.680
didn't need us to serve as some sort of switchboard conduit to, you know, get platforms to take down
00:49:41.040
tweets that offended election officials or the Department of Homeland Security or whomever. But that
00:49:46.120
was how it was reclassified by the Twitter Files boys and the right-wing media that discovered that
00:49:53.600
we had done this work two years after we did it and got mad about it then. And this was around the time
00:50:00.720
that the election fraud narratives and court cases and the Dominion lawsuits and all of the sort of
00:50:08.640
big lie, right, the effort to actually delegitimize the election had kind of fallen away. They'd lost all
00:50:14.360
their court cases. There was no evidence that anything had been stolen. And so they turned to
00:50:18.960
looking at us. And they reframed the project that we had done, which had kind of cataloged
00:50:26.440
and triaged and addressed election rumors and misinformation, into something that had been
00:50:31.760
a vast cabal-like effort to censor conservative speech. And what you start to see is this individual
00:50:39.680
named Mike Benz, who, you know, connects with Taibbi and Shellenberger ahead of their Twitter Files
00:50:45.040
testimony to Jim Jordan's committee. He begins to make these crazy allegations that we had somehow,
00:50:51.240
you know, used an "AI censorship death star." I'm not making that up. It's a verbatim claim: that we mass
00:50:58.620
censored, preemptively, entire classes and categories of narrative. And again, this is one of these things
00:51:04.820
where if you stop and think for 30 seconds, the man is alleging that we somehow managed to
00:51:10.260
censor the discussion about Sharpiegate and, you know, Sharpie markers in Arizona or the discussion
00:51:17.380
about Dominion voting machines. I mean, these are things that anybody who paid even marginal attention
00:51:23.540
to the narratives that went viral on social media about the 2020 election will remember. And that's
00:51:28.860
because they were not censored at all. They were instead actually wildly viral. And one of the things
00:51:34.580
that we did was after the election, we added up how many tweets had gone into these sort of top 10
00:51:40.800
most viral procedural election rumors we'd looked at, or these delegitimization
00:51:46.380
claims. And the number that we arrived at was 22 million. So again, after the election, we do this
00:51:51.820
analysis, we add up a bunch of numbers, and we say, okay, there are about 22 million tweets related to
00:51:56.260
these most viral election rumors. And then Matt Taibbi and Michael Shellenberger, under oath to Jim Jordan's
00:52:02.360
committee, say that we censored 22 million tweets. So that's where the actual number comes from, and again, this is all in our
00:52:09.880
report. Anybody can go and read it. It sat on the internet for two years prior to this all happening.
00:52:14.860
But what they do instead is they reframe this as some vast cabal that has censored 22 million tweets.
00:52:21.200
No evidence is ever offered of this, mind you. There is no Twitter files dump in which they produce the
00:52:27.620
list of the 22 million tweets. Elon Musk has never made the supposed 22 million tweets available to
00:52:33.240
the public, nor have Shellenberger or Taibbi produced any actual emails in which we are requesting that
00:52:38.640
this happen. But it doesn't actually matter at this point, right? Because they managed to sort of launder
00:52:43.940
this through the right-wing media apparatus, and then more importantly, through a series of congressional
00:52:49.000
hearings. And all of a sudden, the actual work that we did, just sort of studying these election
00:52:52.920
rumors, becomes, you know, oh, along with the Hunter Biden laptop, they censored all of these
00:52:58.600
things. So all of a sudden, we're also lumped into, you know, Hunter Biden land, which is just a
00:53:02.840
shibboleth, you know, and that just kind of conveys this idea that we had something to do with that
00:53:08.460
also, even though we never worked on it and never weighed in on it. So again, you know, smearing by
00:53:12.980
association is a very common tactic. It's just that when it's not only internet lunatics that
00:53:19.840
are doing this, but when it is sitting, gavel-holding members of the United States
00:53:25.360
Congress and attorneys general that are doing it, that's when you cross a line from, okay, this is
00:53:31.780
an online smear campaign, and that's, you know, kind of a pain in the ass, but, you know, cost of doing
00:53:35.540
business, to now the American government has been weaponized against its own citizens to go on these
00:53:40.540
crazy witch hunts based on nothing more than, you know, something some yokel put out in a tweet.
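[Editor's note: for readers unfamiliar with the ticket-queue workflow described above, here is a minimal sketch of what filing one narrative-tracking ticket through Jira's Python client might look like. The server URL, project key, labels, and field values are hypothetical illustrations, not the Election Integrity Partnership's actual configuration.]

```python
# Hypothetical sketch: one Jira ticket per election rumor, mirroring the
# bug-report-style workflow described above. Credentials, project key,
# and labels are made up for illustration.
from jira import JIRA  # third-party client: pip install jira

# Connect to a (hypothetical) Jira instance used as the tracking queue.
client = JIRA(
    server="https://example-tracker.atlassian.net",
    basic_auth=("analyst@example.org", "api-token"),
)

def open_narrative_ticket(summary: str, description: str, labels: list[str]):
    """File a ticket for one narrative so analysts can trace it like a bug report."""
    return client.create_issue(fields={
        "project": {"key": "EIP"},      # hypothetical project key
        "issuetype": {"name": "Task"},
        "summary": summary,
        "description": description,
        "labels": labels,
    })

# Example: the procedural-interference rumor mentioned in the conversation.
ticket = open_narrative_ticket(
    summary="Posts telling people to vote on Wednesday, not Tuesday",
    description="Procedural interference narrative; track spread over time.",
    labels=["procedural-interference", "2020-election"],
)
print(ticket.key)  # e.g. EIP-123
```

The design choice simply mirrors what the speaker describes: one ticket per rumor, so analysts can follow a narrative through its lifecycle the way engineers follow a bug report.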
00:53:45.900
Yes, I want to echo a big picture point that you make in your book somewhere, I think explicitly,
00:53:53.120
which is that what's happening here is that you have an academic institution and a group of people,
00:54:00.200
most of whom are students, I think most of whom are undergraduates even, doing a project to study
00:54:07.060
the spread of misinformation online entirely within the scope of their own free speech rights,
00:54:15.280
and the product of that effort is of much greater concern to the people we're talking about, you
00:54:21.640
know, Matt Taibbi, Michael Shellenberger, this person, Mike Benz, who seems like a proper lunatic,
00:54:28.660
and everyone else who's trailing in their wake, you know, in Trumpistan, this is of a much greater
00:54:35.180
concern to them than the actual attempt in plain view by the sitting president to steal an election,
00:54:42.200
right? I mean, like, that's how upside down this whole thing is.
00:54:45.620
Well, the people who, the congressmen with the gavels who subpoena us and demand our documents
00:54:52.100
are congressmen who voted not to certify the election. The attorneys general who, you know,
00:54:58.780
begin to name check our undergraduate students in depositions, right? You can imagine what the
00:55:04.440
internet does with that. These are attorneys general who joined amicus briefs to overturn the
00:55:10.620
Pennsylvania vote, right? To fight against the Pennsylvania vote alongside Ken Paxton in Texas,
00:55:15.480
right? So you have that, and then the people who subsequently sue us, by the way, because that's
00:55:20.120
the third part of this, you know, which I can't really talk about because I've been under pending
00:55:23.880
litigation now for a little over a year at this point. Stephen Miller and America First Legal sue us
00:55:28.980
based on this, you know, this allegation, this series of allegations, again, evidence-free,
00:55:32.740
baseless allegations. But again, the people suing us are people who also supported the big lie,
00:55:38.000
right? So there is something that these entities have in common. It's not accidental. This is
00:55:42.820
political retaliation. And I think that that is a piece that, you know, I found it a little bit
00:55:48.380
frustrating that we did not emphasize that in our communication about what was happening,
00:55:53.860
that that piece gets left out. That is, you know, the motivation.
00:55:57.000
If you'd like to continue listening to this conversation, you'll need to subscribe at
00:56:02.600
samharris.org. Once you do, you'll get access to all full-length episodes of the Making Sense
00:56:07.880
podcast. The podcast is available to everyone through our scholarship program. So if you can't
00:56:13.100
afford a subscription, please request a free account on the website. The Making Sense podcast
00:56:18.120
is ad-free and relies entirely on listener support. And you can subscribe now at samharris.org.