#310 — Social Media & Public Trust
Episode Stats
Length
1 hour and 8 minutes
Words per Minute
181
Summary
In this episode of the Making Sense podcast, Sam Harris speaks with Bari Weiss, Michael Shellenberger, and Renée DiResta about the loss of public trust in institutions, the way social media seems to have facilitated that loss, and the role of expertise in the modern world. The conversation is framed by the so-called "Twitter Files": Weiss and Shellenberger were among the journalists given access to Twitter's internal communications, and DiResta is the technical research manager of the Stanford Internet Observatory. Topics include misinformation, content moderation, the relationship between platforms and the federal government, the inherent subjectivity of moderation decisions, and whether it is possible to build a social network that is genuinely good for us. The podcast does not run ads and is made possible entirely through the support of subscribers; full episodes are available at samharris.org.
Transcript
00:00:00.000
welcome to the making sense podcast this is sam harris just a note to say that if you're hearing
00:00:12.500
this you are not currently on our subscriber feed and will only be hearing the first part
00:00:16.900
of this conversation in order to access full episodes of the making sense podcast you'll
00:00:21.800
need to subscribe at samharris.org there you'll find our private rss feed to add to your favorite
00:00:27.020
podcatcher along with other subscriber only content we don't run ads on the podcast and
00:00:32.500
therefore it's made possible entirely through the support of our subscribers so if you enjoy
00:00:36.540
what we're doing here please consider becoming one today i'm speaking with barry weiss michael
00:00:48.840
schellenberger and renee di resta barry is the founder and editor of the free press
00:00:54.480
and host of the podcast honestly from 2017 to 2020 she was an opinion writer and editor at the new
00:01:02.680
york times and before that she was an op-ed and book editor at the wall street journal and a senior
00:01:08.760
editor at tablet magazine and i highly recommend that you sign up for her newsletter and check out
00:01:13.820
what she's doing over at the free press and you can find that at the fp.com michael schellenberger
00:01:20.120
is the best-selling author of san fransicko why progressives ruin cities and also apocalypse never
00:01:27.360
why environmental alarmism hurts us all he's been called an environmental guru a climate guru
00:01:34.240
north america's leading public intellectual on clean energy and a high priest of the pro-human
00:01:40.380
environmental movement he is the founder and president of environmental progress an independent
00:01:45.680
non-profit research organization that incubates ideas leaders and movements and a co-founder of
00:01:51.580
the california peace coalition an alliance of parents of children killed by fentanyl as well as
00:01:56.720
parents of homeless addicts and recovering addicts and he also has a newsletter over on substack titled
00:02:03.160
public and finally renee di resta is the technical research manager of the stanford internet observatory
00:02:09.500
a cross-disciplinary program of research teaching and policy engagement for the study of abuse in
00:02:15.640
current information technologies renee led an investigation into the russian internet research
00:02:20.340
agency's multi-year effort to manipulate american society and she has studied influence operations and
00:02:27.520
computational propaganda in the context of pseudoscience conspiracies terrorist activity and state-sponsored
00:02:34.480
information warfare she's advised congress the state department and other academic civil society and
00:02:41.620
business organizations on these topics she also regularly writes and speaks about these issues and is an ideas
00:02:48.400
contributor at wired and the atlantic and she appeared in the netflix documentary you might have seen
00:02:54.240
the social dilemma so this is a conversation about what i consider to be a very important issue
00:03:02.140
we focus through the lens of the so-called twitter files but it really is a conversation about
00:03:09.320
the loss of public trust in institutions and the way social media seems to have facilitated that
00:03:16.620
and one thing you might hear in this conversation at various points is a tension between what is often
00:03:24.200
thought of as elitism and populism and i should say up front in that particular contest
00:03:31.960
i am an unabashed elitist but that doesn't mean what most people think it means for me it has
00:03:39.220
nothing to do with class or even formal education it has to do with an honest appreciation for
00:03:46.540
differences in competence wherever those differences matter now when i call a plumber i have called him
00:03:53.920
for a reason the reason is i have a problem i can't solve right i don't know a damn thing about
00:04:00.620
plumbing so when my house is flooding with sewage backing up from the street and the plumber arrives
00:04:08.600
that man is my god jesus never received looks of greater admiration than i have cast upon my plumber
00:04:16.960
in a time of need and so it is with a surgeon or a lawyer or an airline pilot whenever there is an
00:04:25.780
emergency there is such a thing as expertise and we are right to want it because the alternative
00:04:34.100
is ignorance and incompetence and needless suffering and untimely death without plumbers
00:04:42.460
we live in our own filth and we've been doing that online for many years now and it's getting
00:04:50.460
disgusting of course i'm not saying that we should blindly trust experts and i'm not saying experts
00:04:58.140
haven't failed us in shocking ways but we are shocked against a background assumption that expertise
00:05:05.500
is a real thing and that our institutions have a responsibility to provide it if you didn't think
00:05:11.400
expertise was real then there would be no basis to be disappointed when our most esteemed institutions
00:05:17.360
fail to function and of course there are whole fields of pseudo expertise i've often criticized
00:05:24.440
organized religion for this ask a catholic bishop about the holy spirit or about the problem of human
00:05:30.700
evil and you will get a lot of fake knowledge aimed at your brain we could add a few branches of the
00:05:37.440
humanities here along with scientific results that don't replicate but are still widely believed
00:05:43.140
and there are surely situations where real experts thinking about real problems can have their thinking
00:05:49.520
distorted by bad incentives it is absolutely true that institutions become corrupt all of this is
00:05:57.900
true and yet none of this suggests that the solution to our problems is for everyone to just quote
00:06:04.480
do their own research of course now we have chat gpt which would seem to make doing your own research
00:06:11.320
more promising eventually i think it will i think that many things will become like chess
00:06:17.020
what's the best next move in a game of chess should you ask the best human player on earth or should
00:06:23.640
you ask a computer the answer now is you should ask a computer and that will be the answer for as long as
00:06:30.020
computers are computers and we remain apes i think many other areas of human intelligence will be toppled
00:06:37.980
in just this way but not yet someone on my team asked chat gpt why i left twitter and it produced the
00:06:46.860
following paragraph sam harris quit twitter in december 2018 i wish i had quit in december 2018 it took me
00:06:56.100
four years longer than that he cited the platform's inability to handle discussions about controversial
00:07:02.180
topics without devolving into toxicity and harassment as the main reason for his decision
00:07:07.660
i agree with that but that's not really what i said at least i'm not aware of saying that anywhere
00:07:14.720
in a blog post harris wrote quote twitter is now one of the worst possible forums for the exchange of
00:07:21.100
ideas and that he had grown quote increasingly disturbed by the way the platform quote amplifies
00:07:27.780
the very worst aspects of our culture i'm actually not sure if those are accurate quotes from somewhere
00:07:32.740
or not but i definitely didn't write anything in a blog post he also expressed his dissatisfaction
00:07:38.820
with the way twitter is used to silence dissenting voices and spread misinformation after quitting
00:07:44.420
twitter harris continued to communicate with his audience through other means including his website
00:07:49.000
and podcast okay well it appears to know that i have a podcast but it seems that our robot overlords
00:07:56.340
haven't quite arrived yet and so we still need to rely on human experts for a while this point
00:08:02.740
between elitism and populism comes down to the following claim not everyone's opinion is valuable
00:08:12.420
on many many topics my opinion isn't valuable i shouldn't even have an opinion having a strong
00:08:21.940
opinion when you know nothing about a topic it's your political right sure but it's also a symptom of a
00:08:29.580
psychological problem and having a society filled with such people becomes a social problem and social
00:08:37.320
media has been a vector of strong divisive unfounded opinions and lies for over a decade i mean really
00:08:47.040
you just have to react to that thing that aoc said about that thing that tucker carlson said about that
00:08:53.740
thing the cops may or may not have done in a city you've never been to and will never go to even if you
00:08:58.380
live a thousand years and then you need to respond to all the people who didn't understand what you meant
00:09:03.200
or who were just pretending not to understand what you meant and you're going to do this a dozen times
00:09:07.880
a day for what the rest of your life oh you're not going to do that you're just going to watch other
00:09:12.840
people do it every day and then what you're going to find your real life in between all of that
00:09:18.560
scrolling what an astounding waste of time that was but the social consequences of our spending time
00:09:27.020
and attention this way are well worth talking about and the question of whether it's possible to build a
00:09:32.700
social network that is genuinely good for us is a very important one and those are among the topics
00:09:38.500
of today's podcast but i want you to keep a few distinctions in mind because there's been an
00:09:44.820
extraordinary amount of misinformation spread about what i think about free speech and content moderation
00:09:50.940
and censorship online so i just want to put a few clear landmarks in view the first is that i
00:09:58.380
absolutely support the right of anyone anywhere to say almost anything i don't think people should
00:10:05.360
be jailed for bad opinions so for instance i don't think the laws against holocaust denial
00:10:10.660
that exist in certain european countries are good as much as i agree that it's insane and odious
00:10:17.140
to deny the holocaust people should be free to do it now the question of whether they should be free
00:10:23.440
to do it on a social media platform must be decided by the people who own and run the platform
00:10:29.580
and here i think people should be generally free to create whatever platforms they want
00:10:33.840
so elon now owns twitter i think he should be free to kick the nazis off the platform
00:10:40.280
if that's what he wants to do i might not agree with his specific choices
00:10:44.380
he kicked kanye west off the platform for tweeting a swastika inside a jewish star
00:10:49.360
i honestly doubt i would have done that i mean can you really have a terms of service
00:10:54.180
that doesn't allow for weird swastikas that seems impossible to enforce coherently but the point is
00:11:01.140
i think elon and twitter should be free to moderate their platform however they want conversely i think
00:11:07.740
a nazi should have been free to buy twitter and kick all the non-nazis off the platform twitter is a
00:11:15.560
company it should be free to destroy itself and to inspire competitors and many people think it's in
00:11:22.960
the process of doing just that and it remains an open and interesting question what to do when the
00:11:29.520
nazis or the semi-nazis start using your social media platform and similar questions arise about people
00:11:35.980
who spread misinformation or what seems to be misinformation where is the line between necessary
00:11:43.080
debate which i agree we should have about things like how to run an election or vaccine safety
00:11:48.940
but where's the line between debating these things and simply making it impossible for people to
00:11:54.660
cooperate when they really must cooperate for instance after an election when you have a sitting
00:12:00.520
president lying about the results being totally fraudulent or during a global pandemic when the
00:12:06.780
healthcare systems in several countries seem on the verge of collapse there is a line here and it might
00:12:12.660
always be impossible to know if we're on the right side of that line it's simply not enough to say
00:12:18.800
that sunlight is the best disinfectant because we have built tools that give an asymmetric advantage
00:12:25.280
to liars and lunatics we really have done that social media is not a level playing field
00:12:33.260
and the idea that we are powerless to correct this problem because any efforts we make amount to quote
00:12:40.420
censorship is insane it's childish it's masochistic and it is demonstrably harming society
00:12:49.620
but this is a hard problem to solve as we're about to hear as i said we take the twitter files release
00:12:57.140
as our focus because both barry and michael were involved in that release but the four of us speak
00:13:04.960
generally about the loss of trust in institutions of media and the government we discussed barry and
00:13:11.160
michael's experience of participating in the twitter files release the problem of misinformation the
00:13:17.380
relationship between twitter and the federal government russian influence operations the
00:13:22.440
challenges of content moderation hunter biden's infamous laptop the need for transparency
00:13:28.640
platforms versus publishers twitter's resistance to the fbi political bias jk rowling the inherent
00:13:38.560
subjectivity of moderation decisions the rise of competitive platforms rumors versus misinformation
00:13:45.840
how twitter attempted to control the spread of covid misinformation the throttling of dr jay
00:13:51.600
bhattacharya the failure of institutions to communicate covid information well the risk of paternalism abuses of power
00:14:00.800
and other topics and now i bring you barry weiss michael schellenberger and renee di resta
00:14:13.520
i am here with barry weiss michael schellenberger and renee di resta thanks for joining me thanks for having us
00:14:19.200
uh as i said i will have introduced you all properly in the beginning but i was hoping we could have a
00:14:25.360
discussion about the the twitter files and social media generally and the failures of of the mainstream
00:14:33.520
media and the government and other institutions to maintain public trust uh and and perhaps the failure
00:14:40.320
of them to be worthy of public trust but i i think the the twitter files is the right starting point here
00:14:47.120
uh as uh as luck would have it we have barry and michael both of whom were um part of the uh the
00:14:53.520
journalistic effort to reveal these files barry let's start with you perhaps you can
00:15:00.320
really take it from the top and give us the the high level description of what the twitter files
00:15:04.880
are and uh how you came to be um part of the the release it's funny because this is one of those
00:15:12.000
stories where i feel like for half of the country it was the biggest thing that has happened in the
00:15:18.320
past decade and the other half of the country had no idea it even existed um and it was interesting to
00:15:24.480
kind of test my family in pittsburgh to find out which news sources they were reading and could tell
00:15:28.640
you everything about the way they viewed the story so basically what it is depending on how you look at
00:15:33.600
it is elon musk the new owner of twitter trying to in his words have a kind of informal truth and
00:15:42.240
reconciliation commission he understands that the platform that he just bought has lost a tremendous
00:15:47.520
amount of trust with the public was claiming to be one thing and actually in secret was something
00:15:53.200
quite different was also probably as he would frame it cooperating with the government in ways that
00:15:59.040
would make americans if they knew about it extremely uncomfortable was blacklisting people without
00:16:03.760
their knowledge and all kinds of other details along those lines and so another group of people would
00:16:08.880
say this is all about elon musk buying twitter and trying to shame the previous owners of twitter and
00:16:15.600
the previous top brass at twitter and really what this is all about is embarrassment and vengeance and
00:16:21.360
where you fall on the answer to that question tells you a lot about where you stand politically i would
00:16:25.520
say in general so basically what the mechanics of it were that elon musk decided to make available
00:16:34.640
the inner workings of the company to a number of independent journalists the first one that he reached
00:16:40.560
out to was the journalist matt taibbi who has a very popular newsletter then he texted me and reached
00:16:48.000
out to me then i reached out to michael schellenberger and then the group kind of grew from there and
00:16:53.920
came to include journalists like abigail schreier lee fang layton woodhouse and a number of other
00:16:59.920
people associated with my company the free press what was said on twitter publicly by elon musk is that
00:17:06.240
we had unfettered access to all of the inner workings of twitter everything from emails private slack
00:17:14.640
messages group slack messages and on and on and on and that was sort of the headline that was trumpeted
00:17:21.040
all over twitter and all over the press in fact what we had and michael can explain this probably
00:17:26.320
in better detail than i can because he's has a meticulous memory we basically were able to do
00:17:32.240
highly directed searches on at most two laptops between at times up to eight journalists in a room
00:17:41.520
so what we had the ability to do was to say to a lawyer working through a laborious e-discovery
00:17:47.440
tool and it came to include two different tools tell me everything that happened between the following
00:17:53.760
six c-suite level employees of twitter on the dates between january 6th and january 10th basically the
00:18:02.240
dates that trump got kicked off of twitter and basically over the course of a few days it would
00:18:06.560
spit back to us information and what came out of that was a number of stories that depending again on how
00:18:14.240
you look at it were either enormously important bombshell confirmation of what a number of or of
00:18:20.480
what a lot of people in the country had thought was actually going on on twitter what they denied
00:18:24.560
and if you're on the other half of the country and again i'm being crude here it was you know nut
00:18:29.600
picking uh it was cherry picking it was finding it was going sort of searching for anecdotal stories
00:18:37.760
that would confirm the political biases of the independent journalists involved in the project i
00:18:43.600
think the really really important thing for people to understand and i think that this wasn't explained
00:18:48.080
well enough by any of those of us who were involved is how unbelievably laborious these searches were
00:18:54.400
and how if we had the choice like it's not as if we walked into a room with organized files according to
00:19:01.760
covid masking myocarditis the election in brazil modi israel palestine like then we could have really
00:19:09.760
told you the comprehensive story instead we had to make some very difficult choices based on the kind
00:19:17.760
of tools we were using to go looking for certain stories where we knew the public story that had been
00:19:23.280
told and we wanted to see what had actually gone on behind the scenes and again in my view you know the
00:19:28.960
story of the decision to kick off trump very important story is it the number one story that was interesting
00:19:34.400
to me not at all covid was far more interesting to me but i knew that if we looked at those set of dates
00:19:40.880
that we could come out with that we could come out with some information that would be worthy of the public
00:19:46.720
interest um and we also knew that we're dealing with someone who is in many ways a mercurial person you know
00:19:54.560
any source that gives you information has motivation you have no idea when their motivation
00:20:00.320
or incentives might change and so we wanted to harvest as much information as we possibly could
00:20:05.200
in the days that we were there yeah i hadn't thought to talk about these details but now i'm interested so
00:20:11.040
i just a couple of follow-up questions so when you when you would perform a search or have someone perform
00:20:16.560
a search for you there wasn't some layer of curation from lawyers or anyone else who was telling you
00:20:23.280
who are deciding what could be released and what couldn't be released if you said give me every
00:20:28.000
email that contains the word trump between certain dates they just spit out those emails
00:20:33.280
no one of the ways that i knew that sam i just don't know how much detail you want me to get into
00:20:37.920
here but what one of the in the first few days that i was there with my wife nelly who also works with
00:20:45.760
me and building the company matt wasn't there was just the two of us elon musk and lawyers that we were
00:20:52.560
communicating with over the phone and i would ask them to do a search let's say for covid or cloth
00:20:59.760
masks or fauci or whatever and what i was getting back was garbage information it was such garbage
00:21:06.400
information that it and i'm not a paranoid person i would say michael schellenberger is way more suspicious
00:21:12.240
than i am in general i'm pretty naive but it was so bad that nelly was saying this cannot be right
00:21:18.000
this cannot be right and that's when i came to discover that the lawyer who was actually doing
00:21:23.840
the searches worked for twitter and was one of the people that we were hoping to do searches on
00:21:30.160
which is this guy jim baker who became a sort of flashpoint in the story later on and maybe michael
00:21:35.200
i can hand it over to you if you want to explain sort of the mechanics of how this worked the reason
00:21:40.400
it can be maybe a little boring the reason i think it's significant is because i think it will help
00:21:44.960
people understand why we did the stories we did right right yeah michael jump in here what was
00:21:50.560
your experience extracting information from twitter yeah i mean i think it's really fun conversation i
00:21:56.000
love i love talking about it and i was a little annoyed after that that just a lot of people wrote
00:22:01.680
stories about how they thought the process worked without just asking us because we would have said so
00:22:05.440
and i've always have taken all the time to explain it but you know as barry mentioned um barry brought me
00:22:10.720
in i do not have a relationship with elon musk i've only criticized elon musk in the past um i
00:22:17.920
criticized him in mother jones i wrote about him in apocalypse never and just obviously when barry was
00:22:23.600
like we can get access to the twitter files i was like hell yeah i mean there's no for me there was no
00:22:28.720
just it's a chance to go and get this incredible information i met when i met elon he said he did not
00:22:35.760
know who i was i you know and and basically it's just like what barry said if there was any filtering
00:22:42.720
or curation or removing of any emails we saw no signs of it and i would be shocked because the size
00:22:49.760
of the searches we were getting i can just tell you some of them we would be like you know a lot of like
00:22:56.320
all the emails for this person over this period of date and we would get you know email boxes of
00:23:01.920
2000 emails 890 emails 2000 emails 1800 emails 1800 emails 2300 emails so it's just for somebody
00:23:10.720
like we i consider myself an extremely fast reader and i'm able to process a lot of information very
00:23:15.280
quickly it took me a very long time to go through these emails i couldn't see anybody being able to
00:23:19.920
have done that and then when the emails populated in our inboxes there was no that we never saw any
00:23:25.680
evidence anything had been removed i don't think anybody i mean i'm not saying i can't prove that nothing
00:23:31.120
was but i just saw no evidence for it and i didn't see anything in elon that suggested that he cared
00:23:35.920
about that although michael it sounds like that cuts against what i was understanding barry to be
00:23:41.280
saying which was initially the search results were so crappy that you thought somebody this nefarious
00:23:47.680
lawyer was throttling the results that was before i i should clarify that was before michael had gotten
00:23:54.320
there and as soon as elon found out that that person was involved okay so the process changed
00:24:00.640
he's operating at the highest level of the company he had until i told him hey do you know that jim
00:24:05.920
baker is the one doing the search he had no idea that jim baker was the one doing the search right
00:24:10.240
then the the the people involved change he was fired and like michael said the files we got subsequent
00:24:17.600
to that there was no evidence at all that they were tampered with the thing i should add is
00:24:21.600
one of the one of the criticisms of the story of the twitter files is that we focused an inordinate
00:24:27.840
amount on a person who had been at one time the head of twitter's trust and safety this guy yoel
00:24:33.840
roth and the reason for that is that yoel roth was a very loquacious person he talked a lot on slack
00:24:40.240
and on email and in other places so it's not as if we weren't interested in other people it's just that
00:24:45.520
you know like any story you're looking for the person who's going to share the most information and he
00:24:49.760
spoke openly and a lot on platforms like slack to his colleagues it's not like we were actively
00:24:55.840
going out to interested in yoel roth i barely knew who he was before i walked into twitter right now
00:25:01.280
were either of you concerned about the optics of this process that you would appear to be at least in
00:25:08.800
part doing pr for elon uh rather than actually engaging in in journalism that was you know a more
00:25:17.120
normal sort i mean there were other constraints releases had to be done on twitter itself uh which
00:25:22.800
i think it could be argued was not the best platform for actually discussing this and really
00:25:27.920
anything at length i mean what were your concerns going into this and and is there any residue of
00:25:33.920
those concerns not for me really i mean for me it was just like we get the access to the date and
00:25:39.680
i just am not i mean people say things but i'm just not i'm not that concerned about but for
00:25:45.200
instance what i noticed before i left twitter i mean i have to admit now that i'm no longer on
00:25:49.760
twitter i don't consider myself even minimally informed about what it's like there now but when
00:25:56.160
i was there and the first drops were happening more or less everything elon said about what was coming
00:26:04.000
and what had dropped was wrong right and he was lying or just delusional about what he was releasing
00:26:10.400
you know you know the level of government but he wasn't releasing it so isn't that kind of the
00:26:14.880
point it was the frame around it i mean he was saying here it is and and and his summary of what
00:26:19.280
taibbi was saying was was just not in fact accurate in fact it was the in the case of one of taibbi's
00:26:24.960
drops it was the opposite of what taibbi said yeah i mean so i didn't bother you at all i mean it
00:26:30.720
bothered me when he tweeted my pronouns are prosecute fauci it bothered me when he said that thing about
00:26:37.120
yoel roth i've i told him that you know we we've i think barry criticized elon when he de-platformed
00:26:44.880
those journalists i retweeted it we don't control elon musk i mean we were invited into a smash and
00:26:51.680
grab situation to be able to look at as many emails as we could and we're thrilled at it and
00:26:57.440
it's super important what came out of it but no i mean i just kind of i i'm a big gen xer i'm a
00:27:03.840
breakfast club type i go on tucker carlson i talk to tucker carlson i talk to people that
00:27:10.400
my family thinks it's terrible i talk to them and i don't believe i'm not i don't have like a
00:27:15.360
view that if i talk to somebody that somehow i'm legitimizing all of their views or that if i go take
00:27:22.480
these if i go and like look for these emails that somehow i'm agreeing with elon musk i've criticized
00:27:26.560
elon musk about his policies around solar in china i'm not going to stop doing that i told him exactly
00:27:32.960
what i thought and have told him exactly what i thought and i'm just with elon the way that i am
00:27:37.760
with everybody and so no i mean and people talk shit but it's like i don't like people say things
00:27:43.920
but they're not true so i can't i'm you gotta have a stoic attitude about it which is like i'm
00:27:48.160
responsible for the things that i do i'm not responsible for what other people do i just i just
00:27:53.440
think that this is as old school as it gets like a source had documents that he wanted to leak to the
00:27:59.360
public and journalists who felt those documents were in the public interest jumped to go look at
00:28:06.000
them and any source who leaks documents or leaks a story to the new york times or the washington post
00:28:12.480
always has an agenda like that goes without saying i think the unusual thing in this case
00:28:17.280
is that the source was public about it and he made his agenda entirely transparent the entire time
00:28:24.320
so you know as michael just mentioned i think i well proved that i was not in the tank for
00:28:32.480
anyone on this matter i'm just on the side of the public knowing more information and people can decide
00:28:38.960
for themselves whether or not that information was in the public interest i certainly think that it was
00:28:43.920
and i frankly think a lot of people are resorting to sort of criticizing journalistic practice or you know
00:28:50.000
standing in or other sort of like technicalities of that sort because they don't want to confront
00:28:56.880
what the actual story is right well i definitely want to get to the story but renee i want to bring
00:29:01.600
you in here do you have anything to say at this point about just the process and the optics yeah it's
00:29:07.760
very interesting so for your audience members who probably don't know i started off talking
00:29:12.960
about social media kind of as an activist on the outside in 2015 moved into academia in 2019 and in
00:29:20.640
the intervening time the relationship between platforms and government and researchers changed
00:29:26.720
very significantly over those four years we can talk about why and how perhaps i'm at the
00:29:32.160
stanford internet observatory and we were part of something that was called
00:29:37.040
the twitter moderation research consortium that i think no longer exists because everybody got laid off
00:29:42.080
but it was a process by which twitter could actually share data sets with researchers and
00:29:48.160
this is relevant because all of our research was done independently we would receive data sets
00:29:54.080
from twitter we would do research independently and sometimes we would actually go back to them and
00:29:57.840
we would ask why is this account included this doesn't feel like it fits if we're going
00:30:02.960
to tell a story to the public about this chinese network this iranian network this saudi network this
00:30:08.000
russian network we want to make sure that we're doing an independent analysis and we are only going
00:30:13.040
to say what we think we can support as researchers and what we would try to do was
00:30:18.480
look at and enrich the story with as much of a full picture as possible so the twitter data set
00:30:23.360
was almost a jumping off point to a significant process that would involve also looking for these
00:30:28.160
accounts on facebook tiktok youtube you name it and what we would
00:30:34.000
try to do was not tell an anecdotal story but we would always include both the qualitative you know
00:30:40.160
here's what these accounts are saying here's what we think they're doing but we would try to include
00:30:44.000
something in the way of summary statistics here's how many of them there are here's the engagements
00:30:48.640
they're getting here's where they are situated in the public conversation relative to other accounts
00:30:54.400
that talk about these things and the reason for that is that a big problem with the public
00:30:59.520
conversation around content
00:31:04.000
moderation whether that's related to the kind of foreign influence campaigns or domestic activism
00:31:09.360
or anything else is that it is so anecdotal and so when the twitter files began as somebody who has
00:31:15.920
worked with platform data and also you know testified in front of congress critiquing platforms
00:31:20.560
and their lack of transparency and who has written about that for the better part of seven years now
00:31:25.360
what has been interesting to me in the files i think they're very interesting just to
00:31:29.680
start with that i'm not a person who says oh this is all a nothing burger this is not interesting
00:31:34.320
but i had kind of three issues with the process and the first was that i think a lack of familiarity with
00:31:40.640
that multi-year evolution of content moderation policy meant that for me as an observer there were
00:31:46.720
some of these like wet streets cause rain moments you know the gell-mann amnesia phenomenon where
00:31:51.120
the person doesn't fully understand what is happening in context a specific example that i
00:31:57.520
that i said on twitter was uh one comment in which you see the senate intelligence committee engaging with
00:32:02.720
twitter asking it if it responded to some sort of tip from the fbi and that was very interesting to
00:32:07.760
me because i had done a bunch of work on twitter data for the senate intelligence committee in 2018
00:32:13.520
and as a researcher running that process in 2018 with no engagement with twitter whatsoever
00:32:19.520
what i knew was that the senate intelligence committee did not have very high confidence
00:32:24.240
in twitter's ability to find anything so reading that interaction was fascinating to me because they
00:32:29.520
were in my opinion essentially saying did you find this yourself or did somebody have to hand it to
00:32:34.400
you again but what the reporter who wrote that thread construed it as was are you taking direction
00:32:39.600
from the fbi so this was the kind of wet streets cause rain experience that i had in
00:32:44.480
a number of these threads where i thought gosh i wish that somebody who had either been there you
00:32:49.280
know in a in an abstract sense not in the company but who understood the evolution of that stuff
00:32:53.520
had perhaps weighed in or been consulted and then i think the second critique was how
00:32:58.800
anecdotal it was and that made it feel a little bit cherry-picked and this kind of ties into maybe
00:33:04.240
point three which is that the trust and the public confidence in whether or not you believe in a
00:33:09.760
framing around an anecdote is entirely dependent on whether you trust the reporter or the outlet at this
00:33:14.720
point and that's a function of polarization in american society it is not a critique of barry or
00:33:20.080
michael or anybody else's thread it is i think the reality and so with some of the in my opinion
00:33:26.800
overemphasis on anecdote and i recognize you know this is the process this is what you had
00:33:30.560
available to you what made that troubling to me is that it did feel like there were opportunities for
00:33:37.360
score settling or searching for things that a particular reporter
00:33:42.320
found problematic or wanted to dig more into but that didn't necessarily get at the scope and scale
00:33:47.680
of the phenomenon overall and i'll point specifically to something like shadow banning right fascinating
00:33:53.600
topic lots of you know many of us have looked at it over the years and made arguments that i don't
00:33:58.800
think it's something that the platforms shouldn't be able to do and we can talk about why but i do think
00:34:03.200
it should be transparent so that's that's sort of where i sit on the shadow banning question
00:34:06.800
but what we didn't get was how many users were receiving these labels in what country during what
00:34:12.960
time period how many of those who received a label were mentioned in a government request that's
00:34:18.320
absolutely kind of crucial this question of what to what extent does the government actually exert
00:34:23.040
influence over the platform it's not simply filing a report it's did the report lead to an action
00:34:29.040
and this is the sort of thing again maybe this is my bias as you know as somebody in academia
00:34:33.120
where i say like god i'd really love to get my hands on the summary stats you know can you request
00:34:36.880
those can you say like in this moderation tool you know can we connect the dots here between here's the
00:34:42.800
fbi over-submitting in my opinion litanies of accounts you know really just sort of a stupid process
00:34:48.720
but then what happened next and that was like the kind of connecting the dots there
00:34:54.640
was in my opinion kind of underdone candidly and it led to an opportunity for innuendo to drive
00:35:00.880
the story and whether or not you believe the innuendo is entirely dependent on whether you
00:35:05.280
believe or trust the outlet and the person covering the story so in the interest of
00:35:10.400
informing the public writ large that's where i felt like you know and as barry notes i think
00:35:16.240
we're saying the same thing depending on which side of the political spectrum you sit on you
00:35:20.880
either trust or do not trust at this point i don't know that it's political spectrum so much as like
00:35:24.800
you know institutionalist versus populist maybe right um but i think that tension for me
00:35:31.760
was where i felt and i wrote this in the atlantic that there was a little
00:35:36.320
bit of a missed opportunity there how could we perhaps get at more of those like holistic or
00:35:40.800
systemic views informing an opinion on platform moderation that are less anecdotal and less
00:35:45.520
dependent on trust in a reporter's narrative right yeah i mean just to echo part of it it's hard to
00:35:50.880
capture kind of like how chaotic the situation was i mean it was like getting searches back at
00:35:56.400
midnight working till three in the morning the owner of twitter coming in at 12 30 wanting to schmooze
00:36:02.320
like you know i second everything renee's saying meaning on the question of should these
00:36:08.720
platforms not just twitter be more transparent do we have a problem with private companies that have
00:36:15.440
sort of unaccountable power over the public conversation and to what extent are they
00:36:22.240
you know doing the bidding of organizations like the fbi like that's something really important that
00:36:28.480
every citizen has a right to know not just me and michael schellenberger and matt taibbi but i just
00:36:33.440
can't emphasize enough like that the idea of going in and saying give me a report or a summary on xyz
00:36:40.400
that just wasn't something that was possible while we were there okay well let's get to what was
00:36:45.840
found out or what has been found out so far i guess as preamble i just want to say i think the
00:36:52.160
big story here which is certainly beyond the case of twitter is um our ongoing struggles to deal with
00:37:00.480
misinformation and this is something that renee obviously knows a lot about but um it seems to me
00:37:05.760
that this is the kind of thing that may never be perfectly solved in the absence of just perfect
00:37:11.280
ai and when you look at what imperfect solutions will look like they will always throw up both type
00:37:19.440
one and type two errors so you know any attempt to suppress misinformation is going to suppress real
00:37:25.920
information and that'll be embarrassing and cause some people to be irate and to allege various
00:37:32.640
conspiracies and it also you know it will fail in the other way and lots of misinformation will get
00:37:38.720
through and fail to be suppressed and this isn't merely an engineering question it's
00:37:45.040
an ethical question it's a political question and even in the simplest case where we know what is true
00:37:51.680
and what matters and what we should do on the basis of these facts and i would say we're
00:37:58.640
very rarely in that situation at this point but even in the best case where we know what's true
00:38:04.000
it can be very difficult to know what to do in a political environment where great masses of people
00:38:11.680
believe crazy things it's a question of how to message the truth in hand to great numbers of people who
00:38:19.200
as we've already said no longer trust certain institutions or certain people and
00:38:27.280
will reach for the most sinister possible interpretation of events and anchor
00:38:34.400
there and that seems to be the state of discourse we have on more or less everything
00:38:38.640
from public health to the very topic we're talking about so uh with that is just kind of the frame
00:38:44.480
around this perhaps barry michael you know either of you can start i'd love to know what you
00:38:50.880
think we have learned so far and what has been the most interesting slash concerning facts i'll say maybe
00:38:57.680
one thing and then kick it to michael i think that there are two main stories here story number one
00:39:03.840
is about the way that an extremely powerful tool that has influenced elections that has led to
00:39:11.040
revolutions claimed to have a particular mission and gaslit the public as it secretly abandoned that
00:39:19.120
mission in critical ways and it shouldn't matter what your politics are like that to me is a really
00:39:26.640
important story if you believe as i do you don't need to go all the way and believe that twitter's a
00:39:31.200
public square to believe that it has an enormous influence on the public conversation on deciding who is
00:39:36.240
a hero and who is a villain on on any number of things the second thing that i think is the headline
00:39:42.800
story is this sort of very close relationship between the federal government and
00:39:50.720
one of the most powerful tools of communication in the world i think those are like
00:39:55.520
the two core stories that that came out of all of the reporting i don't know if we're on twitter files
00:40:01.280
number 121 or whatever but those to me are the two biggest headlines and around which
00:40:08.960
topics yeah which topics do you think i mean you're most concerned about the messaging around
00:40:15.040
covid or hunter's laptop what do you consider to be the center of gravity here on the cultural i'll
00:40:22.480
leave the government conversation to michael because he did much more on that to me it's the way that
00:40:28.320
twitter actively narrowed the conversation i don't know if we want to get into hunter yet but yeah i mean i
00:40:34.560
certainly think that when a private company decides to lock out a newspaper doing truthful reporting
00:40:43.440
weeks before an election on the spurious grounds that it was based on hacked material as
00:40:49.520
if that isn't what is printed in places like the new york times and the washington post every day yeah
00:40:53.280
i have a huge problem with that but i think one of the core things that came out of
00:40:58.320
what we saw especially in the shadow banning of people like dr jay bhattacharya was the
00:41:04.400
way that people inside twitter actively worked to make things untouchable to make people untouchable
00:41:14.960
to make particular viewpoints that have turned out to be very vindicated untouchable and therefore
00:41:21.680
profoundly shaped the public conversation about something like covid and the right way to respond to it
00:41:28.240
i think that is a really significant story michael i would say there's three areas i would say the
00:41:34.080
first had to do with the crackdown on misinformation bleeding into a crackdown on free expression which
00:41:41.920
i think you alluded to sam and i'll give one big example which is facebook under pressure from
00:41:48.160
the white house censoring accurate information emailing the white house to say we are censoring this
00:41:54.640
accurate information because we view it as encouraging vaccine hesitancy now they didn't exactly black it
00:42:02.000
out they repressed the spread of it but it is a form of censorship twitter did a milder version
00:42:08.560
of this with jay bhattacharya and with martin kulldorff who simply said not everybody needs to get the
00:42:14.640
vaccine and they put an interstitial on it which is a kind of warning thing saying official twitter
00:42:21.040
censors say that this is not right um that was a case where in the case of kulldorff he
00:42:27.840
was expressing an opinion and he is a harvard medical doctor not to stoop to credentialism but
00:42:33.360
he certainly i think had a right to weigh in on that question and then in the case of facebook there
00:42:37.840
was no transparency here and i should actually pause and just say that however much we disagree on many
00:42:43.600
things i've had the pleasure of being able to have an ongoing conversation with renee over the last few
00:42:47.840
weeks and we both very strongly support transparency i think i agree with renee
00:42:55.760
and others who argue that transparency would solve a lot of these problems if facebook had simply
00:43:01.360
said we are suppressing these views because we are
00:43:06.800
encouraging vaccines and we're going to allow this debate in some way there is no technical
00:43:11.520
obstacle to allowing that to occur so that's one number two is the laptop i think there is a very
00:43:20.720
clear pattern of behavior i cannot prove that there was an organized effort but nonetheless i think that
00:43:28.320
my thread on the hunter biden laptop shows that there was a very strange pattern and again maybe it was a total
00:43:35.680
coincidence for both existing intelligence community officials and former intelligence community
00:43:41.920
officials to pre-bunk the hunter biden laptop and we can get into the details of this but i think
00:43:48.720
suffice it to say i think it merits more investigation i strongly support congressional investigation on it i
00:43:53.920
don't think we've gotten to the bottom of it i find it extremely suspicious and i think other people do
00:43:59.360
too when they really look at it and again maybe i'm overly pattern recognizing here i hold
00:44:05.520
that as a possibility but i think there's there's something really interesting there that has to be
00:44:10.080
talked about more and then the third thing is just this grotesque inflation of the threat of russian
00:44:15.920
influence operations it was being used as a cudgel to basically start to de-platform de-amplify
00:44:26.080
censor demonize disparage discredit people that did not deserve that that's sort of what matt talks about
00:44:36.000
today and renee and i had an exchange about this over email but it's not just a
00:44:42.160
single thing i mean it was being used as justification for all sorts of things including censoring this laptop it
00:44:49.360
became a kind of boogeyman and you know i think one one thing i wanted to do on this podcast and say
00:44:56.400
very clearly is i do think that yoel roth turns out to be a more complicated character than i think he
00:45:02.960
had been perceived as in the beginning he repeatedly would point out that various things were
00:45:09.280
not violations including the thing that trump was de-platformed for he said very explicitly that
00:45:14.400
trump had not violated twitter's terms of service and they then worked to create
00:45:19.920
a justification for de-platforming him same thing with the hunter biden laptop they said that it had
00:45:24.720
not violated twitter's terms of service they were very clear on this and there were other
00:45:29.280
instances of it yoel roth was then basically overruled by the people above him so he
00:45:36.880
was a good company man but i don't think that the demonization of yoel roth that had occurred
00:45:41.680
perhaps earlier in the process of looking at what happened at twitter was fair but i do think that
00:45:48.000
this you know and i think i mentioned him here in this context because he was the one that was often
00:45:52.400
pushing back against the abuse of this russian influence operation you mean it wasn't fair when
00:45:58.960
elon branded him a pedophile in front of 120 million people no that was obviously wrong that was
00:46:04.800
obviously wrong absolutely 100 percent i have no hesitation denouncing that so renee feel free to
00:46:12.240
say whatever you want here but i would love to get your take on the russian disinformation piece
00:46:17.600
too yeah sure so i think that where i come down and you know michael and i have been emailing about so
00:46:24.800
many of these issues over the last couple weeks i i really come down in a place where i feel like
00:46:29.920
there are nuanced moments here and as we talk about for example yoel pushing back
00:46:35.360
against some of the things that happened content moderation is the story of people trying to make
00:46:40.000
the best possible decision in line with a particular policy that a company has written
00:46:44.800
and then some sort of sense of even-handed enforcement you know so you have the policy
00:46:48.720
and then the enforcement and these are sometimes two different things what you then have is
00:46:53.440
people in the most high stakes volatile situations trying to figure out what to do so what winds up
00:47:00.000
happening on twitter ironically is that all of these things are reduced down to do you think this
00:47:04.480
person is bad do you think that decision is bad if you think that's bad obviously there was some sort
00:47:08.640
of malice behind it and that i think is a flattening of what is actually happening there are some
00:47:13.680
interesting dynamics and uses of the word censorship that i've been intrigued by as we have moved through
00:47:21.680
the evolution of some of those policies over the last seven years and just to help
00:47:28.320
make sure the audience understands content moderation is not a binary take it down leave it up
00:47:33.280
so that is the i'll use facebook's terminology here they have a framework and they call it remove
00:47:38.400
reduce inform remove means it comes down reduce means its distribution is limited and inform means a
00:47:45.680
label is put up there is some sort of interstitial you know a pop-up comes up or there is a fact
00:47:50.400
check under it or youtube has a little like context label down at the bottom of the video
00:47:54.640
sometimes it'll take you to a wikipedia article so in that moderation framework remove reduce inform
00:48:01.040
when something is reported there's a policy rubric that says this thing may violate the policy and
00:48:06.000
then the enforcement whether to remove reduce or inform is based on some sort of internal series of
00:48:10.800
thresholds i am not an employee of these companies i don't know what those are so for me one of the
00:48:15.600
interesting things about the files has been seeing those conversations come to light and
00:48:19.760
my personal take on it my interpretation has been largely that you have people trying to decide within
00:48:25.600
the rubric of this policy what they should do so there were a couple policies that i think are
00:48:31.520
relevant to this conversation and what's just been said the first on the subject of the hunter biden
00:48:36.400
laptop was the creation of a policy following a lot of what happened with the gru there so when we
00:48:42.800
talk about russian interference i'll connect it to russian interference for you you and i spoke back
00:48:46.640
in 2019 about the work that i did on a particular data set for the internet research agency so that
00:48:52.080
is the sort of troll factory when people think about social media interference and they think about
00:48:56.400
trolls or bots the internet research agency is what they're thinking of but there was another component
00:49:01.200
to russian interference in the election which was russian military intelligence hacking the dnc and
00:49:06.720
the clinton campaign and then releasing files at opportune times for example to distract or change
00:49:13.600
the public conversation to make them cover these files i think the first tranche if i'm not mistaken was
00:49:18.240
dropped the day of that access hollywood pussygate tape coming out right so um the media is talking
00:49:23.360
about pussygate all of a sudden here's this tranche of secret documents media conversation changes so this
00:49:28.960
is a you know in response to things like this and also to hacked materials more broadly the platform
00:49:35.200
implements a hacked materials policy that says despite the fact that again journalists may have
00:49:40.560
a particular point of view about how to treat hacked materials the platform does not necessarily
00:49:44.720
have to share that point of view because sometimes hacked materials turn out to be sex pictures or
00:49:50.160
nudes that are sitting on your phone or a variety of other types of private conversations that get
00:49:54.960
dropped so this policy covers things beyond you know the contents of a wayward laptop from a
00:50:00.960
presidential son and so again they're not writing the policy for hunter biden's laptop they've written
00:50:05.920
the policy and then you see in the conversation them deciding whether and how to enforce it and
00:50:11.040
this is where the conversations with the fbi come into play again personally you know i felt like the
00:50:16.960
enforcement on the hunter biden laptop by twitter was quite foolish i thought this is one of these like
00:50:21.600
the horse has left the barn you know you're doing more you're creating more problems for yourself by
00:50:25.520
trying to censor an article in particular as opposed to the contents of the laptop itself right there's
00:50:31.120
one thing you can enforce your policy on hacked material by taking down the nudes that were going
00:50:36.240
up and saying that violates our terms of service without saying that also the new york post article
00:50:41.280
digesting the contents of the laptop violates the terms of service this is where you see some of the
00:50:45.760
debates about the enforcement there but the actually just just to linger on that distinction if i'm not
00:50:52.160
mistaken this was true when i last looked but perhaps something has come out since biden and
00:50:58.160
his team never asked for the story to be suppressed on twitter weren't they just asking for
00:51:05.360
the contents of the laptop like nude photos of hunter biden to be taken down so my understanding from when that
00:51:12.160
twitter files thread went out you know i and others went to the internet archive to go see what the
00:51:17.600
substance of those tweets had been and they were in fact nudes does that mean that they were all nudes
00:51:23.280
no because again we have a very particular filtered anecdotal view of what happened with regard to
00:51:28.720
those requests you know we're told the trump
00:51:32.480
administration requested takedowns the biden campaign requested takedowns and then we have a
00:51:36.560
list of like four or five different tweets and so that again is where depending on your
00:51:40.400
framing and your perception this was either egregious jawboning or somebody trying to get nudes taken
00:51:46.160
down off a platform but from what i have seen it was the latter but don't we think that the
00:51:52.320
scandal was the fact that twitter locked out the new york post yeah i'm not in
00:51:58.320
any way saying that i thought that that was a good decision that was what i meant when i said that the
00:52:01.680
suppression of the article was bad facebook did something different i don't know if you remember
00:52:05.840
what facebook did at the time but facebook actually used reduce and facebook said we are going to
00:52:12.160
throttle the distribution of the story while we try to figure out what is going on here
00:52:16.000
now the question of is that throttling censorship is a subsequent label censorship is where we've
00:52:22.800
really moved very very far in our use of that term in the context of social media moderation my
00:52:29.040
personal feeling on that very strongly is that it was political the first labeling-is-censorship
00:52:35.280
articles began when twitter began to fact check tweets by president trump it did not take them down
00:52:40.640
it did not throttle them it put up a label a fact check i think that's counter speech and
00:52:46.160
contextualization this is my personal view on it but we began to see again a flattening of the
00:52:52.480
conversation where remove reduce and inform were all contextualized as egregious overreach and censorship
00:53:00.640
and so where i come down on a lot of these questions is i recognize the complaint i acknowledge
00:53:05.440
that things were not handled well and i ask what do you want done instead yeah if you do not want the
00:53:11.600
label if you do not want the reduce and if you definitely don't want the takedown then is the
00:53:16.880
alternative simply a viral free for all at all times with every unverified rumor going viral and the
00:53:23.280
public being left to sort it out and i'm very curious about that particularly because journalism
00:53:28.960
is supposed to be about informing the public a recognition that journalists themselves
00:53:33.920
serve a filtering function serve a fact-checking function and you know we can debate
00:53:39.600
whether that's partisan or biased or this or that but there is i think a core
00:53:45.200
belief at the center of the profession that there is such a thing as the best information that we have
00:53:51.200
in this moment and how do we convey that in a particular information environment that's where i
00:53:56.320
think a lot of my work has been but you know i'll stop talking there because i think that the
00:54:00.240
complexities of content moderation are too often viewed as right versus left takedown versus leave
00:54:06.400
up they're really filtered through the context of the american culture war and this is a global platform
00:54:11.520
trying to figure out you know what the hell do you do when modi's government requests a takedown
00:54:14.480
you know these are the policies which elon just agreed to yeah you just capitulate
00:54:19.840
and hope no one notices all right so i just want to add something to what you said renee because it's
00:54:25.520
what people are reacting to people are acting like they want everything to just rip however
00:54:33.200
the algorithm sees fit and any curation is nefarious and yet we know we have an algorithm
00:54:41.360
or a set of algorithms that preferentially boost misleading and injurious information so the truth
00:54:50.160
is always playing catch up to the most salacious lies and if that's going to be the status quo there's
00:54:57.920
no way you build a healthy society and a healthy politics on top of that so i think anyone who
00:55:02.720
thinks about it for five seconds knows that they don't want that and therefore you have to get your
00:55:07.440
hand on the wheel at least a little bit and whether that hand is some other algorithm or it's
00:55:13.920
the actual conscious curation of monkeys you need to intrude on what we currently
00:55:20.000
have built and it comes back to how transparent those intrusions are and then what people make of
00:55:26.800
those efforts based on our divided politics and our tribalism and i think that the transparency
00:55:35.120
piece is you know the common ground and the area where we can actually move forward you
00:55:41.600
know google has an interesting one all the platforms have transparency reports most of them are aggregated
00:55:47.040
stats they're not particularly interesting google actually will say here's a government takedown
00:55:52.080
here is approximately the request you know here here's what we received here's what they asked us
00:55:55.760
to do and then here's what we did it's very you know very um one sentence summary two sentence
00:55:59.600
summaries but i really love that i think of that as like this is a best practice there's the lumen
00:56:04.880
database which does this for dmca takedowns which are usually companies uh sometimes others
00:56:10.000
requesting uh takedowns related to copyright violations again here is the request here is the
00:56:15.680
um you know here's what we did and i think that is an optimal path forward for saying you cannot have
00:56:23.440
a wholly moderation free environment every algorithm just speaking of curation has a weighting in some
00:56:29.120
regard there is no such thing as neutral there is no you know even reverse chronological is a particular
00:56:35.520
value judgment because they're weighting it by time and you can see this actually quite clearly
00:56:40.320
now on Twitter, if you look at the "For You" page, as I think they're calling it. Between "For You" and "Following," you see different types of things. You can go and look at a chronological feed, and you will see that "For You" is often bait: it's the most outrage-inducing content, you're going to click into this, you're going to go fight with that person, and that's great for the platform. The chronological feed is not necessarily as engaging; it's not necessarily going to keep you there. But it is a different mechanism for surfacing information. So what we're ultimately talking about here is incentives. It is a system of incentives, a system of judgments, and that is true of algorithmic curation as well as content moderation. And I do think the public does not actually understand the extent to which an algorithm deciding to curate and surface something shapes their behavior, shapes what they do next. This is the thing I feel like I'm trying to scream from the rooftops: it's not just about whether this person or that person is being censored. It's actually about what is being amplified, and that is potentially the far more interesting question as we think about how to build a system that vaguely mimics a public square.
You know, I've run the simplest algorithm over here, which is to delete my Twitter account, and it's impossible to exaggerate the effect it has had on my mind and life not to be on Twitter. I recommend it to anybody. And when I have checked back on Twitter, just to prepare for this conversation, I am just fucking aghast that I spent that much time there. It's a mortifying glance backward over the previous 12 years, even the good stuff. It's for the same reason that I'm not on TikTok now, or any of these other platforms: it would just be a time incinerator. When I look back at my engagement with Twitter, it's amazing to me. So there's something pathological, I think, about every variant on offer. That's not to say it would be impossible to build a social media network to everyone's benefit, but Twitter ain't it. And it's just very interesting to have unplugged. I've done podcasts about Facebook without ever being on Facebook, because it's of enormous importance to society, both what it does appropriately and what it does badly; hence this conversation. Michael, do you have anything you want
to insert at this point?

Well, I would come back to the three things I raised. The first was the need for transparency in content moderation, because there is some amount of censorship of justifiable opinions and accurate information going on; that's the big social media thing. The other two really have more to do with, second, the FBI, and whether we think it's an apolitical law enforcement organization. And the third is around this inflation of the Russia threat, which is not specific to social media at all but which I think is extremely important. We all know it's a terribly dangerous thing to underestimate a threat, but exaggerating a threat also has very serious problems associated with it, including the ability to abuse it, which we saw in the de-platforming and de-amplifying of people who were innocent; in other words, calling them Russian-influenced as opposed to Russian. So I think we need to get to the bottom of the FBI issue and the treatment of the Hunter Biden laptop, and also have a really honest conversation about this issue of Russian threat inflation. And if we all agree on transparency, that's great. I just think we should acknowledge that some of the misinformation, a lot of the misinformation, is coming from sources you might think of as not being sources of misinformation, and that sometimes it's innocent. We all thought that if you got the vaccine, you were either not going to get sick or you were not going to be transmissible. Both of those things turned out to be wrong. It seems to me that having a conversation about those edges of science is exactly what you would do on something we call a platform. And so I think it resolves a little bit if you say: look, if you're a platform, you have this incredible privilege, which is this limited liability, basically. But the flip side is that those platforms are also curating, and so you get yourself into a funny position: okay, well, then how do you resolve that? And it seems you have to resolve it this way: if you're going to have this amazing privilege of being a platform rather than a media content producer, then you must be transparent about how you're making those decisions, and there must be a place for people to appeal if they're being censored, or just throttled or reduced, or even if there's an interstitial. I have an ongoing conflict with Facebook about the cause of high-intensity fires in California. I have the top scientists in California saying high-intensity fires are caused by wood fuel load rather than temperature changes. I am not allowed to express that on Facebook. I have been so severely throttled by Facebook that it's basically a dead platform to me. The answer to my demand for an appeal to the fact-checker was: go talk to the person who censored you. It pisses me off, and it needs to be resolved.

If you'd said it was due to Jewish space lasers, you might have an open lane on Facebook. You've got some nice followers. I want to
drill down on a couple of points you just made. It seemed to me that the story was changing in real time as the Twitter files were dropping. Initially the story seemed to be that the meddling on the part of Twitter's ultra-woke employees was just horrific and horrifically biased. But then it seemed to shift, and it became more a picture of Twitter resisting an intrusive FBI, and actually resisting fairly professionally, or at least, I mean, they eventually caved, I guess, but they seem to have surprised some of you in how much they in fact did resist. So, viewing it from the outside, it seemed like the goalposts moved a little. First we were just meant to be astounded by what the back room of Slack looks like at Twitter, but now we're seeing that, no, actually, they were quite tied in knots over following their own policies, and were just really getting worn down by the FBI's requests. How do you view that? And then, I guess, the other piece I would add is that here we're obviously talking about Trump's FBI, right? Run by a Trump appointee. So that does kind of muddy the picture of frank political bias being the reason to suppress or pre-bunk a Hunter Biden laptop scandal.

Yeah, I definitely think that
my perception of what was going on changed over time. Of course, we were each only responsible for the threads that we wrote, so by the time I came in, it looked like Yoel was doing more pushing back. On the two issues I looked at and was involved in, the decision to de-platform Trump and the decision to "bounce" (that's the technical word) the New York Post account, in both cases Yoel and his team had decided there was no violation, and then they reversed themselves. Now, to some extent, and I think Renee articulated this a little bit before, the rules are evolving over time as the platforms deal with real-world cases, so you can't be too much of a purist about "we wrote the rules and we can't change them." But there wasn't transparency about how that was happening, and hence another reason for transparency. And in the case of the Hunter Biden laptop, I find Jim Baker's behavior extremely suspicious. He is clearly a deep anti-Trumper. He is the person Hillary Clinton's attorney, Michael Sussmann, came to, to share false information about an alleged Russian bank supposedly in communication with the Trump campaign, triggering the investigation. He then goes to Twitter and appears to have played a major role in reversing the initial conclusion by Yoel Roth's team that the laptop had not violated any of Twitter's terms of service and that the laptop appeared to be authentic rather than a product of Russian disinformation. Again, you get to a point with this stuff where you kind of go: I've done all I can in terms of going through the data. I've made as strong a statement as I can about this appearing to be a pattern of behavior, and appearing to be organized. And now I just think it's in the hands of Congress, which needs to get to
the bottom of what was going on over there, and they may never do it. But there was stuff going on, Sam, that was weird. Why in the world did the Aspen Institute do a tabletop exercise with the national security reporters from the New York Times and Washington Post, and the safety officers from Facebook and Twitter, to talk about a potential Russian hack-and-leak operation involving Hunter Biden? Why did they do that in August and September? Keep in mind that the FBI had had the Hunter Biden laptop since December 2019, and that they were listening to Giuliani's calls. I find the whole thing extremely suspicious. Now, of course, Hunter Biden was in the news, and that was what Trump got impeached over, yes. And it may be that I am again engaging in overeager pattern recognition, trying to see a pattern here that's not there; that's possible. But I do think there's something very strange going on that we haven't gotten to the bottom of, and we should. And the fact that Trump had his appointee there doesn't mean there wasn't potentially some effort by both existing and former intelligence community operatives to basically engage in a PR campaign, or what we used to call psyops, or what now gets called influence operations, to pre-bunk the Hunter Biden laptop, so that I personally, and my entire liberal Democratic family, and all of the Democrats I know, thought that it was Russian
disinformation.

Yeah, I mean, I didn't even really know that it wasn't until pretty recently. I just sort of assumed it was Russian disinformation; I didn't take it seriously.

So I think, partly, I kind of go: I know how I experienced that episode, which was to go, "I don't know, sounds like it was probably Russian disinformation." And it wasn't, and they knew at the time that it wasn't, and that seems to me quite important.

Well, let's just close the loop on that, if we can. This laptop has now been studied, certainly by many, many people right of center, for many months. Is there anything there that was important?

Oh my gosh, yeah.

Is there anything that ties back to his father that is a big deal?

Yes.

Like what? Well, I'm still in a bubble. I still haven't heard this story, apart from a single line in an email saying "I gotta give 10 to the big guy," and we assume the big guy
If you'd like to continue listening to this conversation, you'll need to subscribe at samharris.org. Once you do, you'll get access to all full-length episodes of the Making Sense podcast, along with other subscriber-only content, including bonus episodes and AMAs, and the conversations I've been having on the Waking Up app. The Making Sense podcast is ad-free and relies entirely on listener support.