#172 — Among the Deplorables
Episode Stats
Words per Minute
174.5
Summary
Andrew Marantz is a staff writer at The New Yorker and a contributor to Radiolab and The New Yorker Radio Hour. His work has also appeared in Harper's, New York Magazine, Mother Jones, The New York Times, and many other publications. He has spoken at TED, and he has been interviewed in many places, including CNN, MSNBC, and NPR. We are talking today about his book, Antisocial: Online Extremists, Techno-Utopians, and the Hijacking of the American Conversation. Andrew is more woke than I realized, and I was a little surprised at some of the lines he took through parts of this conversation. I can't tell if we disagree more than is apparent here or less; I'm going to guess more. I'll have a few more things to say about that in my afterword. Andrew's book is a fascinating read, and the discussion gets more contentious than I was expecting, which is part of what makes it a fun one. I hope you enjoy it, and if you do, please share it with a friend or colleague who might be interested. Thank you for supporting the podcast. And as always, if you actually can't afford a subscription, you need only send an email to support@wakingup.com and you'll be given a free account. -Sam Harris
Transcript
00:00:00.000
welcome to the making sense podcast this is sam harris okay brief housekeeping once again i'm
00:00:24.600
now adding afterwords at the end of these podcasts so i will save some of my remarks for there again
00:00:31.720
i'd like to urge supporters of the podcast to visit my website and go to the subscriber content
00:00:37.960
page and download the subscriber feed of the podcast if you are seeing a black making sense
00:00:47.520
icon in your podcasting app you do not have the subscriber feed you have the public one
00:00:53.960
and you will be missing some content and there'll be more of that kind of thing happening
00:00:58.920
very soon so you have been warned and as always thank you for your support
00:01:04.420
also many of you have asked whether the conversations that i'm now having on the waking up app
00:01:11.260
can be made available to podcast subscribers the answer is yes although we've decided to put them
00:01:19.080
on my website only not pipe them to the subscriber feed this is because they really are narrowly
00:01:27.620
focused on the topic of living an examined life meditation the nature of consciousness and many
00:01:35.820
of these conversations are just too specific for the podcast generally so i just don't want to hit
00:01:42.860
the average podcast listener who is supporting the podcast with these episodes in his or her feed
00:01:49.660
so if you want to hear them they will be on the website and you can listen in your browser and they will
00:01:56.520
soon be posted in the subscriber content section and of course if you really want to get into these
00:02:02.480
topics with me and you want to hear everything i have to say there there really is no substitute for
00:02:08.280
subscribing to the waking up app there's a reason why it's a separate app and as always if you
00:02:15.860
actually can't afford a subscription you need only send an email to support at waking up dot com
00:02:21.760
and you'll be given a free account okay today i'm speaking with andrew marantz andrew is a staff
00:02:32.260
writer at the new yorker his work has also appeared in harper's new york magazine mother
00:02:38.120
jones the new york times and many other publications he is a contributor to radiolab
00:02:44.520
and the new yorker radio hour he has spoken at ted and he's been interviewed in many places cnn
00:02:51.760
msnbc npr and we are talking today about his book antisocial online extremists techno utopians
00:03:00.680
and the hijacking of the american conversation and it's an interesting book and an interesting
00:03:07.140
discussion it gets more contentious than i was expecting about an hour in or so andrew is more
00:03:15.060
woke than i realized and we talk about all that i can't tell if we disagree more than is apparent in
00:03:22.760
this conversation or less i'm going to guess more again i was a little surprised at the line he took
00:03:30.980
through parts of this conversation this is becoming an occupational hazard anyway i'll have a few more
00:03:37.380
things to say about that in my afterword and now i bring you andrew marantz
00:03:43.620
andrew thanks for joining me on the podcast thank you thanks for having me so you've written a
00:03:53.380
fascinating book which is a really fun read uh the book is antisocial online extremists techno utopians
00:04:02.020
and the hijacking of the american conversation before we jump into your book give me a potted bio for
00:04:10.020
you how do you describe your career as a journalist uh it's a good question well first uh i think some
00:04:17.060
people might have a hard time believing that it's a fun read but i appreciate you saying it because
00:04:21.620
although it is dark subject matter i did try to find some of the dark comedy in it yeah yeah
00:04:28.340
there's a lot of that yeah no i'm glad that landed because you know you can't just be bleakness from
00:04:33.860
start to finish even though it does get bleak sometimes i guess i would say i graduated from
00:04:40.100
college in 2006 uh moved to brooklyn and became a freelance journalist because that's what all my friends
00:04:47.220
were doing basically and you know for other reasons too i wanted to learn about stuff and pursue truth
00:04:55.060
without sort of boxing myself into one academic discipline exclusively for the rest of my life and
00:05:01.220
wanted to try to write beautiful sentences that also spoke to true things in the world and you know all the
00:05:09.620
sort of cliche reasons one becomes a sad young literary man in brooklyn and um ended up freelancing
00:05:17.700
for you know harper's new york magazine mother jones a variety of places and did a master's program
00:05:25.460
where i met someone who was leaving an editorial position at the new yorker and told me that one was
00:05:31.300
opening up and it was a kind of entry-level bottom of the food chain sort of job there but i just was so
00:05:37.620
impressed with everyone i met there that even though i wasn't sure i wanted an editorial entry-level job
00:05:44.260
and i was in some ways just happier being a writer i uh i took that job and then i sort of was an editor
00:05:51.700
and a writer for six or seven years before going back full-time into writing mostly for the new yorker
00:05:58.500
and for this book it's interesting so your book is fascinating because essentially you embedded
00:06:04.260
yourself among the deplorables and you know so it's really a report from the front when did you
00:06:11.460
actually start reporting for the book uh there's kind of different ways of answering that i mean
00:06:16.500
in one sense the idea for the book really gelled once trump entered the picture you know on that
00:06:23.140
on that escalator that we all remember from june 2015 in another way the preoccupations of the book
00:06:29.300
predated that you know i was reporting on clickbait factories and the kind of degradation of
00:06:36.500
online media since 2014 and so it was kind of a set of preoccupations i had already had and
00:06:44.820
then once trump and trumpism entered the picture and then from there you know various kinds of trollery
00:06:51.060
and misogyny and white nationalism and stuff it all kind of congealed into something that i felt really had
00:06:56.820
to be a book but the underlying concerns i think had been with me for a while they were just kind
00:07:01.860
of inchoate and hard for me to even really put a name to so before we we actually begin to walk
00:07:08.500
through your adventure in the book and and touch specific topics like social media and fake news and
00:07:16.900
gatekeeping and trump and how the press deals with him and there's just there's a lot to cover here but
00:07:23.860
what's interesting to me is that many of us have been isolated in a kind of liberal scare quotes
00:07:30.820
elitist bubble and this book is really a kind of breaking of that spell as to what it's like to fully
00:07:38.500
embed in this culture of reaction in their own terms to elitism and your book offers some considerable
00:07:49.220
testimony to what has been happening but i do have a concern that as we analyze this we are very
00:07:58.100
likely to be importing the continued liberal confusion into that context and misunderstanding
00:08:05.940
things and so i feel like this is now a concern that i've expressed on multiple
00:08:11.140
podcasts i feel like there's the prospect of either exaggerating the problem of things like
00:08:19.620
white nationalism for instance uh and sparking a kind of pendulum swing into moral panic i mean i
00:08:25.460
certainly see that on the left i see that especially clearly because there are people on the left who
00:08:30.340
think i'm a white nationalist which is completely insane so as we walk through this at various points i'm
00:08:37.620
going to want to question whether or not the way you see the data you were confronting the data
00:08:45.620
being these conversations with people is the only interpretation to which it's susceptible so
00:08:51.700
with that caveat let's just wade in and well just to add on before we get going i think
00:08:57.620
that's all stuff i'd be interested in exploring in the conversation i guess one thing about being
00:09:02.820
blindsided and being in a in a you know elitist bubble and all that stuff it's sort of fully yes
00:09:08.580
and also no i mean on the one hand i do find it sort of inexplicable that trump could have any base
00:09:16.020
at all in this country on a kind of like a priori level on another level i you know was the guy sort
00:09:22.740
of betting my friends that he would win because i had that read of the political landscape even though i
00:09:29.140
was sort of you know incredulous about it i still did think it would happen and there was kind of a
00:09:36.500
lull in you know october 2016 when i sort of finally accepted okay i guess all the polls can't be wrong
00:09:44.100
but up until then i did sort of have a sense that it was not only possible but at many points
00:09:51.780
probable so i definitely own the uh you know latte sipping glasses wearing brooklynite
00:09:59.620
label that i very clearly wear but i also you know i think there there are bubbles and there are
00:10:05.460
bubbles and yeah i think it's possible to see out of them and in some cases all you have to do to see
00:10:11.460
out of them is just you know do a google search or a facebook search and and there it is so i definitely
00:10:16.420
everybody has their biases and i have mine but um i do think that the sort of liberal elites who can't
00:10:22.660
even believe that trump you know misspelled a word on twitter that kind of you know caricature you know
00:10:28.660
we get carried away with that sometimes i think yeah i actually had forgotten that part of your
00:10:32.500
book where you detail your impressions with respect to trump so i was more in the bubble than
00:10:38.820
you were except i was struck by the detail you flagged that the new yorker had not prepared a cover
00:10:47.220
for a trump victory they only had a clinton cover which was yeah fairly amazing yeah and there is
00:10:52.420
a degree to which and we can get into this too but there's another kind of bifurcation here with with
00:10:58.740
regard to the new yorker because in a sense the new yorker is kind of a minor character in the book
00:11:02.900
and there's a way of reading what i do as a kind of you know parody of the new yorker's you know
00:11:10.900
insistence on putting accents on you know the word elite or you know you have the diaeresis or whatever
00:11:17.940
and like that stuff is fun to mock and i'm happy to mock it lightly on the other hand there is this
00:11:24.100
sort of strange almost reversal that i experienced where my natural tendency is to be pretty anti-authoritarian
00:11:32.980
and contrarian and anti-establishment in many ways that's kind of my natural instinctive tendency
00:11:38.820
and yet i find myself kind of coming to this from within the kind of inner sanctum of elite american
00:11:46.260
journalism and i guess there's a lot to say about it we can explore many angles of it but i guess
00:11:50.660
there are just different kinds of elitism and many of them most of them are obviously bad and i think
00:11:57.620
that from my experience of the new yorker from being inside it it actually doesn't subscribe to that
00:12:04.820
bad kind of elitism nearly as much as i would have expected i mean i expected a lot of snooty
00:12:11.380
elbow patch wearing you know all the stuff you see on the family guy or something i didn't that that
00:12:18.180
hasn't been my experience it has however been my experience that there's a lot of um there are a lot of
00:12:26.180
discriminations being made you know with regard to which piece is better than another piece or how a
00:12:32.180
piece should be structured or which arguments withstand scrutiny and so it is um hierarchical
00:12:39.300
in that sense you know it is a gatekeeping institution in the sense that it takes great
00:12:44.100
care to decide which things to publish and which things you know not to but so i mean that's just to
00:12:50.340
sort of mark that as to the extent that that is um hierarchical you know there are different ways of
00:12:57.220
being hierarchical some more arbitrary than others yeah well i once wrote an article titled in defense
00:13:04.020
of elitism for newsweek and then jon meacham helpfully retitled it when atheists attack i still
00:13:12.180
think yours is more clickable but uh yes maybe more meretricious right yeah i was i was going after
00:13:17.860
sarah palin back in the day so um let's start with social media i mean you you make this so i think
00:13:25.940
many of us feel that social media is somewhere close to the root of the problem you know that coupled to
00:13:31.860
the advertising model for digital media and the primacy of clicks you point out in a
00:13:38.500
recent new yorker article i don't think you make this point in your book but that you know the
00:13:42.740
gutenberg revolution unleashed similar problems right i mean the you know the printing press had
00:13:49.140
had liabilities in that it allowed for the amplification of misinformation and martin luther
00:13:54.420
is often celebrated as a sign that you know the printing press enabled the reformation but as you
00:14:03.300
point out it also allowed him to spread his murderous anti-semitism and ushered in something
00:14:08.980
like a century of religious conflict but it seems that there is something special about
00:14:15.620
the time in which we're living and this notion upon which facebook and these other companies are
00:14:21.860
founded that linking people digitally is just an intrinsic good that hasn't really survived contact
00:14:29.780
with reality these few years yeah it it unfortunately survived i think a few years beyond where it was
00:14:36.340
plausible and that extension of a few years was enough to in part give us trump and brexit and
00:14:43.940
bolsonaro and modi and you know duterte i mean i could go on but but yeah i think you're right it's it's
00:14:50.980
the worm has kind of turned on that one i think you know in the space of you know just the few years
00:14:56.660
between when i started embarking on this project and when i'm putting it out into the world i've been
00:15:01.220
shocked at how much public opinion has swung from in my view one sort of extreme to the other
00:15:10.580
and i think that's helpful i don't think that it's i don't think we're all the way there yet in terms of
00:15:17.220
nuance and understanding you know obviously there's a lot of helpful stuff bundled together with a lot of
00:15:23.700
really dangerous stuff and i don't think on a large scale we've really teased it all out yet i guess to
00:15:29.220
your to your point about whether it's different i think yes it's different in the sense that well
00:15:34.020
two things one i think because we relied on a really unrealistically oversimplified idea of what
00:15:43.540
liberatory technologies do because a lot of the you know young men who started
00:15:50.100
these social media companies were just sort of assuming that their technologies would be like the
00:15:54.900
printing press and that the printing press essentially did nothing but help us move toward
00:16:00.500
progress and democracy and all the rest of it they had an opportunity if they had had a more nuanced
00:16:06.980
view of it to build in protections right from the beginning and they didn't do that and so they set
00:16:13.780
themselves up for more pain than was necessary i also think that even though the early publishers and
00:16:22.420
printers in you know renaissance europe were ambivalent about their status as gatekeepers they did come to
00:16:31.140
accept pretty quickly that they had that role and responsibility and the social media founders worked
00:16:38.740
really hard to deny that they were gatekeepers to deny that they had any curatorial responsibility
00:16:44.500
to deny that they could be held accountable for what happened on their platforms in some cases they're
00:16:48.580
still trying to deny it again without without any plausibility in my view so yeah i think it's the
00:16:54.580
combination of these massively powerful tools with all kinds of denials of the idea that the tools could
00:17:01.380
have any negative impact and that if they do have any negative impact well we're just not responsible for
00:17:06.420
it um i think that is unique plus you know now we have nukes and climate crisis and you know just
00:17:14.660
things that they didn't have to deal with back then do you think the gatekeeping problem is soluble
00:17:20.900
i think it can be improved i don't have a perfect solution in mind but i think one key thing
00:17:27.780
is for the engine of virality to be moved away from what it currently is which is
00:17:37.300
what i call activating emotion or i don't call it that the scientists that i cite call it that
00:17:41.380
i think that's one big thing you could do where instead of the current system which measures
00:17:48.900
engagement and engagement is measured by proxies for you know essentially things that increase your
00:17:56.100
galvanic skin response you know anger lust laughter you know there's just these very kind of
00:18:02.500
animalistic behavioral responses if you moved away from that as the coin of the realm and moved
00:18:08.420
into a more you know balanced system where those emotional reactions were mixed with other
00:18:15.060
kinds of reactions you know more slow brain kinds of reactions more pro-social reactions to use the
00:18:21.860
mirror image of the book's title that would solve a lot the problem is it's really hard and it might
00:18:26.980
make the companies a little bit less money yeah well i mean there is just this problem which you cite
00:18:32.660
in the book that fake news consistently spreads faster than the truth and it is because we've
00:18:39.380
optimized for these activating emotions we've created essentially a quasi-darwinian system
00:18:47.140
that selects for outrage and misinformation this is a machine for generating controversy and you could see how
00:18:55.140
you might tinker with the settings there but it may just be a fact of human nature that the lurid
00:19:02.820
incredible terrifying and divisive is stickier than something that tells us that people are mostly good
00:19:11.780
most of the time and that you know order is progressing yeah i i would agree that the lurid and the
00:19:18.420
you know that those things are stickier inherently i guess my hope would be that you could build a
00:19:26.900
system that doesn't just privilege those things i mean we tend to assume that what we see on social
00:19:32.660
media is just kind of a flat reflection of the popular will or that you know because we're seeing a
00:19:38.820
lot of something that means that you know a lot of people want that thing to be out there and you know
00:19:44.980
that it's just sort of a flat reflection of democratic urges and desires i guess on one kind of
00:19:53.140
immediate level that's true because there are people or bots in many cases you know clicking on the
00:19:58.100
things but in another sense it is you know there's a system that is conditioning people to behave a certain
00:20:05.700
way so you know it may have always been the case that lurid and false and you know sensationalist
00:20:11.620
things got more attention but you know if you were a producer of a newspaper let's say yeah there was
00:20:17.780
plenty of yellow journalism there were plenty of penny papers and partisan presses and all the rest of it
00:20:22.180
but they also had a sense of shame they also were you know members of a society that could be made to
00:20:28.980
feel that they should stop you know goading society into war because their buddies would look down on
00:20:35.380
them if they did it or that you know they would do it for the beginning part of their career and then
00:20:39.300
try to look much more high-minded for the latter part of their career like joseph pulitzer did so
00:20:44.980
i guess you know i do think that human urges can be pushed back against when you have people in
00:20:52.580
charge of the systems who are willing to try yeah i guess i'm just i'm sympathetic with this distinction
00:20:57.460
that every social media overlord wants to draw between a publisher and a platform and you know they
00:21:05.860
consider themselves the latter and therefore have more or less um happily abdicated any curatorial
00:21:14.580
responsibility that you or i would want to assign to them and i do understand that just because of the
00:21:21.060
sheer scale of the problem and but for the fact that we might invent some ai that is truly competent
00:21:29.860
you know and doesn't make egregious errors you know just censoring you know normal republican senators
00:21:35.300
as you know neo-nazis or whatever the failure would be there's a difference between you know you
00:21:40.900
the new yorker making a decision about who to publish and what sort of views to amplify or me making a
00:21:47.220
decision about who to speak with on this podcast i think those kinds of decisions have to be made
00:21:53.700
responsibly and i take it seriously the concern about whether it makes sense to give
00:21:58.980
someone a platform or not you know and there are many people i've you know there are people you
00:22:04.020
spoke with in your book who've asked to be on the podcast and who i've decided essentially never to speak
00:22:09.700
with because i think they are at least beyond my overton window and even there i'm i'm somewhat
00:22:15.780
conflicted because part of me you know i'm somewhat idealistic about the prospects of just shining
00:22:21.860
sunlight on bad ideas and insofar as i think i'm capable of doing that it's tempting to do it with
00:22:28.180
any bad ideas of consequence but um i don't know when you when you talk to someone like uh you know
00:22:34.820
jack dorsey or anyone who's running a platform of this kind it's hard to see how they can ever
00:22:42.180
curate this correctly you know in a way that doesn't cause more problems than it's solving because
00:22:49.780
there's just so many casualties of their inept efforts to curate i mean there are people who have
00:22:55.060
received lifetime bans from twitter for saying things like men are not women in that case that
00:23:00.900
was considered hate speech against the trans community i mean say more about your sense of
00:23:07.140
optimism that we should even move in that direction but what if the technology on some level just doesn't
00:23:12.180
allow for this to be done in a way that is going to solve this particular problem yeah there's a lot in
00:23:18.660
there and i i so i've i'm rarely accused of being an optimist so i appreciate that i don't think i'm
00:23:24.820
very optimistic about their ability to do it certainly not their ability to do it flawlessly
00:23:29.060
you know we can get into specific cases i think you know my instincts on the trans stuff might be
00:23:35.860
different than yours but just you know we can take any number of examples but i do take the larger
00:23:42.340
point that there's problems of scale it's never going to be perfect and that in some cases the
00:23:47.540
medicine could be worse than the disease when it comes to banning people or kicking them off i mean
00:23:52.260
i get that in principle and i am definitely sympathetic to how hard it is for someone who's in
00:23:58.740
charge of one of these platforms to make these very tough decisions i don't think i have all the
00:24:04.740
answers or that if i were in charge i would know exactly what to do i guess we kind of have a different
00:24:10.820
starting point though where it seems like you might be starting from the you know the system as it
00:24:15.540
exists and then saying well how can we expect them to know exactly who to ban when and i guess
00:24:21.700
i would start a few steps back in the causal chain and question how the system was built in the
00:24:26.580
first place i think by the time you get to the point of deciding who to ban or who not to ban it's
00:24:31.540
it's in a sense too late and you're dealing with symptoms instead of root causes i mean just because
00:24:36.660
you brought up the example of jack dorsey he's been going around for the last six months or so saying
00:24:41.540
well i think it might help if we you know got rid of follower counts i think it might help if we
00:24:46.500
didn't incentivize likes and follows and and shares as much as we incentivized other things
00:24:51.300
he hasn't done any of that i'm not sure why he hasn't done it if he thinks it would help but
00:24:55.460
i mean i think i do know why because it would make the company less profitable and its
00:25:00.100
profitability is a big question mark right now but that's just one example of how there are
00:25:05.380
structural changes that can be made that are way above the level of do we ban this account
00:25:10.820
or not and i think what that gets to is the notion that these things were not built with this stuff in
00:25:15.860
mind you know facebook is right now working on abdicating some of this responsibility but they're
00:25:20.820
doing it by building a so-called supreme court of facebook and that is not an idea
00:25:26.660
they would have entertained even three years ago so there are different ways of getting around
00:25:31.300
this problem it's not just to curate or not to curate yeah it's interesting to consider what would
00:25:38.660
happen if we could curate perfectly let's say it's an error-free you know nazi detector but then
00:25:46.020
you still have the question of when these platforms become more or less identical to internet infrastructure
00:25:54.420
which you could argue that some of them already are you know and so you could draw
00:25:59.860
an analogy to something like the phone company right so let's say that the phone company could
00:26:05.860
perfectly detect when people were spreading you know nazi ideas in their private phone conversations
00:26:14.260
should the phone company just shut down those accounts that's sort of the territory we're in
00:26:19.460
if we actually could do this well just to address the point you just made i mean i guess the analogy
00:26:26.580
i would use is that the phone company would be more analogous to you know crowdstrike or not
00:26:33.140
crowdstrike that's the thing in the ukraine complaint what's it called cloudflare cloudflare yeah exactly
00:26:38.180
that there would be deeper layers of internet architecture to analogize the phone company to twitter
00:26:43.460
would be to imagine that the phone gives you extra points every time you say something you know really
00:26:51.220
exciting or you know something that really riles people up that you know when you're on a conference
00:26:57.140
call if you call someone a douchebag or something you know you get 15 extra points like you're in a video
00:27:02.020
game if the phone were tilting the playing field in that way then the phone yeah i think would have
00:27:06.820
more of a gatekeeper analogous responsibility because the phone would be affecting our behavior in a in a
00:27:12.740
proactive way okay well let's jump into the book properly so you really go behind the scenes with
00:27:20.660
a fairly motley cast of characters none of whom i've met in person i think but some of whom i've
00:27:27.940
had various skirmishes with online but there is this larger issue which you know we should touch on
00:27:34.180
which is just this guilt by association algorithm that is running on the left i mean this really is a
00:27:41.540
problem of the left where if you talk to someone you know who's ever spoken to someone who has spoken to
00:27:47.780
someone who's a nazi you're a nazi right no one can survive that scheme you have spoken to someone
00:27:56.420
who's spoken to someone who's spoken to someone who's a nazi and well i've spoken to all of them so
00:28:00.740
yeah but you i mean you have spoken with them you could argue that however amicably you're
00:28:07.780
actually endorsing them and their worldview or to some degree doing that but no one is going
00:28:14.180
to argue reading your book that you gave these people platforms or at least i wouldn't expect
00:28:20.260
that would be a common charge i would hope not yeah i think that's right i i set out to see them
00:28:26.740
through a critical lens from the beginning and there is a confusion that i think a lot of journalists
00:28:32.900
have and a lot of the public has about journalism and it's a good faith confusion it's not easily
00:28:38.260
resolved between being unbiased or objective or any of those words and being someone who you know
00:28:47.060
takes in the evidence of your senses and more and more of those things seem to be at odds so there's
00:28:52.020
a way that i could have approached this project where i could have said well i'm just gonna quote what
00:28:57.860
they say and i'm just gonna kind of you know transcribe it and be a stenographer to power and that's
00:29:04.820
that or i could have done what i did do which was be really critical and in some cases really acerbic
00:29:11.540
and mocking which i think was deserved and in some cases necessary but you can't really do both i mean
00:29:19.700
you can't always be both even-handed and tell the full truth there's a difference there between me using
00:29:27.060
the material i've gathered to tell the story i want to tell and you know handing the microphone to
00:29:32.100
someone to tell the story they want to tell i think that's a meaningful distinction does having
00:29:37.220
hung out with these people in person noticeably corrupt your objectivity with respect to how you
00:29:44.980
portray their ideas or i mean do you think you're less combative in your treatment of them and
00:29:50.740
their ideas for having you know broken bread with them and shared long car rides and all the rest
00:29:56.340
yeah it's a good question i can't know for sure i've only run that experiment once i don't
00:30:02.340
have a control group but i think you probably do have a control group in that there are other people
00:30:06.260
you cover who you presumably don't meet face to face even and you know what that's like right that's
00:30:12.740
a good point yeah i mean one thing i do try to do it's not just a sort of straight ahead taxonomy of
00:30:19.700
shitty people on the internet you know i do try it's sort of using them as fodder to tell a larger
00:30:24.980
story but i do try to take care to taxonomize to the extent that you know i don't want to run
00:30:31.860
together differences and conflate differences you know some of them are nazis some of them are not
00:30:37.620
some of them are white nationalists some of them are not some of them you know are just kind of die
00:30:43.300
hard trumpists who i find absurd because they hold that opinion but i actually am fine with talking to
00:30:51.220
in all other respects and some of them i just find kind of skin crawly and creepy all the way through
00:30:56.340
so it is case by case now the question of whether you know the case by case accounting is different
00:31:02.820
for cases where i've hung out with people versus not i think it's hard to say i mean the kind of
00:31:07.860
journalism i do mostly requires just really being a fly on the wall for long periods of time and there
00:31:13.940
are some people who do that kind of longitudinal immersive style journalism who just don't do it
00:31:20.660
about people they don't think they're gonna like i mean i have colleagues and friends who say i don't
00:31:25.140
want to write a profile of someone unless i am reasonably sure that i'm gonna enjoy their company
00:31:30.740
because i don't want to spend my time in a combative environment and i also just don't want to take
00:31:37.140
on the responsibility of writing a really sharp critical piece in the end and i just would rather
00:31:43.540
write an admiring piece now obviously i'm not one of those journalists but i i was always on guard
00:31:49.060
against the possibility that they were playing me or that they were using their time with me to try to
00:31:53.860
subtly lobby me toward a more flattering picture of them in some cases i think there was no danger of
00:32:00.260
that such as the cases of you know me a jewish journalist talking to a professional anti-semite
00:32:06.820
there was very little chance that i was gonna see eye to eye with that person and yeah in the end even
00:32:13.700
in cases where there wasn't that direct of a conflict i don't think that i was hoodwinked by any of them and
00:32:22.180
at the same time i don't think i overreacted to that or overcompensated by trying to go harder on
00:32:28.580
them than was merited but i mean it's really not for me to decide yeah there is an interesting effect
00:32:35.380
of you know compassion creeping in for better or worse where i mean just as the reader i mean you
00:32:41.460
you know the one of the more odious characters you talk about is this guy mike enoch who i knew nothing
00:32:49.620
about that's his pseudonym what's his real name again uh peinovich peinovich mike peinovich but when you
00:32:55.460
get the details of his childhood and his life it's pretty easy to see that there's a
00:33:02.420
psychological explanation for at least you know some of his obsession with these ideas and and you
00:33:09.380
know the misuse of his own mind i mean he's he's a smart guy who's spending all his time being an
00:33:15.860
anti-semite yet married to a jewish woman or i guess no longer married to a jewish woman once
00:33:22.420
she discovered the nature of his podcast but i mean the whole thing is so depressing you know that
00:33:30.420
it's hard not to just see him as a casualty of something i mean it's like his own agency
00:33:36.980
kind of erodes and you just see you know but for the fact that he was you know fed an endless
00:33:43.860
supply of prednisone because he had such horrible eczema as a child things might have been different
00:33:49.220
and um it's those kind of details which if you're just dealing with the ideas you know i just find
00:33:55.060
if i'm just reacting to someone because they're putting out terrible memes that's one level where i
00:34:01.780
can just deal with the ideas and i can be as uncompromising as i can be but then if you've hung
00:34:07.700
out with someone and gotten a sense of their humanity and all of the exculpatory or potentially
00:34:13.620
exculpatory influences on them you come away sort of not knowing how harshly to judge them as a
00:34:20.180
person i felt the same thing with cernovich frankly i've never met cernovich he's attacked
00:34:25.220
me a bunch online and i you know responded in kind a little bit but then i just sort of got more of a
00:34:30.740
sense of you know how complicated it was to be mike cernovich and i just couldn't keep it up
00:34:38.260
anymore you know it just seemed like all right this is just it's not worth interacting in a hostile
00:34:44.420
way with this person at all and yeah so i just don't know if you felt that in your reporting
00:34:49.620
or not because as a reader i i felt it you know meeting some of these guys it just felt like it's
00:34:55.460
uh you know i wouldn't want to trade places with any of these people so um how harshly am i going to
00:35:01.060
judge them i think you're right that spending a lot of time with these people both as a reporter and
00:35:10.740
for you as a reader it does change and deepen the way you see them and that was part of my goal i
00:35:17.460
think it's tricky because you don't want to let people off the hook for their terrible behavior and
00:35:24.260
there's a really fine line between empathy and excuse or you know you use the word compassion
00:35:31.460
i don't know how many layers deep we want to go it's kind of deep in my ethos that i try to have
00:35:37.780
radical compassion for everyone i try to have compassion for donald trump who's obviously
00:35:42.900
suffering from one or more personality disorders and who it would be easier to have compassion for
00:35:49.140
him if you felt that he was actually suffering right but you're using suffering in a different
00:35:55.380
sense because i mean we're suffering from his neurological disorders but he doesn't appear to be
00:36:00.340
suffering from them you know i think in the first few minutes before he can actually get the tv to turn
00:36:04.900
on in the morning i think he probably experiences immense suffering right but i you know obviously don't
00:36:09.460
know and in a sense it doesn't matter right because what i really try to do and i'm not i'm not saying
00:36:16.420
that i'm you know able to do this always i'm not some christ-like saint or anything but i do think
00:36:22.260
on some deep level the goal is to try to have empathy for everyone even the worst people now that
00:36:30.900
obviously doesn't mean that you excuse what they're doing and every fiber of my being thinks donald
00:36:36.980
trump is a bad dude it's just like what do we really mean by bad we mean that he behaves badly
00:36:42.740
he's bad for the world he's bad at his job you know you can go down the list but does it mean
00:36:48.660
that he is condemned on every level that he you know is a soulless creature who's not a human being
00:36:56.740
you know i mean if you really really want to get down to the core of it a part of it has to do with
00:37:04.100
you know you mentioned the concept that things might have been different in these people's lives and
00:37:07.940
one of the deep sort of concepts that i'm wrestling with in the book is this concept of contingency
00:37:12.580
and how history might have been different and people's lives might have been different and that
00:37:16.820
yeah there is this kind of deep existentialist effect of a kind of you know giant pachinko machine that
00:37:24.140
we're all in but i think the key and i didn't expect to be talking about this kind of stuff but i think
00:37:30.400
it does get to that deep level pretty quickly because i think the key is to try to hold at once
00:37:35.880
the sort of existentialist absurdist notion that nothing is predetermined and that we're not on a
00:37:41.500
automatic track toward progress and redemption while also not you know becoming nihilistic and
00:37:48.700
feeling that life has no meaning and so part of how that for me applies in this case is to think that
00:37:55.380
on some level of course people need to be held accountable for their actions and of course there's
00:37:59.500
a massive moral difference between being a professional anti-semite and being a professional
00:38:06.360
nurse or bus driver or you know i mean like there are differences in how we act in the world and
00:38:13.080
they're immensely meaningful but i really struggle with saying that the deepest and most
00:38:22.160
complete explanation we can give for someone who does bad things is that they're a human dumpster
00:38:27.440
fire and that's the only thing we have to say about them i actually think it's more incumbent on us
00:38:32.480
again not to excuse not to look away but to actually understand the complexity of it and in no
00:38:38.700
way to say oh if you grew up poor or if you had eczema therefore you can do whatever you want absolutely
00:38:44.740
not or that we have to agree with you know we have to go from 100 percent condemnation of their behavior to
00:38:50.800
90 percent it's not that at all it's just that on some deeper level you know my wife has been a public
00:38:57.220
defender in the past and their sort of ethos is you're not the worst mistake you ever made and
00:39:02.800
it's really really hard to apply that to nazis trust me it's not instinctive it's not intuitive and
00:39:08.860
again i don't claim to be some gandhian figure who just naturally intuitively does that i mean nazis
00:39:14.640
make me upset they make me angry i i get why people want to yell at them i get why people even want to
00:39:20.160
punch them and i don't claim to be above that i just think it's not the only place to land and i also
00:39:26.200
think it doesn't help us understand anything i mean there are different projects right there's one
00:39:30.940
project that is about fighting the ideas which is valuable and there's another project that's about
00:39:36.280
diagnosing and understanding where they come from i think they're both necessary well there really is a
00:39:41.620
problem of understanding what's going on because in addition to having nazis out there and you know
00:39:49.340
extremists of various types we have this other problem this layer that is built you know around
00:39:57.060
it on it you know somehow interacting with it of what we might call troll culture and there's just
00:40:03.860
this new style of insincerity or apparent insincerity or you know irony usurping every other value
00:40:12.780
which creates a problem of assessing what people actually believe and intend or you know even if
00:40:21.380
you do grant that people should be taken literally even in these contexts it's hard to know just how
00:40:29.160
committed they are to these specific ideas there's a culture of just deliberate obfuscation around this
00:40:35.520
where you know as you report there some of these people are you know i think this was i forget which
00:40:40.720
website this was but you know it was explicit that they wanted it to be hard for the normies to tell
00:40:48.040
whether or not they're joking right right and contained in that is the implication that you know most of
00:40:54.480
the time we're not really joking right or you know we're not joking about some of the worst stuff how do
00:40:59.640
you think about troll culture and what should be the appropriate response to it because the response i'm
00:41:07.060
seeing more and more in the mainstream media and on the left is just taking the worst possible
00:41:16.020
construal of everything as the literal truth of everything yeah and i i get where that impulse comes
00:41:23.840
from and you know look it's really really complicated i mean i say a couple times in the book that
00:41:30.160
trolls set this ingenious trap right because if you're a good troll and you know i think the president is
00:41:36.880
good at very few things but i think trolling is definitely one of them if you're good at it you
00:41:41.180
don't leave people any good choice right if you pay any kind of attention to a troll you're letting
00:41:48.400
them win because what they want is attention if you let their views or putative views or offensive jokes
00:41:56.740
or ironic whatever go unchallenged then they also win so it's a kind of trap and i don't think we've
00:42:05.460
figured out a good way out i think you know i have a little part of the book where i'm at the
00:42:13.160
white house briefing room and um i'm there with uh this kind of you know he's essentially
00:42:21.480
an insurgent in a dirty culture war who is acting as a white house correspondent for
00:42:28.980
the gateway pundit so i'm there kind of shadowing him and sort of seeing how far he can go
00:42:36.780
he's essentially just performing he doesn't actually ask questions or intend to ask questions
00:42:42.580
he's just there to kind of act out the degradation of the norm of the press briefing room being
00:42:50.100
meaningful at all and right while he's there but nonetheless he and others in this vein were just
00:42:57.640
adorably excited to have been granted press credentials in the first place absolutely you
00:43:02.320
know so they're subverting it as in the norm of this institution is like this is just a worthless goof
00:43:07.860
and yet this is the biggest day in my life that i get access to the white house totally i mean yeah
00:43:13.200
you see that all over the place you see that with all kinds of reactionaries and proto-reactionaries
00:43:18.260
and wannabe authoritarians that you know our whole system is meaningless and you know should
00:43:23.580
be consigned to the waste bin of history and yet as soon as i have any power within it i'm going to
00:43:27.880
flaunt that power to the maximum not that these guys were really reactionaries in the sense that
00:43:33.040
they had a consistent ideology but just that their impulses run in both directions but while i was
00:43:38.580
there with him a few of the real reporters who were there called him out and sort of confronted him
00:43:45.440
on camera or you know everything is on camera these days because someone just holds up a phone
00:43:49.740
and they wanted to nail him to the wall they wanted to nail him on having a view that was
00:43:55.960
inarguably beyond the pale so that they could prove that he didn't belong there and they couldn't really
00:44:04.680
do it because they didn't well because they just didn't know exactly who he was and so they kept
00:44:11.840
saying well you are a white nationalist and he said well my boyfriend is colombian so i guess i'm not a
00:44:19.320
good white nationalist you know and he was able to kind of win that round now even though they weren't
00:44:26.800
wrong in their intuition that he didn't belong there he absolutely didn't belong there because he
00:44:32.040
wasn't even pretending to be good at being a journalist i mean he on some level was pretending
00:44:37.600
but just in the barest most superficial way he really he really was to the extent that what
00:44:42.700
happens in that room is meaningful at all which we can you know call into question but to the extent
00:44:48.040
that that kind of journalism is meaningful he shouldn't have been there but they couldn't really
00:44:52.200
nail down why and the reason i ended up being so scene dependent in the book is because i feel like
00:44:58.100
i went round and round in my head about these theoretical concerns and reading into the history
00:45:04.140
of questions of journalistic ethics and reading public opinion by walter lippmann and thinking through
00:45:09.900
how democratic institutions do or don't survive and all of which was an interesting thought exercise
00:45:16.260
but then you know to just see a scene like that playing out in front of your eyes and seeing how
00:45:21.280
even when something is obviously going awry it's not always easy to name it accurately or to
00:45:28.100
decisively prove it and so to me to get back to the substance of your question that kind of seems
00:45:33.880
like it suggests two different things to me that may or may not be at odds with each other like
00:45:39.740
on one hand it seems to me like you want to be really minimalist and limit yourself to only lodging
00:45:46.320
accusations that you absolutely know to be true because otherwise you know you could set yourself up for
00:45:51.560
humiliation on the other hand when you're dealing with a really slippery gifted troll they're not
00:46:01.180
always going to give you the ammunition you need so if you limit yourself to only the barest assertions
00:46:08.200
of fact you're just letting them win because you are allowing a liar to dictate the terms of the debate
00:46:16.980
so of course i don't advocate for making up accusations or for you know misinterpreting jokes
00:46:25.020
as reality or vice versa obviously in a vacuum you want to get things right as often as you can but
00:46:29.780
the problem is they don't say what they mean they don't give you the courtesy of telling you who
00:46:36.040
they are and so i get why people try to you know why sometimes people overplay their hand because you
00:46:41.940
you you you have to get outside of their setting of the terms well many of them tell you who they
00:46:49.240
are or they tell themselves who they are when only their friends are listening i mean so if
00:46:54.340
you listen long enough to many of these people i think the mask if they're ever wearing one does come
00:47:00.760
off sometimes yeah so let's go to one of the harder cases which are by definition more
00:47:07.080
mainstream and here i think our intuitions might divide a little bit and again i mean my intuitions
00:47:16.260
here are now you know sort of newly anchored to the experience of being on the other side of this i
00:47:22.600
mean being targeted by people's poorly calibrated racist detectors so like take the cases of tucker
00:47:30.120
carlson and laura ingraham right so these are both people who i've been interviewed by i've never met
00:47:35.200
either of them in person i don't think but you know i've been interviewed by each of them a few
00:47:40.560
times you know not recently but you single them out essentially as as racist dog whistlers for things
00:47:48.380
they've said recently and i think so laura ingram said you know democrats mostly want to replace those
00:47:53.720
old white yahoo conservatives with a new group who might be a little a bit more amenable to big
00:47:59.040
government and that you read as a dog whistle i believe i can read that more charitably just
00:48:04.960
as a fairly factual statement i mean there's so many people on the far left who are banging on and on
00:48:13.240
about white privilege and using whiteness and age and gender you know it's old white men being the
00:48:23.800
filter against which they would make almost any political decision i mean they're advertising this
00:48:29.080
about themselves and so to be charitable to laura and that's an impulse i don't often feel
00:48:35.560
she could have just been remarking on that and not dog whistling to actual racists much less expressing
00:48:43.580
her own racism yeah so i believe you that you can parse that in a way that you see it as not a dog
00:48:53.340
whistle i guess i don't see it that way and i don't really see why you i mean look i get that
00:49:03.100
it's always possible to read a quote literally as not racist in the sense that the person is not
00:49:10.180
literally saying in the quote i a racist believe that the white race is superior to
00:49:16.940
the non-white races like in any quote where somebody's not saying those words it's possible
00:49:21.920
to read it as not racist i haven't listened to every episode of your show but i've heard some
00:49:27.120
former episodes where i've heard you do this a few times with trump you know saying you know yes he told
00:49:33.520
you know these women of color to go back to their countries but i'm not sure i see that as a racist
00:49:37.900
dog whistle and i guess i don't see why we should ignore what's right in front of us
00:49:44.420
and not take the obvious inference from it there is a very well-known poisonous theory
00:49:51.680
called the great replacement theory that we all know now because they were chanting about it in
00:49:57.020
charlottesville and so to the extent that people didn't know about it before that which i would
00:50:01.460
argue it's probable that laura ingraham and tucker carlson did know about it before that but i can't
00:50:05.560
prove it but we all know about it after that to then traffic in those words replacement and give it an
00:50:12.900
explicitly race related valence and then to turn around and deny that you're trafficking in race
00:50:18.980
baiting it just beggars belief and i don't you know plus you can put it together with
00:50:24.920
a decades-long history of doing similar things and of supporting policies that have those effects
00:50:30.440
so i guess i just don't see why we would try to contort ourselves into you know trying to i get
00:50:37.820
the point of being charitable to people but this doesn't seem charitable this seems implausible i mean i can
00:50:42.440
give you an answer to that question i mean why bend over backwards to be charitable even in the case
00:50:51.000
when you're dealing with someone who you have other reason to believe
00:50:54.520
if you'd like to continue listening to this podcast you'll need to subscribe at samharris.org
00:51:03.500
you'll get access to all full-length episodes of the making sense podcast and to other subscriber
00:51:08.580
only content including bonus episodes and amas and the conversations i've been having on the waking
00:51:14.160
up app the making sense podcast is ad free and relies entirely on listener support and you can
00:51:20.300
subscribe now at samharris.org