#146 — Digital Capitalism
Episode Stats
Words per Minute
163.5
Summary
In this episode, I speak with Douglas Rushkoff, an award-winning author, broadcaster, and documentarian who studies human autonomy in the digital age. He's often described as a media theorist, and he's written 20 books, including the bestsellers Present Shock and Program or Be Programmed. He's written regular columns for Medium, CNN, The Daily Beast, and The Guardian, and has made documentaries for PBS's show, Frontline. He's also the host of the popular Team Human podcast and has been named one of the world's 10 most influential intellectuals by MIT. Today, we discuss his work, his intellectual journey, and his most recent book, Team Human.
Transcript
00:00:10.880
Just a note to say that if you're hearing this, you are not currently on our subscriber
00:00:14.680
feed and will only be hearing the first part of this conversation.
00:00:18.440
In order to access full episodes of the Making Sense Podcast, you'll need to subscribe at samharris.org.
00:00:24.140
There you'll find our private RSS feed to add to your favorite podcatcher, along with other subscriber-only content.
00:00:30.540
We don't run ads on the podcast, and therefore it's made possible entirely through the support of our subscribers.
00:00:35.900
So if you enjoy what we're doing here, please consider becoming one.
00:00:49.560
Douglas has been named one of the world's 10 most influential intellectuals by MIT.
00:00:53.900
He is an award-winning author, broadcaster, and documentarian.
00:01:05.520
He's the host of the popular Team Human podcast.
00:01:09.060
And he's written 20 books, including the bestsellers Present Shock and Program or Be Programmed.
00:01:16.900
He's written regular columns for Medium, CNN, The Daily Beast, and The Guardian.
00:01:22.160
And he's made documentaries for PBS's show, Frontline.
00:01:26.240
And today we discuss his work and his most recent book, which is also titled Team Human.
00:01:34.440
We get into many of these issues, and he is certainly someone who has spent a lot of time thinking about them.
00:01:41.800
So, now, without further delay, I bring you Douglas Rushkoff.
00:01:59.220
So, you have a very interesting job description and background.
00:02:04.780
How do you describe what you do and your intellectual journey?
00:02:10.220
I mean, I guess what I do is I am arguing for human autonomy or human agency in an increasingly digital age.
00:02:19.300
And I guess what brought me here was originally I was a theater director, and I got fed up with
00:02:28.460
narrative, really, especially these closed-ended, predictable stories that felt almost propagandistic.
00:02:37.140
And the internet came around, and I saw chances for participation and interactivity and, you
00:02:45.240
know, sort of the pre-open-source, participatory narrative, the sense that we could rewrite the human story.
00:02:54.900
And, you know, I started to write books about that.
00:02:57.260
And I wrote a book called Cyberia about designer reality and one called Media Virus, which was
00:03:03.200
celebrating this new stuff called viral media, which seemed like a good thing at the time.
00:03:07.980
And then I watched as the internet really became kind of the poster child for a failing Nasdaq
00:03:17.800
And all of these companies from Google to Facebook that said they would never be about advertising
00:03:22.800
became the biggest advertising companies in the world.
00:03:25.920
And these tools, which I thought were going to be the new gateway or gateway drug in some
00:03:31.800
ways to a new kind of collective human imagination, ended up being the opposite.
00:03:39.800
So I haven't really been writing books against technology, or struggling against it, so much as asking people to
00:03:46.860
retrieve the human and bring it forward and embed it in the digital infrastructure rather
00:03:52.980
than just, you know, surrendering all of this power and all of these algorithms to other agendas.
00:04:04.160
Is that a label you happily wear or does that kind of miss most of what you're up to?
00:04:09.560
I mean, I happily wear it when people understand media theorists in the way I do.
00:04:14.960
But to most people, I feel like the word media theorist sounds like some kind of boring PBS personality.
00:04:25.940
But when I think of someone like Walter Ong or Marshall McLuhan or Lewis Mumford, then
00:04:33.940
yeah, I don't mind being a media theorist, because almost everything is media.
00:04:39.100
It's almost hard to figure out something that's not media.
00:04:44.660
But I guess over time, I've become a bit more of a social activist or an economic thinker.
00:04:52.900
It's kind of hard to just say I'm thinking about, you know, like the content on television.
00:04:58.220
I'm thinking more about the platforms and the political economy that's driving this media.
00:05:06.240
You know, I guess I should be embarrassed to say, I mean, I didn't really read McLuhan.
00:05:14.500
But I didn't read him until after people said, oh, your work is like McLuhan's.
00:05:24.020
People started to say, this is what, you know, McLuhan was saying.
00:05:27.080
And so then, you know, I went back and read him afterwards.
00:05:30.580
And yeah, he was, you know, he was crazy smart, but it's a bit like reading Torah or something
00:05:38.820
where everything he says, I could say, oh, it means this or it means that, you know.
00:05:43.600
So while it's a terrific intellectual exercise, it's a bit like reading James
00:05:50.680
Joyce, where you can almost argue about it more than make sense of it sometimes.
00:05:56.480
I mean, and part of why, honestly, part of why I'm excited to be talking with you is because
00:06:01.700
there are certain ideas that I'm really unresolved about and sort of certain understandings of
00:06:10.440
the human story, if you will, that I'm still really challenged by.
00:06:16.540
And, you know, in writing this book, I feel like on the one hand, I'm maybe accidentally
00:06:24.760
or unintentionally telling a new myth, you know. I'm sort of
00:06:31.180
arguing in this book that humanity is a team sport and that, you know, if you look at evolution
00:06:35.980
or even read Darwin, there are just as many examples of cooperation and collaboration leading
00:06:41.580
to, you know, species success as there are of competition.
00:06:45.600
And that if we want to understand human beings as the most advanced species, we should think
00:06:49.760
about the reasons for that: our language and collaboration and, you know, our increasing group size.
00:06:57.100
So the Dunbar number got up to over 100 people with whom we could, you know, collaborate and coordinate.
00:07:02.660
And then, of course, I argue that all the institutions and media and technologies that
00:07:07.020
we came up with to enhance that collaboration tend to be used against it.
00:07:12.500
So instead of bringing people together, social media atomizes people into those separate silos,
00:07:17.980
or even you can go back and see how text abstracted people from the sort of tribal oral culture.
00:07:24.280
And then you could even argue that language, before that, disconnected people from some essential, pre-linguistic experience.
00:07:31.300
But that becomes an almost Eden-like myth that I don't want to fall into, to say, oh, don't use any of it.
00:07:41.020
But then we're stuck in another story, you know.
00:07:43.780
And so what I'm really aching for, what I'm looking to do is to give people reasons to
00:07:50.160
celebrate humanity for its own sake and human values and retrieve what I consider to be,
00:07:57.300
and I hate even the word, but these essential human values, without falling back on or requiring some other myth.
00:08:08.600
You know, I'd rather justify it, you know, with science or with common sense or with some
00:08:13.360
sort of an ethical template than, you know, some other story.
00:08:19.860
My first reaction to some of those ideas is that, you know, basically everything we do,
00:08:26.900
virtually everything has a potential upside and downside.
00:08:30.660
And the thing that empowers us, the lever that we can pull that moves a lot in our world or
00:08:38.120
in our experience, also shifts some things that we don't want to see moved.
00:08:44.860
As you said, you could walk us all the way back to the dawn of language, right?
00:08:48.700
And obviously language is the source of virtually everything we do that makes us recognizably human,
00:08:59.780
And yet language, you know, under one construal, I mean, anyone who has taken psychedelics or
00:09:04.540
spent a lot of time meditating or trying to learn to meditate, recognizes that this compulsive
00:09:11.460
conceptualizing through language, this tiling over of experience that we do just as a matter
00:09:19.260
of course once we learn to think linguistically, is in most cases the limiting factor on our
00:09:28.520
well-being in so many moments, because so much of the conversation we have with ourselves is a source of suffering.
00:09:36.700
And yet we can't have civilization without our full linguistic competence.
00:09:42.520
And we, you know, we certainly want to be able to use it on demand all the time.
00:09:46.980
And basically, any other complex technology is built on language.
00:09:54.840
So, as you briefly gestured at, there's this now fairly famous notion that just the mere introduction
00:10:01.440
of print and a widespread ability for people to read and write was bemoaned by many intellectuals
00:10:09.340
of the time as a guaranteed way to lose our collective memory.
00:10:14.720
The oral tradition would erode, each person's capacity to memorize things would disappear.
00:10:21.960
And given the advantages of print and reading, that seems like a fairly fatuous concern.
00:10:30.900
You can carry that forward into the present with respect to the way markets and digital technologies have evolved.
00:10:39.100
I mean, the one difference really between speech, text, radio, and television and today's digital
00:10:49.300
technology is that the algorithms that we're building, the artificial intelligences that we're
00:10:55.680
building, you know, continue on; they change themselves as they go.
00:11:01.420
You know, if the words that we spoke, you know, mutated after they were out of our
00:11:07.000
mouths in order to affect people differently, it would be very different.
00:11:14.400
So I get concerned that people don't realize, and certainly the companies that
00:11:20.700
are building these technologies don't quite realize, what they're setting in motion:
00:11:26.520
the values that they're embedding in these technologies end up, well, the technologies
00:11:32.740
end up doing what we tell them, but by any means that they see fit, you know, they keep
00:11:37.540
going and we're not even privy to the techniques that they're using to elicit our behavior.
00:11:47.240
So while I could certainly look at capitalism as a system that ended up seemingly kind of
00:11:54.440
having its own agenda, capitalism working on us, and the defenseless CEO or the unconscious
00:12:02.500
shareholder or the worker who's being exploited, all of these people kind of stuck
00:12:07.520
in this system that they don't understand, digital technology seems to make this
00:12:13.700
reversal between, you know, the figure and the ground, or I guess McLuhan would say
00:12:19.000
the medium and the message, but I really just think it's the subject and the object:
00:12:23.020
instead of having these tools that we're putting out there to get things that we want or
00:12:27.900
to help us in some way, these tools are being used to get something out of us.
00:12:33.640
You know, we've turned this language, these machines, on the human psyche, and whether
00:12:40.240
we're using Las Vegas slot-machine algorithms or telling them to develop their own, they're finding the exploits in our psychology.
00:12:50.540
So the exploits aren't things that we look at or notice while we're meditating and go,
00:12:55.880
This must've evolved from a human need to do something.
00:12:58.740
And while on one level it's a neurosis, on another level it's part of my human strength.
00:13:04.020
And we could look at, you know, how do I want to use this in my life?
00:13:09.920
Oh, look, I can leverage that person's instinct for reciprocity, or look, I can see
00:13:15.720
this one trying to establish rapport and taking all of these painstakingly evolved social mechanisms
00:13:21.480
and using them against us, you know, and that's where I can sort of feel that there's a kind
00:13:28.340
of a good and an evil, you know, and I never really went there in any of my prior books.
00:13:33.780
I tried to be kind of nonjudgmental, but now I'm really arguing that whenever one
00:13:39.980
of these technologies or languages or algorithms is bringing us together, it's doing
00:13:47.320
good, and whenever it's turning us against each other, it's doing bad, just to have almost
00:13:52.660
a simple litmus test for people to understand, you know, am I helping or hurting?
00:13:58.420
Well, so are there companies that are getting to scale in the digital economy that are actually
00:14:05.320
doing it well, that are at all aligned with your more idealistic notions of what the internet could be?
00:14:15.120
Well, I don't know that there are companies that are doing it.
00:14:18.700
There are certainly organizations that are doing it, you know, whether it's a Mozilla,
00:14:24.280
you know, which really invented the browser, or archive.org, which is a great organization with,
00:14:31.240
you know, tremendous film archives and text archives, and the Gutenberg
00:14:37.300
project, and, you know, the example everyone uses, Wikipedia, is at scale and doing a good job,
00:14:45.380
but they're not companies as such, you know. The only companies I'm really seeing
00:14:50.780
doing that are cooperatives, you know, and I've gotten inspired by the platform cooperativism movement.
00:14:57.920
And I mean, there are many companies that sort of model themselves on the famous Spanish
00:15:03.620
Mondragon cooperative, where basically the workers own the company, but that's
00:15:10.080
not necessarily just a digital tradition, you know. The Associated Press is a co-op,
00:15:16.200
and Ace True Value hardware is an employee-owned co-op.
00:15:21.400
So I've seen things reach scale that way, but usually, or at least so far, they're not,
00:15:29.000
you know, these traditional shareholder-owned companies.
00:15:31.420
How would you compare something like Netflix to Facebook?
00:15:36.780
I consider myself a reluctant and none-too-well-informed student of digital capitalism.
00:15:44.500
I mean, having a podcast and other endeavors, I've just had to turn
00:15:49.300
more and more attention to this, but I feel quite late in beginning to analyze all of this.
00:15:55.080
But, you know, from sort of the front-facing, consumer's-eye view of these platforms,
00:16:01.740
when I look at Netflix, I mean, clearly they're playing games with algorithms and they're trying
00:16:07.000
to figure out how to maximize my time on their platform.
00:16:11.320
But my experience is I want them to have all the content they can have.
00:16:16.460
I want them to promote content that I find interesting rather than boring, not some haphazard
00:16:23.500
connection between my interests and what they're promoting.
00:16:26.120
So insofar as their algorithms begin to read my mind and anticipate what I will find interesting,
00:16:32.440
and they do that better and better and it becomes stickier and stickier, on some level, that's what I want.
00:16:39.480
I mean, I can curate the contents of my own consciousness enough to know that if I've
00:16:43.840
spent 17 uninterrupted hours on Netflix, I've got a problem.
00:16:48.140
So if every time I open that app, things just get better and better, that's good.
00:16:55.880
And the business model there is I have to pay a subscription fee and, you know, presumably
00:17:01.560
they're not selling my data to anybody and I'm not the product, right?
00:17:05.900
Whereas with Facebook, everything is flipped and, again, they're trying to game my attention
00:17:13.900
In the case of Facebook, it's completely ineffectual, but they're doing that in order to sell my attention to advertisers.
00:17:21.160
And we know, you know, more and more about the downside of those incentives and that business model.
00:17:26.740
Do you see the distinction between these two companies this way, or is there something I'm missing?
00:17:30.580
No, I definitely see it. Netflix versus Facebook is sort of the same thing to me as
00:17:37.240
Apple versus Google, where, you know, here's a company where if I've got the money and that's
00:17:42.480
kind of the sticking point, if I've got the money to pay for it, I can buy television
00:17:48.440
and technology and email and all of these things that are treating me as the customer.
00:17:59.240
I'm paying for it to understand me for my benefit and my enjoyment.
00:18:03.740
Whereas on Facebook or Google, you know, we understand that we're not the customer and that
00:18:09.480
someone else is paying Facebook or Google to understand us for their benefit.
00:18:14.780
And then not just understand us, but tweak us to their benefit.
00:18:19.620
So if Facebook can determine with 80% accuracy that I'm going to go on a diet in the next
00:18:26.560
six weeks, I'm going to start seeing advertising and updates and things to push me towards going on that diet.
00:18:34.900
And they're not just doing that to sell the specific products, the specific diet products
00:18:40.060
that are on their site, but to increase that 80% to 90%.
00:18:45.360
They want to increase the likelihood that I will do the thing that they've predicted I will do.
00:18:51.100
So, you know, when I look at a platform like that, or when I look at the way YouTube
00:18:56.060
makes suggestions of what videos I should watch.
00:18:59.060
And when I go down three, four videos in, I'm always at some really dangerously extreme
00:19:05.620
version of whatever it was that I was initially interested in.
00:19:09.020
You know, I see these platforms turning me into a caricature of myself or trying to get
00:19:15.560
me to behave more consistently with the statistical algorithm that's predicted my behavior.
00:19:22.320
Whereas on Netflix, the extent to which they use algorithms to deliver up to me what
00:19:28.800
I might like, I find that almost part of the entertainment.
00:19:32.800
You know, I'm interested when I finish Narcos: Mexico, you know, and if they
00:19:38.520
knew I finished it, then the next morning I look in my inbox and they say, here's what you should watch next.
00:19:45.160
You know, based on the last six things I watched, as well as how much I paused and how quickly I finished them.
00:19:53.040
I find it interesting, and I almost enjoy it, and maybe this is just sickness, but I like seeing what it thinks of me.
00:20:01.780
You know, in other words, people are using it as a mirror.
00:20:02.800
What shows do I have to watch on Netflix to get it to suggest Roma for me?
00:20:09.040
Because I wanted to think that I'm that kind of person.
00:20:15.380
I watch too much, you know, Game of Thrones kinds of things, and they don't
00:20:20.420
realize that I have that side. But the downside with Netflix and their algorithms
00:20:27.580
is not so much what they suggest, but sometimes I'm a little creeped out by the content itself.
00:20:35.780
So, you know, we know that House of Cards was partly derived through algorithms.
00:20:41.980
They found out that, oh, people that like David Fincher also like political intrigue and also like Kevin Spacey.
00:20:51.840
And then I wondered why the show kind of went through me like cheese doodles or something,
00:20:59.760
you know, it's like cheese doodles are this sort of industrial-age,
00:21:05.660
styrofoam taste sensation that's constructed for me to keep eating compulsively.
00:21:14.380
And I kind of felt that way with those shows. But the biggest problem right
00:21:19.400
now, and it shouldn't be seen as a problem, is that, you know, you get what you pay for.
00:21:23.440
And I do get concerned about, you know, bifurcating society into two classes:
00:21:29.140
those of us who can afford, you know, to maintain our autonomy by paying for our technologies,
00:21:35.140
and those, I suppose, who still need the remedial help of marketing on free platforms.
00:21:41.820
Well, that really is the source of the tension I see, because again, I have a podcaster's eye
00:21:46.780
view of this, but as someone who's decided not to take ads and to just have listeners
00:21:51.680
support the show, I now have a very clear view of these two business models.
00:21:57.340
There's just the PBS/NPR version, which is, you know, this thing is free.
00:22:02.100
And if you want to support it, you can, and I know how that works.
00:22:06.500
And, you know, I've just released a meditation app, Waking Up, which is a subscription-only service.
00:22:17.640
And I see that on the podcast side, I have been engaged in this fairly heavy-handed effort to
00:22:26.840
educate my audience to support this work if they want it to exist.
00:22:31.100
You know, many more people engage with the podcast than have ever engaged with my books.
00:22:36.620
I listened to that little six-minute piece you did on why you ask people
00:22:43.400
to support the show. And it articulated the exact same thing I feel: you know, I'll do one podcast,
00:22:49.700
or, I did one of those TED Talks, you know, for free, and more people watched that TED Talk
00:22:54.360
than have bought all the books I've ever written combined.
00:23:01.040
Because the goal, as a writer or as a public intellectual or someone
00:23:06.180
with any ideas that you want to spread, is to reach as many people as can conceivably be reached.
00:23:14.320
And yet what's happened here is that, I mean, you know, there's your phrase, you get what you pay for.
00:23:20.620
And yet it's antithetical to everyone's expectations, you know, even mine, frankly, online.
00:23:27.620
I mean, we're expecting our information to be free.
00:23:30.700
I mean, there are certain contexts where people understand that they're going to hit a paywall.
00:23:44.720
It's supported by ads, you know, and millions and millions of people listen to it.
00:23:49.100
But then he often releases a comedy special on Netflix.
00:23:53.420
I don't think there's anyone thinking that they should be able to watch his special for free.
00:23:59.060
Like, I don't think he gets angry emails saying, what the hell?
00:24:02.220
Why are you putting this behind Netflix's paywall?
00:24:04.940
But if he put it on YouTube, if he put it online in some other context and put it behind
00:24:11.380
a paywall, you know, like Vimeo on demand or something, and he was charging $5 for people
00:24:15.640
to see it, I think you would get a lot of grief over that.
00:24:19.040
And it's just a very odd situation where, in certain contexts, and for the widest possible audience,
00:24:27.380
we have trained ourselves to expect things for free.
00:24:30.360
And yet the only way free is possible is this increasingly insidious ad model that is gaming everyone's attention.
00:24:46.700
But in others, it just feels like this is the problem we want to figure out how to solve.
00:24:54.520
If we put everything behind a paywall, then we have a real problem of people not being
00:25:01.240
able to get access to content that we really do want to spread as widely as possible.
00:25:06.300
I mean, I heard your conversation with Jaron Lanier about this.
00:25:09.460
And it's interesting that he was sort of blaming the famous truncated Stewart Brand quote,
00:25:15.160
you know, information wants to be free, where people always leave off the second half
00:25:19.620
of the sentence, that information also wants to be protected.
00:25:24.100
Yeah, but I don't think that's it. I look back at the early internet.
00:25:30.300
And the reason why everything was free is because the internet was a university-based system.
00:25:38.520
There were these early, you know, pre-visual, text-only internet search-and-retrieval systems.
00:25:45.540
And you would download and share documents, but it was all university archives.
00:25:53.000
So because it was part of a nonprofit academic world, because people actually signed an agreement
00:26:00.080
before they went on the net saying, I promise I'm using this for research purposes.
00:26:04.880
I'm not going to do anything commercial on here.
00:26:10.880
It set up expectations of a very different place.
00:26:15.480
The internet really was intended at that point to become a commons.
00:26:20.780
Then once we brought business on, businesses really leveraged and exploited that freeness, the sense
00:26:30.740
that everybody wanted things to be free, without ever really bringing forward that sort of academic commons.
00:26:41.360
And then I remember the moment that I really thought it would change, and maybe it did,
00:26:49.800
when Steve Jobs introduced the iPad. He was sitting in this big easy chair, showing a different posture.
00:26:56.760
With the iPad, you couldn't just kind of download files the way you did with your computer.
00:27:01.660
Now you were going to go through an iTunes store to look at stuff.
00:27:06.040
And I feel like what he was trying to do, almost with the skill of a neurolinguistic programmer,
00:27:12.240
he was trying to anchor this device in a different social contract between the user and the content
00:27:21.840
And to some extent, it worked, at least in the Apple universe.
00:27:25.080
He said, look, it's going to be easier and better to buy something through iTunes than to go play
00:27:31.120
around on Napster, you know, just collecting music for the sake of it.
00:27:36.120
Yeah, well, I mean, part of it is once you move to digital, people understand that there's zero marginal cost to copying.
00:27:47.820
And that their use of a file doesn't prevent anyone else from using that same MP3 or whatever
00:27:54.280
it is, at least psychologically, that seems to be one of the reasons why there's this expectation that everything digital should be free.
00:28:05.100
And they're okay with that, you know, and, you know, first they came for the musicians
00:28:08.220
and I said nothing and they came for the cab drivers and I said nothing, you know, and
00:28:12.220
then once they come for me, you know. So the thing that people are upset about
00:28:16.640
is not that they're ruining all these other people's jobs and taking all this stuff.
00:28:20.600
The thing that they worry about is, well, now my privacy is being invaded, so now I'm going
00:28:26.040
to get up in arms about, you know, what's happening
00:28:30.280
here, or now my job is in danger, so now I'm going to get upset about that.
00:28:35.020
Yeah, well, to speak specifically of what it's like to be a writer.
00:28:39.080
Recently there was an article, I think an op-ed in The New York Times, though it might have been
00:28:42.600
The Washington Post, in the last couple of weeks, talking about the economics of writing
00:28:46.880
and how dismal they've become, and it's amazing.
00:28:50.120
I mean, you know, I've had some sense of this for some time, but to read these stats was sobering.
00:28:57.860
I mean, like the average professional writer who's making some portion of his or her living
00:29:04.100
from writing is living below the poverty line, and, you know, you have to be a massive
00:29:11.920
outlier, not even just an ordinary bestseller, to make a very good
00:29:18.100
living from writing and for the most part, professional writers have to have other jobs.
00:29:24.540
I mean, most professional writers who turn out a book every year or two or three have
00:29:29.240
professorships or they have something else that's paying the bills and that's not an
00:29:34.760
optimal world to live in, especially when you throw journalism in there, which is massively
00:29:47.920
underfunded. And ironically, we're living to some degree in a recent heyday of journalism.
00:29:47.920
Still, there's a kind of winner-take-all effect there where you have the New York Times and
00:29:52.740
the Atlantic doing well and then everyone else still going slowly or quickly bankrupt.
00:29:59.080
How do you view journalism and the life of a writer at this point?
00:30:05.220
You know, I'm lucky in that, you know, when I wrote my first books in the early 90s, it
00:30:10.420
was still the end of the golden period for authors, where, you know, I would write
00:30:16.160
a book that sold a lot less than my books do now, but my publisher would send me on a book tour.
00:30:23.440
I'd stay in the author's suites of hotels, you know, and they had these special suites
00:30:28.080
with fireplaces and the books of all the people who had stayed in them before.
00:30:32.180
And you'd get this person called a media escort who would take you to your events.
00:30:42.820
And then, you know, whatever, you know, Viacom buys Simon & Schuster, which buys, you know,
00:30:48.660
each of the little publishers, and all the slack went out of the wheel somehow.
00:30:53.660
It's like they started to use just much more accurate spreadsheets and all the rest.
00:31:02.100
It was an industry that somehow just kind of got by at about the same size, I guess, for a long time.
00:31:11.620
And we lost the ability to kind of fudge our way through that.
00:31:15.740
And they started to demand better margins and more of a squeeze, and, you know,
00:31:21.380
yeah, the power-law dynamics of the internet then came into it.
00:31:24.680
So it's better for a publisher to sell, you know, a Taylor Swift autobiography,
00:31:32.100
and sell half a million copies of that, than 40,000 copies of a book that's merely good.
00:31:48.960
And that was right after the 2007 crash, when all the publishers were asking for books that would obviously sell.
00:31:57.560
Every book I wrote was supposed to have a business self-help aspect to it.
00:32:04.300
So I got a university job so that I could write books, you know, like Program or Be Programmed
00:32:08.980
and Present Shock and Throwing Rocks at the Google Bus.
00:32:11.460
These ones, or this one, Team Human, which, you know, are books promoting humanism.
00:32:17.860
And I don't have to worry as much about whether I sell, you know, 5,000 or 50,000 copies.
00:32:26.500
I've done university lectures where, from college students, a common question I've gotten is,
00:32:31.900
why should journalists get paid for what they do if I could blog as easily as them?
00:32:37.240
So they've almost lost all touch with the idea.
00:32:39.660
I mean, there's so much in that.
00:32:41.860
It's one of the more terrifying questions I've heard.
00:32:45.020
I mean, the way I answer it now is, well, you know, if governments and corporations can
00:32:49.040
spend, you know, billions of dollars on propaganda, don't you want someone who has enough money
00:32:54.560
to spend a couple of weeks deconstructing what you're being told?
00:33:01.440
If I had to list my fears around what technology is doing to us, the erosion of the economics
00:33:08.580
of journalism is one, and also just the distortion of its incentives.
00:33:13.000
I mean, the fact that even our best organs of journalism are part of the clickbait logic is alarming.
00:33:24.300
But we should be able to reverse-engineer this from what we want.
00:33:27.160
We know we want smart people to be well compensated for taking, you know, months to really fully
00:33:37.640
explore and vet topics of great social importance.
00:33:44.420
And the idea that someone could take months to write the best piece that has been written
00:33:52.560
in a decade on the threat of nuclear war, say, right, and that that could just, you know,
00:34:01.280
just sink below the noise of, you know, Kim Kardashian's latest tweet, or, in a similar vein,
00:34:09.940
our president's latest tweet, and just disappear from a news cycle, you know,
00:34:16.180
and therefore earn comparatively little ad revenue.
00:34:20.460
And the net message of all of that is that, you know, those kinds of journalistic efforts
00:34:26.240
don't pay for themselves, and that we really don't need people to hold those jobs, because the market has spoken.
00:34:36.660
I mean, we do see, you know, a few networks or cooperatives, you know, like the journalists
00:34:42.920
who put together the Panama Papers, and, you know, we see some response to that.
00:34:50.080
I mean, I don't like the opposite alternative, where it started to feel,
00:34:57.460
to me, anyway, coming up, like the journalists who got the biggest platforms
00:35:05.600
tended to be people who were paid well enough by the system not to challenge neoliberalism.
00:35:12.380
It's like, well, if you pay the journalists enough, then they're going to support the system that pays them.
00:35:20.360
So now that it's getting harder and harder for journalists to make ends meet, I feel like
00:35:25.000
there might be a little bit of a positive effect, at least
00:35:31.180
in terms of their politics, where they're looking at it and saying, oh, you know, now I'm a gig worker.
00:35:38.500
And there's something valuable to, you know, not being able to just, you
00:35:43.680
know, graduate from an Ivy League school and get to write books, you know, for the rest of your life.
00:35:49.460
Well, what do you do with the fact, and this seems to be the counter-argument, that
00:35:53.380
people want what they want, and it's not an accident that certain media products get, you
00:36:01.700
know, tens of millions of views, and certain others just sink beneath the waves, no matter how good they are.
00:36:13.300
So if it's just a fact that only 8,000 people in the United States really want to read the
00:36:19.940
next white paper about climate change, well then you can't monetize that white paper, because not enough people care about it.
00:36:27.580
Now they should care about it, and we have to keep trying to figure out how to make them
00:36:31.600
care about it, but if they don't care about it, given the glut of information, I mean,
00:36:37.180
given the fact that, again, you know, you can just binge watch Game of Thrones for the
00:36:41.540
third time, and you can't stop people from doing that, then this is just a kind of fool's errand,
00:36:47.880
trying to figure out how to get them to take their medicine.
00:36:52.700
On some level, what we're saying is that if it can't be made interesting enough and titillating
00:36:59.260
enough so as to actually survive competition with everything else that's interesting and
00:37:04.600
titillating, well then maybe it deserves to sink, even if that is selecting for the Kardashians of the world.
00:37:15.440
You know, it's interesting, and I make the same sort of argument in Team Human when I'm
00:37:21.080
kind of defending the work of David Lynch against the more commercial kinds of movies, and the
00:37:27.460
way that Hollywood will use that argument, they'll say, well, look, people in the mall,
00:37:31.620
they want to see the blockbuster with the spoiler and the climactic ending, they don't want
00:37:37.140
to see the thing that makes you think; they don't want, you know, the strange,
00:37:43.740
but I think people do deep down want that. I do think people want to experience awe, and they want
00:37:51.580
to engage with paradox and with ambiguity and with depth. You know, I mean, what makes me so
00:38:01.620
annoyed with Netflix is that you can't talk about a Netflix show with someone else because you'll be
00:38:06.960
on season three, episode seven, and they're on season two, episode one, and you're going to spoil
00:38:11.960
something for them because these shows themselves are basically, you know, timelines of IP, of little
00:38:19.500
IP bombs, the little spoilers that show up every three or four episodes. It's
00:38:25.140
like the value of the thing, and you keep undoing them. It's, oh, Mr. Robot, he is his
00:38:32.860
father, or, you know, each one of them has these almost stock reversals, and you look
00:38:42.120
at real art, I mean, what's the spoiler in a David Lynch movie? I mean, I couldn't even
00:38:48.960
tell you what it was about after I've seen it twice, even though, you know,
00:38:55.000
I've loved it. But the idea that we should deliver what
00:39:02.620
people want, because this is what they're paying tickets for, doesn't
00:39:08.720
really make sense when the whole thing has been contextualized that way. In other words,
00:39:14.440
I don't think that's what people really want so much as what we're being trained to want.
00:39:20.680
Or there are these, um, shows that almost open up
00:39:29.000
wonder, or that almost make deep arguments, or books that make quasi-deep arguments, but then
00:39:36.440
just solve everything at the end. There is a right answer. So, in a show, I don't know if you saw
00:39:43.000
that Westworld show on HBO, you know, there is an answer. It's all these timelines that are all over
00:39:49.520
the place. And then you find out, oh, these were different timelines. And this happened then,
00:39:53.920
and this happened then, and this happened now. And it's just a kind of postmodern pyrotechnics,
00:39:59.400
but it gets you to that same place where there is an answer. And every article we write is supposed
00:40:05.900
to have, yes, therefore you should take echinacea, or don't take echinacea, or vote for Trump, or don't
00:40:12.160
vote for Trump. There's this need to give people the conclusion, as if, well, I'm not going
00:40:18.760
to pay you for an article or a book if you don't give me the answer by the end, and I'm not going
00:40:23.960
to watch your movie unless everything works out by the end. And that's, in some ways, the most dangerous
00:40:31.020
aspect of this cultural collapse that we're in, that everything has to have an answer, or
00:40:37.600
every effort has to have a utility value that's recognizable, at least by the time you've done
00:40:43.840
this thing. Because you can't reduce all human activity, all writing, all product, all culture
00:40:51.240
to its utilitarian value. And this is where I get into that weird, mythic religious place that I'm
00:40:58.680
still uncomfortable speaking out loud. But I just reread Horkheimer. He wrote this book,
00:41:05.320
The Eclipse of Reason, and he was talking about the difference between reason with a capital R,
00:41:10.880
the real reasons, the essential human values, we do things, versus the small-
00:41:17.340
r utilitarian reasons we do things. And what I'm really trying to do is stress that human beings
00:41:24.540
have capital-R reasons for things, that there is something more going on here than
00:41:31.300
meets the eye. And I don't just mean some magical other dimension, but some essential value
00:41:39.640
to human camaraderie, to establishing rapport, to being together, to looking in someone's eyes.
00:41:45.920
It's not, I mean, yes, the mirror neurons fire and the oxytocin goes through your bloodstream and
00:41:51.420
your breathing rates sync up. And this is the stuff that you've studied, you know, and
00:41:56.160
there's an evolutionary small-r reason to establish rapport, but is there another one?
00:42:04.460
Is there a value? Is there a meaning to this? And that's the part I'm not
00:42:11.500
willing to give up. And the big argument, I guess the real argument I'm in now, is
00:42:16.280
the argument with the transhumanists or the post-humanists or the singularitarians, who
00:42:21.620
really believe that technology is our evolutionary successor, and we should pass the torch to
00:42:27.980
technology because tech will not only write a more factual article and a more utilitarian program,
00:42:34.200
but, you know, tech is more complicated. It's a more complex home for information than
00:42:42.440
the human mind. So we should, you know, pass the torch. And when I say no, that human beings are special,
00:42:48.580
and I start talking about awe and meditation and camaraderie and establishing rapport.
00:42:55.240
I mean, there's one famous transhumanist, I'll leave him nameless. I was on a panel with him,
00:43:00.080
and he said, oh, Rushkoff, you're just saying that because you're a human, you know, as if it was hubris
00:43:05.820
for me to argue this, to argue for humanity. And that's when I decided, okay, I'll be on Team Human.
00:43:12.400
I'll make a bet that there's something important here, because without that as a starting
00:43:19.420
place, I find it hard to make any of the arguments that we're making, whether they're against the
00:43:25.400
market or against automation or against all of our stuff being for free or the collapse of quality or
00:43:31.360
just giving in to consumerism. It seems that I have to hold up some sort of essential human value that
00:43:39.480
needs to be recognized rather than surrendered so readily. Well, I guess that there are two
00:43:44.940
different kinds of surrender, or blurring of the border, between the human and what might be
00:43:52.860
beyond. I guess you could talk about artificial intelligence replacing us or augmenting us.
00:43:58.780
If you'd like to continue listening to this conversation, you'll need to subscribe at
00:44:06.080
samharris.org. Once you do, you'll get access to all full-length episodes of the Making Sense
00:44:10.740
podcast, along with other subscriber-only content, including bonus episodes and AMAs and the conversations
00:44:17.500
I've been having on the Waking Up app. The Making Sense podcast is ad-free and relies entirely on
00:44:23.040
listener support. And you can subscribe now at samharris.org.