Making Sense of Social Media and the Information Landscape | Episode 8 of The Essential Sam Harris
Episode Stats
Words per Minute
170.9
Summary
In this episode, we continue our compilation of Sam Harris's conversations on the topic of Social Media and the Information Landscape. This compilation extends the considered issues of social media well beyond personal engagement and into the many ways in which the surveillance economy generally has warped our politics, social relations, and moral psychologies. Along the way, we'll hear the natural overlap with theories of moral and political philosophy, belief and unbelief, and artificial intelligence. We'll also be situating the social media question in the broader context of the business model which enables it, something that's been called surveillance capitalism by its critics, and personalized advertising by its more supportive advocates. And we'll be zooming in on some of the specific technologies upon which all of this is built, including in a conversation with Jaron Lanier featured in this compilation. At the conclusion, we'll offer some reading, listening, and watching suggestions which range from fun and light to densely academic. One important update before we jump in: since the initial writing and recording of this episode, Sam quit Twitter entirely. He recorded a solo episode, number 304, entitled "Why I Left Twitter," which explains his reasoning and thought process for that decision. We don't run ads on the podcast, and therefore it's made possible entirely through the support of our listeners. So if you enjoy what we're doing here, please consider becoming a supporter of The Making Sense Podcast.
Transcript
00:00:10.880
Just a note to say that if you're hearing this, you are not currently on our subscriber
00:00:14.680
feed and will only be hearing the first part of this conversation.
00:00:18.420
In order to access full episodes of the Making Sense Podcast, you'll need to subscribe at
00:00:24.060
There you'll find our private RSS feed to add to your favorite podcatcher, along with
00:00:30.520
We don't run ads on the podcast, and therefore it's made possible entirely through the support
00:00:35.900
So if you enjoy what we're doing here, please consider becoming one.
00:00:50.760
This is Making Sense of Social Media and the Information Landscape.
00:00:55.480
The goal of this series is to organize, compile, and juxtapose conversations hosted by Sam
00:01:05.380
This is an ongoing effort to construct a coherent overview of Sam's perspectives and arguments,
00:01:10.800
the various explorations and approaches to the topic, the relevant agreements and disagreements,
00:01:17.140
and the pushbacks and evolving thoughts which his guests have advanced.
00:01:20.620
The purpose of these compilations is not to provide a complete picture of any issue, but
00:01:27.700
to entice you to go deeper into these subjects.
00:01:30.940
Along the way, we'll point you to the full episodes with each featured guest.
00:01:35.620
And at the conclusion, we'll offer some reading, listening, and watching suggestions, which range
00:01:43.520
One note to keep in mind for this series, Sam has long argued for a unity of knowledge where
00:01:50.880
the barriers between fields of study are viewed as largely unhelpful artifacts of unnecessarily
00:01:58.100
The pursuit of wisdom and reason in one area of study naturally bleeds into, and greatly
00:02:05.480
You'll hear plenty of crossover into other topics as these dives into the archives unfold.
00:02:10.160
And your thinking about a particular topic may shift as you realize its contingent relationships
00:02:18.220
In this topic, you'll hear the natural overlap with theories of moral and political philosophy,
00:02:23.720
belief and unbelief, free will, and artificial intelligence.
00:02:31.560
Let's make sense of social media and the information landscape.
00:02:35.340
One very important update before we jump into this compilation.
00:02:41.840
Since the initial writing and recording of this episode, Sam quit Twitter entirely.
00:02:48.500
He recorded a solo episode entitled, Why I Left Twitter, which explains his reasoning and
00:02:56.800
We, of course, recommend listening to that, in addition to the included conversations here.
00:03:02.180
The knowledge that Sam eventually walked away from social media platforms places an interesting
00:03:08.880
lens over his conversations on the subject from the previous decade.
00:03:13.180
Though, as you'll hear, this compilation extends the considered issues of social media well
00:03:19.340
beyond personal engagement and into the many ways in which the surveillance economy generally
00:03:24.680
has warped our politics, social relations, and moral psychologies.
00:03:40.620
Social media is one of those topics that everyone seems to have strong opinions about.
00:03:45.940
That fact in itself, the idea that our feelings on just about everything seem to have gotten
00:03:51.220
stronger, inflamed by the advent of social media, is something we'll fold into the discussion
00:03:57.740
Like just about all of us, Sam has gone through, and continues to go through, a strained relationship
00:04:08.340
Apparently, even the most practiced meditators can be hijacked by algorithms that target our
00:04:13.160
propensity for outrage, adulation, annoyance, and disgust.
00:04:20.460
But of course, social media also has positive potential, and its own success stories.
00:04:28.000
There are unignorable societal benefits that must be evaluated and considered.
00:04:32.960
This compilation contains plenty of critique and perspective regarding the darker sides of
00:04:37.580
social media, and the economic model which has provided its scaffolding.
00:04:41.280
But the criticism should not completely crowd out or invalidate the defenders and believers
00:04:48.980
We're also going to be situating the social media question in a broader context of the
00:04:53.800
business model which enables it, something that's been called surveillance capitalism by
00:04:58.840
its critics, and personalized advertising by its more supportive advocates.
00:05:03.620
We'll also be zooming in on some of the specific technologies upon which all of this is built.
00:05:11.520
In the introduction to an episode with Jaron Lanier, which is included in this compilation,
00:05:17.000
Sam identified three main lines of inquiry for this topic.
00:05:21.260
The first is economics, and the question of incentives in the face of automation and artificial intelligence.
00:05:27.220
The second is politics, and the question of how we can cooperate and cohere on ideas in a space
00:05:36.720
And the third is psychology, and how our attention and well-being are being assaulted by the power
00:05:45.660
We'll be wandering through that same roadmap through these clips.
00:05:49.540
There won't be a ton of deep philosophical lessons and thought experiments to walk you through
00:05:54.660
Much of what we'll be tackling is plain for most people to see and experience for themselves.
00:06:00.920
We're going to hear Sam's conversations with defectors from the ranks of the architects
00:06:05.100
of information ecosystems in Silicon Valley, like Tristan Harris and Jaron Lanier.
00:06:10.920
We're going to hear some of Sam's conversation and pseudo-interrogation of Jack Dorsey himself,
00:06:16.520
the co-founder of Twitter, who was also its CEO at the time of their conversation.
00:06:20.680
And we're going to hear from authors like Jonathan Haidt and Cass Sunstein, who have studied
00:06:26.340
and continue to investigate the impacts of social media on individuals and the health
00:06:33.440
We're also going to broaden our lens and listen in on a conversation with Zeynep Tufekci, an
00:06:39.440
author who focuses on global movements and geopolitics, and consider how social media fuels,
00:06:45.340
diverts, or otherwise confuses political efforts.
00:06:50.060
And finally, we're going to tiptoe into the emerging deepfake technology, which threatens
00:06:55.380
to pour even more fuel on the fire of the collapsing integrity of global information.
00:07:01.540
So let's start with Sam talking to Tristan Harris.
00:07:07.340
Tristan has been called the closest thing Silicon Valley has to a conscience.
00:07:11.620
Tristan had just appeared in a documentary entitled The Social Dilemma when he spoke to
00:07:18.120
So much of their conversation references the film, which is certainly recommended viewing
00:07:25.280
Tristan has been laser-focused on the problems of social media after spending years working
00:07:30.660
as a designer for Google and seeing firsthand the potent attention-harnessing techniques that
00:07:37.800
If you listen to our compilation about artificial intelligence, you'll be familiar with a concern
00:07:43.820
about our strengths and competencies being squashed by technology.
00:07:48.180
Here, you'll hear Tristan flip that concern around with a sharp observation.
00:07:56.760
This is from the more recent conversation from episode 218, Welcome to the Cult Factory.
00:08:12.240
If you could boil it down to the elevator pitch answer, what is the problem that we're going
00:08:21.780
Well, it's funny because the film actually opens with that prompt, the blank stares of
00:08:24.940
many technology insiders, including myself, because I think it's so hard to define exactly
00:08:30.580
There's clearly a problem of incentives, but beneath that, there's a problem of what those
00:08:36.000
incentives are doing and where the exact harms show up.
00:08:39.300
And the way that we frame it in the film and in a big presentation we gave at the SF Jazz
00:08:43.460
Center back in April 2019 to a bunch of the top technologists and people in the industry
00:08:48.980
was to say that while we've all been looking out for the moment when AI would overwhelm human
00:08:54.980
strengths and when we would get the singularity, when would AI take our jobs?
00:08:59.980
We missed this much, much earlier point when technology didn't overwhelm human strengths,
00:09:07.580
And you can actually frame the cacophony of grievances and scandals and problems that we've
00:09:11.700
seen in the tech industry from distraction to addiction to polarization to bullying to harassment
00:09:18.620
to the breakdown of truth, all in terms of progressively hacking more and more of human
00:09:25.620
So if we take it from the top, you know, our brain's short-term memory system has seven
00:09:32.720
When technology starts to overwhelm our short-term and working memory, we feel that as a problem
00:09:41.580
I came here to go to Facebook to look something up, but now I got sucked down into something
00:09:44.960
That's a problem of overwhelming the human limit and weakness of just our working memory.
00:09:49.840
When it overwhelms our dopamine systems and our reward systems, that we feel that is a
00:09:56.880
When it taps into and exploits our reliance on stopping cues that at some point I will stop
00:10:02.340
talking and that's a cue for you to keep going.
00:10:04.360
When technology doesn't stop talking and it just gives you the independent bottomless
00:10:07.400
bowl, we feel that as a problem called addiction or addictive use.
00:10:10.980
When technology exploits our social approval and giving us more and more social approval,
00:10:16.300
we feel that as a problem called teen depression because suddenly children are dosed with social
00:10:20.780
approval every few minutes and are hungry for more likes and comparing themselves in terms
00:10:26.640
And when technology hacks the limits of our heuristics for determining what is true, for example,
00:10:31.340
that that Twitter profile who just commented on your tweet five seconds ago, that photo
00:10:39.200
We only have a few cues that we can use to discern what is real and bots and deepfakes and
00:10:44.300
I'm sure we'll get into GPT-3 actually overwhelm that human weakness.
00:10:50.560
So I think that the main thing that we really want people to get is through a series of misaligned
00:10:55.120
incentives, which we'll further get into, technology has overwhelmed and undermined human
00:10:59.540
weaknesses and many of the problems that we're seeing as separate are actually the same.
00:11:03.720
And just one more thing on this analogy, it's kind of like, you know, collectively this digital
00:11:07.840
fallout of addiction, teen depression, suicides, polarization, breakdown of truth.
00:11:13.180
We think of this as a collective digital fallout or a kind of climate change of culture that
00:11:18.260
much like the, you know, oil extractive economy that we have been living in an extractive race
00:11:23.400
for attention, there's only so much when it starts running out.
00:11:26.200
We have to start fracking your attention by splitting your attention into multiple streams.
00:11:29.780
I want you watching an iPad and a phone and the television at the same time, because that
00:11:33.980
lets me triple the size of the attention economy.
00:11:36.480
But that extractive race for attention creates this global climate change of culture.
00:11:41.180
And much like climate change, it happens slowly.
00:11:50.380
And that collectively we called in that presentation human downgrading, but you can call it whatever you
00:11:54.400
The point is that, you know, if you think back to the climate change movement, before there
00:11:59.160
was climate change as a cohesive understanding of emissions and linking to climate change,
00:12:04.560
we had some people working on polar bears, some people working on the coral reefs.
00:12:08.920
We had some people working on species loss in the Amazon.
00:12:11.140
And it wasn't until we had an encompassing view of how all these problems get worse that
00:12:17.000
And so we're really hoping that this film can act as a kind of catalyst for a global response
00:12:21.840
to this really destructive thing that's happened to society.
00:12:26.340
Okay, so let me play devil's advocate for a moment using some of the elements you've
00:12:30.900
already put into play, because you and I are going to impressively agree throughout this
00:12:37.820
But I'm channeling a skeptic here, and it's actually not that hard for me to empathize with
00:12:44.800
a skeptic, because as you point out, it really takes a fair amount of work to pry the scales
00:12:53.580
And the nature of the problem, though it really is everywhere to be seen, it's surprisingly
00:13:00.620
So if you reference something like, you know, a spike in teen depression and self-harm and
00:13:07.840
suicide, you know, there's no one who's going to pretend not to care about that.
00:13:13.080
And then it really is just the question of, you know, what's the causality here?
00:13:16.460
And is it really a matter of exposure to social media that is driving it?
00:13:20.660
And I think, I don't think people are especially skeptical of that.
00:13:23.420
And that's a discrete problem that I think most people would easily understand and be concerned
00:13:30.180
But the more general problem for all of us is harder to keep in view.
00:13:36.680
And so when you talk about things, again, these are things you've already conceded in a
00:13:41.860
So attention has been a finite resource always, and everyone has always been competing for
00:13:49.140
So if you're going to publish a book, you are part of this race for people's attention.
00:13:53.760
If you were going to release something on the radio or television, it was always a matter
00:14:00.560
And as you say, we're trying to do it right now with this podcast.
00:14:02.800
So it's when considered through that lens, it's hard to see what is fundamentally new
00:14:14.220
And then the question is, is it good content or not?
00:14:19.640
It's just, this is just a matter of interfacing in some way with human desire and human curiosity.
00:14:26.480
And you're either doing that successfully or not.
00:14:29.660
And what's so bad about really succeeding, you know, just fundamentally succeeding in a
00:14:34.920
way that, yeah, I mean, you can call it addiction, but really it's just what people find captivating.
00:14:41.440
They want, they want to grant their attention to the next video that is absolutely enthralling.
00:14:47.020
But how is that different from, you know, leafing through the pages of, you know, a hard copy
00:14:52.160
of Vanity Fair in the year 1987 and feeling that you really want to read the next article
00:14:59.260
rather than work or do whatever else you thought you were going to do with your afternoon.
00:15:05.180
And then there's this sense that the fact that advertising is involved and really, really
00:15:13.440
the foundation of everything we're going to talk about.
00:15:17.240
I mean, so really it's a story of ads just getting better.
00:15:22.180
You know, I don't have to see ads for Tampax anymore, right?
00:15:25.800
I go online and I see ads for things that I probably want or nearly want because I abandoned
00:15:36.540
And I think most people are stuck in that place.
00:15:40.640
Like they just, we have to do a lot of work to bring them into the place of the conversation
00:15:51.020
Gosh, there's so much good stuff to unpack here.
00:15:52.840
So on the attention economy, obviously we've always had it.
00:15:56.180
We've had television competing for attention, radio, and we've had evolutions of the attention
00:16:00.960
Competition between books, competition between newspapers, competition between television
00:16:05.020
to more engaging television to more channels of television.
00:16:07.480
So in many ways, this isn't new, but I think what we really need to look at is what was
00:16:16.340
Smartphones, we check out, we check our smartphones, you know, a hundred times or something like
00:16:21.360
They are intimately woven into the fabric of our daily lives and ever more so because of
00:16:26.360
we pre-establish addiction or just this addictive checking that we have that any moment of anxiety,
00:16:32.260
So it's intimately woven into where the attention starting place will come from.
00:16:37.180
It's also taken over our fundamental infrastructure for our basic verbs.
00:16:42.560
Like if I want to talk to you or talk to someone else, my phone has become the primary vehicle
00:16:46.920
for just about for many, many verbs in my life, whether it's ordering food or speaking
00:16:51.600
to someone or, you know, figuring out where to go on a map.
00:16:55.400
We are increasingly reliant on the central node of our smartphone to be a router for where
00:17:02.260
So that's the first part of this intimately woven nature and the fact that it's our social,
00:17:07.000
it's part of the social infrastructure by which we rely on.
00:17:10.540
And part of what makes technology today inhumane is that we're reliant on infrastructure that's
00:17:14.820
not safe or contaminated for many reasons that we'll get into later.
00:17:18.680
A second reason that's different is the degree of asymmetry between, let's say, that newspaper
00:17:23.700
editor or journalist who is writing that enticing article to get you to turn to the next page
00:17:28.060
versus the level of asymmetry of when you watch a YouTube video and you think, yeah, this
00:17:32.240
time I'm just going to watch one video and then I got to go back to work.
00:17:35.040
And you wake up from a trance, you know, two hours later and you say, man, what happened
00:17:41.680
What that misses is there's literally the Google, you know, Google's billions of dollars
00:17:45.960
of supercomputing infrastructure on the other side of that slab of glass in your hand pointed
00:17:50.820
at your brain doing predictive analytics on what would be the perfect next video to keep
00:17:57.880
You think, OK, I've sort of been scrolling through this thing for a while, but I'm just
00:18:00.660
going to swipe up one more time and then I'm done.
00:18:03.820
Each time you swipe up with your finger, you know, you're activating a Twitter or a Facebook
00:18:08.420
or a TikTok supercomputer that's doing predictive analytics, which has billions of data points
00:18:15.300
And I think it's important to expand this metaphor in a way that you've talked about
00:18:19.260
on, I think, in your show before about just the power, increasing power and computational
00:18:24.800
When you think about a supercomputer pointed at your brain trying to figure out what's the
00:18:28.780
perfect next thing to show you, that's on one side of the screen.
00:18:31.640
On the other side of the screen is my prefrontal cortex, which has evolved millions of years
00:18:34.800
ago and doing the best job it can to do goal articulation, goal retention and memory and
00:18:39.960
sort of staying on task, self-discipline, et cetera.
00:18:44.540
Well, a good metaphor for this is let's say you or I were to play Gary Kasparov at chess.
00:18:52.220
It's because, you know, there I am on the chessboard and I'm thinking, OK, if I do this,
00:18:57.880
And I'm playing out a few new moves ahead on the chessboard.
00:19:00.700
But when Gary looks at that same chessboard, he's playing out a million more.
00:19:06.600
And that's why Gary is going to win and beat you and I every single time.
00:19:09.840
But when Gary, the human, is playing chess against the best supercomputer in the world,
00:19:14.580
no matter how many million moves ahead that Gary can see, the supercomputer can see billions
00:19:20.960
And when it beats Gary, who is the best human chess player of all time, it's beaten
00:19:25.520
the human brain at chess because that was kind of the best one that we had.
00:19:28.640
And so when you look at the degree of asymmetry that we now have, when you're sitting there
00:19:33.240
innocuously saying, OK, I'm just going to watch one video and then I'm out, we have
00:19:37.620
to recognize that we have an exponential degree of asymmetry and they know us and our weaknesses
00:19:44.100
That part of the conversation sets the stage for us well, but we recommend a full listen
00:19:52.480
to that episode as Sam continued to skillfully play devil's advocate throughout and allowed
00:19:58.020
Tristan to flesh out the nuanced and complex considerations.
00:20:01.240
But Tristan remains steadfast in his effort to sound the alarm about the power of algorithms
00:20:09.380
And so that's where we're going to stay in this trek through social media.
00:20:12.760
You heard Sam, while channeling a skeptical view, point to the economic model that serves
00:20:18.500
as the oxygen that keeps the social media monsters breathing.
00:20:24.520
Here is an open question for the health of democracy and individual psychology.
00:20:28.740
Is there a point when advertising can become too effective?
00:20:34.020
And has social media pushed us over that threshold?
00:20:38.240
Advertising is certainly nothing new, of course, and the profit motive has always encouraged persuasion
00:20:46.680
But turn back the clock a few hundred years and imagine a handcrafted, colorfully painted wooden
00:20:52.040
sign hanging above a rival blacksmith shop in a town square.
00:20:56.060
And compare its influence to a perfectly timed, personalized, targeted advertisement that was crafted
00:21:05.140
and custom-molded to your taste in music, attraction, color preference, current mood, political
00:21:14.820
The latter does seem to suggest a deep shift in the power to persuade effectively.
00:21:21.060
If there is something like an objective measure of the effectiveness of persuasion
00:21:25.280
that immorally encroaches on a notion of personal autonomy,
00:21:29.220
it's fair to wonder if we've blown right past it.
00:21:32.740
There's an old adage in marketing that goes like this.
00:21:35.700
I know I'm wasting half of my marketing budget.
00:21:41.840
That built-in uncertainty might be eroding in the face of data-collecting machines which
00:21:46.240
promise more and more of a sure thing to advertisers.
00:21:49.740
To explore this area a bit more, we're going to hear from Jaron Lanier.
00:21:54.700
Lanier is a computer scientist and Silicon Valley pioneer who launched virtual reality companies
00:22:02.020
He was part of an early wave of bright-eyed, idealistic technologists.
00:22:07.420
And he's among those who have since begun to question what they may have been missing.
00:22:11.860
When he spoke with Sam, he had just written a book which was not shy about its suggestion.
00:22:18.160
10 Arguments for Deleting Your Social Media Accounts Right Now.
00:22:22.900
For this compilation, we're going to be tapping this interview for Lanier's thoughts
00:22:27.020
on the economic models that have run amok on the internet.
00:22:30.220
And listen in on some of his nascent suggestions on how different models might improve the
00:22:36.280
We'll start with Sam and Lanier revisiting the early days of Silicon Valley and the seemingly
00:22:41.520
uncontroversial notion that information should be free.
00:22:51.140
Many of the worst decisions we've made here, and this is something you point out in your books,
00:22:55.440
in creating this technology, are not on their face bad decisions.
00:23:00.840
I mean, they're certainly not sinister decisions.
00:23:03.180
And so, and one of the first decisions we've made is around this notion that information
00:23:10.820
And that just seems like a very generous and idealistic way to start.
00:23:19.240
So, perhaps we can start here with the digital economy.
00:23:24.600
What could possibly be wrong with information being free?
00:23:30.100
Well, this idea that information should be free was held in the most profound and intense
00:23:39.780
It was something that was believed so intensely during a period starting in the 80s.
00:23:44.380
In some ways, it still holds for a lot of people, and to defy that was very, very difficult.
00:23:51.840
It was painful for my friends who couldn't believe that I was defying it.
00:23:59.320
And on its face, it sounds very generous and fair and proper and freeing, but there are
00:24:07.720
problems with it that are so deep as to, I think, threaten the survival of our species.
00:24:12.980
It's actually a very, very, very serious mistake.
00:24:16.620
So, the mistakes happen on a couple of levels here.
00:24:20.940
I would say the first one has to do with this idea that information is totally weightless
00:24:28.940
and intrinsically something that's free in an infinite supply.
00:24:32.760
And that's not true, because information only exists to the degree that people can perceive
00:24:40.660
And it ultimately only has a meaning when it grounds out as human experience.
00:24:46.300
The slogan I used to have back in the 80s when we were first debating these things is
00:24:50.520
that information is alienated experience, meaning information is similar to stored energy that
00:24:57.800
You put energy into a battery, then you can release it.
00:25:00.260
Or you lift up a weight, and then you let go of the weight, and it goes back down, and
00:25:05.200
And in the same way, information ultimately only has meaning as experience at some point
00:25:12.800
And the problem with experience, or maybe the benefit of experience, is that it's only a
00:25:22.000
And so, therefore, if you make the mistake of assuming that information is free, you'll have
00:25:29.620
And what you do is you make yourself vulnerable to what we could call a denial-of-service attack
00:25:36.400
So, a denial-of-service attack means that malicious people send so many requests to a website that
00:25:47.280
And every website that you use reliably actually has to go through this elaborate structure
00:25:52.380
of other resources created by companies like Akamai that defend it from denial-of-service
00:26:01.240
But in the same way, when you have services like Twitter or Facebook, where anybody can
00:26:06.340
post anything without any cost to themselves, and there's no postage on email, and everything
00:26:12.040
can just be totally filled up with spam and malicious bots and crap, to the point where
00:26:17.820
reality and everything good about the world gets squeezed out, and you end up amplifying
00:26:31.560
There has to be some way that seriousness comes into play if you want to have any sense
00:26:38.180
of reality or quality or truth or decency, and unfortunately, we haven't created a world
00:26:46.860
But then there's a flip side to it, which is equally important, which is we've created
00:26:52.020
this world in which we're talking about technology often as something that's, if not opposed to
00:27:02.480
So there's a lot of talk, and a lot of this comes from really good technologists.
00:27:06.960
So it's not from, like, malicious outsiders who are trying to screw us up.
00:27:10.540
It's our own fault, where we'll say, well, a lot of the jobs will go away because of
00:27:14.560
artificial intelligence and our robots, and that might either be some extreme case where
00:27:19.360
super-intelligent AI takes over the world and disposes of humanity, or it might just be
00:27:24.500
that only the most elite, smart, techie people are still needed, and everybody else becomes
00:27:29.820
this burden on the state, and they have to go on some kind of basic income.
00:27:32.960
And it's just a depressing—it's like everybody's going to become this useless burden.
00:27:39.460
And so even if that means, oh, we'll all get basic income, we won't have to work for
00:27:42.820
a living, there's also something fundamentally undignified, like you won't be needed.
00:27:47.020
And any situation like that is just bound to be a political disaster or an economic disaster
00:27:52.380
on many levels we can go into if it isn't obvious.
00:27:54.720
But the thing to see is that this economic hole that we seem to be driving ourselves into
00:28:01.760
is one and the same as the information wants to be free.
00:28:04.820
Because the thing is, ultimately, all these AIs and robots and all this stuff, they run
00:28:09.720
on information that, at the end of the day, has to come from people.
00:28:14.960
But for a lot of them, there's input from a lot of people.
00:28:18.800
So if we say that information is free, then we're saying in the information age, everybody's
00:28:25.120
worthless because what they can contribute is information.
00:28:29.080
The example I like to use as just an entry point to this idea is the people who translate
00:28:34.880
So they've seen their careers be decimated, they're a tenth of what they were, in the same
00:28:40.920
way that recording musicians and investigative journalists and many other classes of people
00:28:50.620
They've all been kind of reduced under this weird regime we've created.
00:28:54.480
But the thing is, in order to run the so-called AI translators that places like Bing and Google
00:29:01.800
offer, we have to scrape tens of millions of examples from real-life people translating
00:29:07.380
things every single day in order to keep up with slang and public events.
00:29:13.580
You can't just stuff a language translator with examples once.
00:29:18.560
And so we're totally reliant on the very people that we're putting out of work.
00:29:22.220
So it's fundamentally like a form of theft through dishonesty.
00:29:28.900
I want to back up for a second and try to perform an exorcism on some bad intuitions here because
00:29:36.940
I think people come into this having trained themselves to expect much of our digital content to be free.
00:29:45.140
And it now seems just the normal state of the world.
00:29:49.580
And of course, podcasts and blogs and journalism and ultimately music should be free.
00:29:55.940
Or if it's not free, it should be subsidized by ads.
00:30:00.100
And I think there's this sense that TV and radio were free.
00:30:09.040
But I think people feel, you know, what's wrong with ads?
00:30:11.940
Some ads are kind of cool looking and amusing and stylish.
00:30:18.600
And then there are these other elements, like having a personalized news feed.
00:30:27.880
So let's just bring the role of ads back in here.
00:30:32.280
So most people have decided that, in the face of this, the way to monetize work and inspire creation is through ads.
00:30:45.720
And this answers the need to have information be free for all of the young people who want it that way.
00:30:53.280
And, you know, now we who used to be young still want to get it that way.
00:30:57.600
And this is something that many of us are fighting against, those of us who've been
00:31:01.460
paying attention to the consequences of relying on ads.
00:31:05.560
And, you know, I've decided that I can't credibly read ads on this podcast.
00:31:09.500
I know that you're more sanguine about the state of podcasting than most other forms of media.
00:31:15.340
And I should say that, because I've taken a position against
00:31:20.000
ads on my own podcast, many people come to me wanting to do the same.
00:31:24.480
And the truth is, I don't actually even know what to tell other podcasters at this point,
00:31:30.000
because I think I'm an outlier in this space where it works for me.
00:31:33.920
I've found an audience, and some percentage of that audience will support this work.
00:31:39.000
But it seems to me by no means straightforward to say that any podcaster who wants to
00:31:44.900
will find an audience to support their work.
00:31:47.500
And given the current expectations, I think anyone who does decide to forego ads
00:31:54.260
will be paying an economic price for doing that, with whatever audience at whatever
00:31:59.940
scale, given the expectation that podcasts should be free.
00:32:03.680
So it's kind of hard to advise people, even when I'm successfully implementing an ad-free model myself.
00:32:15.960
My objection is not to advertising, but to continuous behavior modification by algorithm.
00:32:25.980
So what overlaps, in one case, is that I'm worried as a
00:32:31.940
podcaster about the behavior modification, or the perceived behavior modification, that
00:32:38.020
can happen to me as just a broker of information.
00:32:41.900
It's like a credibility concern.
00:32:44.400
Given what I'm trying to do here, I don't feel that I can
00:32:50.260
personally shill for any products. But I think other podcasters can; I think it's completely
00:32:55.480
convergent with the brand of other podcasters to say, oh, listen, here's the product.
00:33:02.280
You're going to want this t-shirt, and that works for them.
00:33:05.360
I've been listening to some of the podcasters who have to read their ads.
00:33:12.960
It's actually kind of entertaining. But the thing is, as long as every listener hears the
00:33:18.000
same ad and everybody can understand what's going on, that's okay.
00:33:22.160
I mean, the reason podcasting is still, in my view, an unmolested, authentic medium is that
00:33:30.080
there aren't algorithms calculating what somebody hears on a podcast.
00:33:38.460
And if it includes ads, people can tell it includes ads.
00:33:41.800
It isn't, there isn't some meta podcast that's taking snippets and creating a feed for people.
00:33:47.000
There isn't some algorithm, at least so far, that is changing what you
00:33:53.940
say with audio signal processing technology to suit the needs of somebody who's paying.
00:34:02.940
There's not a calculation of a feed designed by behaviorist theorists to change people.
00:34:10.020
And as long as it's just a crafted thing, even if it includes commercial communication,
00:34:18.180
I think it's okay. It does start to destroy society when everything becomes really manipulative and
00:34:24.780
creepy in a way that people can't possibly follow or understand; then it starts to undermine trust.
00:34:31.720
And that's exactly what's going on with social media companies and the way searches are run and
00:34:37.240
the way YouTube videos are selected for you and fed to you, and many other examples.
00:34:43.460
And that's where we really have the most serious problem.
00:34:50.820
What, if you could reboot the internet, how would you do it?
00:34:56.080
The first thing I would do is encourage everybody involved to gradually bring money
00:35:03.160
back into the world of information instead of expunging it.
00:35:06.300
And I think people should be able to earn a living when what they add to the network is valuable.
00:35:13.400
I mean, right now we're creating the most valuable companies in history based on the information people contribute for free.
00:35:18.760
And meanwhile, we're creating more and more economic separation, more and more inequality.
00:35:25.800
And the only way to correct it is to start paying the people who are adding the information.
00:35:32.240
It doesn't mean that I think the big tech companies should be shut down or that they're evil.
00:35:38.000
It just means that we have to get back to a world where, when people add value, they get paid for it.
00:35:44.820
And of course, the flip side of that is, just as Netflix proved, and for that matter
00:35:49.920
Apple with the App Store and many other examples, we have to encourage business models where people pay for what they value.
00:35:55.860
So, you know, Google should say: hey, search won't be free after 10 years.
00:35:59.940
We're going to gradually start removing the free option.
00:36:02.680
And what you'll get in exchange for that is no more commercial bias and crap in our search results.
00:36:12.200
We're going to, we're going to commit to not having any ads in 10 years and yeah, you'll
00:36:17.260
start paying for it, but it'll be a great deal.
00:36:19.580
You'll get peak Facebook, just like you got peak TV from subscription services.
00:36:26.700
We're going to give you peak social media, where you can get better information and less manipulation.
00:36:32.180
But the other part of that is a little more complicated, which is, if
00:36:37.760
you keep your eye out for a piece I have coming out with a colleague in the Harvard Business
00:36:41.700
Review; sorry, I know it's a snobby thing, but anyway, it's a place to start.
00:36:46.220
We're starting to scope out how to do this in much more detail than before.
00:36:51.200
And a lot of it has to do with creating in-between institutions.
00:36:56.180
Right now, if there's nothing but a bunch of individuals and one giant tech platform like
00:37:01.380
a Facebook or a Google, there's this bizarre situation where we're petitioning the central
00:37:05.880
authority, which we have no other power over and didn't vote for, to police our own speech.
00:37:15.720
And the way around that is to create middle-sized organizations that are analogous
00:37:20.780
to things like scientific journals or universities or trade unions or many other examples where
00:37:26.280
you can voluntarily join these things, and they collectively bargain
00:37:31.000
for you so you can get paid decently instead of having a giant race to the bottom.
00:37:34.660
And they can become brands in themselves that enforce quality and become trustworthy.
00:37:41.860
And so we have to recreate this sense of intermediate structures.
00:37:45.540
And remember, in the past, before the internet, the place where excellence and compassion
00:37:52.860
and trustworthiness came from was not the central government declaring it, but rather things like
00:37:58.060
universities and scientific journals and high-quality news outlets developing a reputation.
00:38:05.220
But that was all voluntary.
00:38:10.320
And so if you have in-between sized organizations, you can have all these effects that would be
00:38:15.960
authoritarian if they were global and directed from the center.
00:38:19.140
And all of those institutions are exactly the ones that were weakened and destroyed when Facebook
00:38:23.900
said, we're going to move fast and break things.
00:38:25.860
The stuff that was broken was all of those in-between organizations.
00:38:29.400
And so we have to rebuild them in a new way in order to have this more humane and sustainable future.
00:38:36.740
It's worth reminding ourselves after those two clips that social media is not entirely destructive.
00:38:49.100
There are personal stories of friendships, reconnections, knowledge growth, business opportunities,
00:38:55.160
and meaningful political change, which can be credited to the advent of social media.
00:39:00.760
And it can offer valuable real-time information.
00:39:04.460
So we'll try our best to emphasize hope and not throw out the proverbial baby with the bathwater.
00:39:11.700
On that note, we'll listen in now to Sam's conversation with Jack Dorsey.
00:39:17.140
Dorsey co-founded Twitter and is cognizant of the monster which he's created and the struggle to tame it.
00:39:25.460
Since this conversation with Sam, Dorsey stepped down from his role as CEO of Twitter, though
00:39:30.760
he's still the CEO of Square, which is a financial tool he also founded.
00:39:35.400
We'll resist the temptation to read the move away from Twitter as an admission of defeat in that struggle.
00:39:42.460
In this portion of their conversation, Sam and Dorsey discuss how Twitter has entrenched
00:39:47.420
itself into the political and journalistic environment, for better or worse.
00:39:52.820
Dorsey mentions the echo chamber or filter bubble phenomenon, which describes only seeing
00:39:58.000
and hearing news and opinion which coheres with your particular perspective.
00:40:03.420
This phenomenon tends to warp one's worldview and exacerbate partisanship.
00:40:08.440
After we hear from Dorsey, we'll offer an alternative analogy, which might be even more potent.
00:40:16.560
We're going to allow this clip to get into some of the specific policy knots that get
00:40:20.940
tied when any experiment like social media gets underway.
00:40:30.840
You've got these two massive companies which, at least from the public-facing view, seem diametrically
00:40:38.920
opposed in the level of controversy they bring to the world and to your life, presumably.
00:40:44.760
Square seems like a very straightforward, successful, noble pursuit, about which I can't imagine there's much controversy.
00:40:52.800
I'm sure there's some that I haven't noticed, but it must be nothing like what you're dealing with at Twitter.
00:40:59.180
How are you triaging the needs of a big company that is just functioning like a normal big
00:41:07.200
company and Twitter, which is something which on any given day can be just front-page news
00:41:14.820
everywhere, given the sense of either how it's helping the world or harming it.
00:41:20.100
The thing that's amazing about Twitter is that it's enabling revolutions that we might want
00:41:26.320
to support, right, or the empowerment of dissidents. And there was this one Saudi teenager
00:41:32.760
who was tweeting from a hotel room in the Bangkok airport that she was worried
00:41:38.680
that her parents would kill her, and I don't think it's too much to say that Twitter may have saved her life.
00:41:46.060
She was granted asylum in Canada, and I'm sure there are many other cases like this. And
00:41:50.860
so these stories become front-page news, and then the antithetical story becomes front-page
00:41:55.540
news. So we know that ISIS recruits terrorists on Twitter, or there are fears that
00:42:00.440
misinformation spread there undermines democracy.
00:42:04.180
And how do you deal with being a normal CEO and being a CEO in this other channel, which is anything but normal?
00:42:13.000
Well, both companies, in the spaces they operate in, have their own share of controversy,
00:42:20.980
but I find that in the financial realm it's a lot more private, whereas with communication, it's all public.
00:42:29.080
And I would prefer them both to be out in the open.
00:42:37.000
I'm fascinated by this idea of being able to work in public, make decisions in public, and make mistakes in public.
00:42:47.700
I was a huge fan of punk rock back in the day, and then that transitioned to hip-hop,
00:42:52.940
and that led me to a lot of open source, where people would just get up on stage and do their thing.
00:43:00.420
And you saw them a month later, and they were a little bit better, and then a month later, better still.
00:43:05.400
And we see the same thing with open source, which led me to technology, ultimately.
00:43:10.600
So I approach it with the understanding that we're not here just to make one single
00:43:19.380
statement that stands the test of time; our medium at Twitter is conversation.
00:43:26.560
And ideally, it evolves in a way that we all learn from it.
00:43:31.760
There's not a lot of people in the world today that would walk away from Twitter saying that their time there was well spent.
00:43:37.980
And we need to figure out what element of the service and what element of the product
00:43:47.220
we need to bolster or increase or change in order to do that.
00:43:52.000
So I guess, in my role as CEO at Twitter, the question is how do I lead this company in the open, realizing
00:44:02.520
that we're going to take a lot of bruises along the way.
00:44:05.280
But in the long term, what we get out of that, ideally, is earning some trust.
00:44:12.840
And we're not there yet, but that's the intention.
00:44:17.720
If you'd like to continue listening to this conversation, you'll need to subscribe at samharris.org.
00:44:25.660
Once you do, you'll get access to all full-length episodes of the Making Sense podcast, along
00:44:30.160
with other subscriber-only content, including bonus episodes and AMAs and the conversations I've been having on the Waking Up app.
00:44:37.780
The Making Sense podcast is ad-free and relies entirely on listener support.