Making Sense - Sam Harris - February 05, 2019


#148 — Jack Dorsey


Episode Stats

Length

33 minutes

Words per Minute

157.9

Word Count

5,341

Sentence Count

295


Summary

Jack Dorsey is the CEO of Twitter and Square, the company that provides financial services to millions of people around the world. In this episode of the Making Sense podcast, I speak with Jack about how he thinks about Twitter's role in the world, how the platform has grown, the part it now plays in journalism, and what he and his team are doing about the toxicity on the service. We discuss what makes conversation healthy, the logic by which Twitter suspends people, and the reality of downranking and "shadow banning." I briefly make my case for banning Trump from the platform, and we talk about Jack's practice of meditation. I also explain why I consider this interview a missed opportunity: it was recorded a week before the Covington Catholic story broke, and before Jack's subsequent interviews, including his appearance on Joe Rogan's podcast, for which Joe was criticized for not pushing Jack hard enough. We don't run ads on the podcast, and it's made possible entirely through the support of our listeners. If you enjoy what we're doing here, please consider becoming a subscriber at samharris.org.


Transcript

00:00:00.000 Welcome to the Making Sense Podcast.
00:00:08.820 This is Sam Harris.
00:00:10.880 Just a note to say that if you're hearing this, you are not currently on our subscriber
00:00:14.680 feed and will only be hearing the first part of this conversation.
00:00:18.420 In order to access full episodes of the Making Sense Podcast, you'll need to subscribe at
00:00:22.720 samharris.org.
00:00:24.060 There you'll find our private RSS feed to add to your favorite podcatcher, along with
00:00:28.360 other subscriber-only content.
00:00:30.520 We don't run ads on the podcast, and therefore it's made possible entirely through the support
00:00:34.640 of our subscribers.
00:00:35.880 So if you enjoy what we're doing here, please consider becoming one.
00:00:46.780 Today I'm speaking with Jack Dorsey.
00:00:49.340 Jack is the CEO of Twitter and Square.
00:00:53.340 We don't spend a lot of time talking about Square.
00:00:55.520 We get into the details of Twitter.
00:00:58.460 We talk about the role that Twitter plays in journalism now, how it's different from other
00:01:04.760 social media, how Jack and the rest of his team are attempting to reduce the toxicity on
00:01:10.860 their platform.
00:01:11.680 We talk about what makes conversation healthy, the logic by which Twitter suspends people,
00:01:18.680 the reality of downranking and, quote, shadow banning.
00:01:21.780 I briefly make my case for banning Trump from the platform.
00:01:26.440 We talk about Jack's practice of meditation.
00:01:30.000 Anyway, I must say, I consider this interview a missed opportunity.
00:01:35.340 We really were the casualty of timing here, more than anything else.
00:01:40.400 Because we recorded this conversation a week before the Covington Catholic High School circus,
00:01:46.560 which, as you know, exemplified more or less everything that's wrong with social media at
00:01:51.660 this moment, and Twitter in particular.
00:01:54.240 If you recall, it really seemed in that week that Twitter accomplished something like the
00:01:59.780 ruination of journalism.
00:02:02.000 So that would have been great to talk about, and our silence on that topic will be ringing
00:02:06.540 in your ears.
00:02:07.160 So much of what we talked about with respect to Twitter's policy around suspending people
00:02:13.080 and the politics of all that really could have been sharpened up had we had a time machine.
00:02:20.480 We also had this conversation before some other interviews with Jack came out,
00:02:25.640 which I've since read in Rolling Stone.
00:02:29.200 And also he went on Joe Rogan's podcast in the interim.
00:02:33.120 And Joe, as you know, streams everything live.
00:02:34.980 So, um, I've seen the aftermath of all that, and Joe reaped a whirlwind of criticism for
00:02:43.060 not having pushed Jack hard enough.
00:02:45.120 I think he's going to have Jack back on his podcast.
00:02:48.040 I'm actually going to be on Joe's podcast later in the week, and I'm sure we'll talk about
00:02:51.780 all this.
00:02:53.040 But all that notwithstanding, I really enjoyed talking to Jack.
00:02:56.560 One thing I want to make clear, because I saw some of the pain that Joe was getting from
00:03:01.120 his audience, many people were alleging that Joe must have agreed not to push Jack on certain
00:03:08.340 points.
00:03:09.580 I can't speak for Joe, but I must say Jack had no restrictions at all on this conversation.
00:03:15.940 He was eager to talk about anything I wanted to raise.
00:03:19.360 There were no edits to it.
00:03:21.360 He didn't request any.
00:03:22.320 So, he's totally willing to have a conversation about where Twitter has been and where it's
00:03:31.340 going.
00:03:32.420 You'll hear that he is quite good at pirouetting around any concern a person raises.
00:03:40.620 You'll certainly witness that in this conversation, and it was there to be seen in Joe's and in
00:03:45.820 all these subsequent interviews that I've seen.
And, you know, he really does offer more or less a full mea culpa on many of these
00:03:53.400 points.
00:03:53.720 You talk about how toxic Twitter is, and he fully acknowledges it.
00:03:57.780 You talk about how inscrutable the policy is around banning and how it lacks transparency,
00:04:04.960 and he fully owns that.
00:04:06.340 And so there's really, there's not that much to get from him on those points apart from his
00:04:12.820 stated commitment to fixing all of these problems that he acknowledges.
00:04:18.660 So, you know, I don't know what Joe's going to get out of him on a second pass, but given
00:04:23.640 the time I had this conversation with Jack, I really can't express too much regret, but
just in light of what's happened in the last few weeks, I would certainly want to tighten
the screws a little bit on a few of these points.
00:04:37.360 That said, I really enjoyed the conversation with Jack, and I hope you do too.
00:04:42.820 And now I bring you Jack Dorsey.
00:04:50.700 I'm here with Jack Dorsey.
00:04:52.080 Jack, thanks for coming on the podcast.
00:04:53.760 Thanks for having me.
00:04:54.820 This is an interesting conversation for me to approach because I think we're going to
00:04:57.880 talk about some things that I'm a little concerned you don't want to talk about, and I'm just
00:05:02.380 going to forge ahead.
00:05:03.540 I want to talk about everything.
00:05:04.860 Okay.
00:05:05.240 But then I think we'll get into things that, areas of mutual interest that I think we'll
00:05:10.040 both be very happy to talk about.
00:05:11.140 So let's start with the weird stuff and just how difficult your job is, or at least how
00:05:17.680 difficult your job appears to me to be.
00:05:20.800 Obviously, you have two jobs.
00:05:21.960 You've got this dual CEO role with Square and Twitter.
00:05:25.360 I don't know very much about Square.
00:05:27.160 I mean, perhaps you can introduce how you think of your job there, but we're going to talk about
00:05:32.460 Twitter almost exclusively.
00:05:33.840 So I just, I guess to start, how do you think of your career at this point, and how are you
00:05:40.240 managing, and I'm sure this is a question you've gotten a lot, but how are you managing
00:05:43.500 this dual CEO life?
00:05:47.320 A lot of it is experimenting and learning.
00:05:51.460 All the experiences that I've had at both companies have definitely formulated how I act
00:05:57.280 every day, and it's pushed me to focus first on my health, and a lot of that has to do with
00:06:07.820 mental health, and just how I can be aware and productive and observant throughout the
00:06:18.320 day.
00:06:19.000 A big part of that for me has been meditation, which I would hope to talk to you about.
00:06:23.420 Yeah, that's what I'm looking forward to talking about.
00:06:26.160 So we'll save that for the end, something I look forward to.
00:06:30.360 First the pain, then the meditation.
00:06:31.960 First the pain and observing the pain.
00:06:35.080 But a lot of it has just been doing it, and today I don't really segment the parts of
00:06:43.540 my day.
00:06:44.080 It's one job, this is my life, and I know that the companies will benefit, and the people
00:06:50.940 that we serve will benefit from me focusing on consistent self-improvement, and that starts
00:06:57.180 with how I think about things, and that starts with the mindset I bring to my work, and that's
certainly evolved over the past, Twitter will be 13 years in March, thinking about skipping
the 13th year, like they skip the 13th floor in buildings, but it'll be 13 years in March,
00:07:17.360 and Square will be 10 years old this February.
00:07:22.340 But a lot of the balance between the two is possible, one, because of the team I've been
00:07:30.640 fortunate enough to assemble, and it took some iterations, but also how similar they are in
00:07:38.980 different mediums, Twitter is obviously focused on communication, and our purpose is serving
00:07:49.000 a public conversation.
00:07:50.760 We think we're very unique in that regard, and there's a lot of dynamics that are quite
00:07:56.300 powerful, and a lot of dynamics that can be taken advantage of, which we'll talk about.
00:08:00.820 Square, on the other hand, is around economic empowerment.
00:08:03.620 And one of the things that we saw early on in 2009 was that people in this country, and
00:08:09.960 certainly this is reflective of the rest of the world, were being left out of the economy
because they were left with access only to slower mediums, like paper cash, while
00:08:19.940 the world was moving on to more digital.
00:08:22.560 And we are serving an underserved audience.
00:08:26.920 We started with sellers.
00:08:27.840 We're now moving to individuals.
We have this app called the Cash App, and for a significant percentage of the people
using it, it's their only bank account.
00:08:41.120 And it's been a really powerful example of utilizing technology to provide access to people.
00:08:50.340 And it's needed in so many ways in how we organize our financial lives and how people make a living.
00:09:02.360 And, you know, as you've talked about on some of your podcasts, these systems have been under
00:09:09.680 a lot of central control in the past, and a lot of that centralized control has removed access
00:09:16.800 from people or not even created the potential to do so.
00:09:19.560 So one of the things we found in Square in the early days is the only way you could start
00:09:24.160 accepting credit cards was if you had a good credit score.
00:09:27.160 And a lot of entrepreneurs who are just getting started, they don't have a good credit score.
00:09:30.440 I didn't have a good credit score when we started Square.
00:09:32.380 I was massively in debt to credit cards, actually.
00:09:35.320 So by shifting that, using better technology, making it more inclusive, we were able to serve
00:09:40.320 a lot more people that the industry just wasn't able to.
00:09:43.100 Right.
00:09:43.220 So you've got these two massive companies, which at least from the public-facing view,
00:09:48.940 seem diametrically opposed in the level of controversy they bring to the world and to
00:09:55.320 your life, presumably.
00:09:56.720 I mean, Square seems like a very straightforward, successful, noble pursuit about which I can't
00:10:03.400 imagine there's a lot of controversy.
00:10:04.760 I'm sure there's some that I haven't noticed.
00:10:06.660 But it must be nothing like what you're dealing with with Twitter.
00:10:11.940 How are you triaging the needs of a big company that is just functioning like a normal big
00:10:19.180 company and Twitter, which is something which on any given day can be just front-page news
everywhere, in the sense of either how it's helping the world or harming it?
00:10:32.040 I mean, the thing that's amazing about Twitter is that it's enabling revolutions that we might
00:10:38.040 want to support, right?
00:10:39.200 Or the empowerment of dissidents.
00:10:42.180 And there's just this one Saudi teenager who was tweeting from a hotel room in the Bangkok
00:10:48.900 airport, that she was worried that her parents would kill her.
And I don't think it's too much to say that Twitter may have saved her life in that case;
she was granted asylum in Canada. I'm sure there are many other cases like this.
00:11:02.780 And so these stories become front-page news.
00:11:05.360 And then the antithetical story becomes front-page news.
So we know that ISIS recruits terrorists on Twitter, or there are fears that misinformation
00:11:13.040 spread there undermines democracy, and we'll get to Trump, but how do you deal with being
00:11:21.260 a normal CEO and being a CEO in this other channel, which is anything but normal?
00:11:27.200 Well, both companies in both spaces that they create in have their own share of controversy.
00:11:35.580 But I find that in the financial realm, it's a lot more private, whereas with communication,
00:11:41.580 it has to be open.
00:11:43.260 And I would prefer them both to be out in the open.
00:11:47.920 I would prefer to work more in public.
00:11:51.180 I'm fascinated by this idea of being able to work in public, make decisions in public,
00:11:57.140 make mistakes in public.
00:11:58.500 And I get there because of my childhood.
00:12:01.880 I was a huge fan of punk rock back in the day, and then that transitioned to hip-hop.
And that led me to a lot of open mics, where people would just get up on stage and do their
00:12:13.320 thing, and they were terrible.
00:12:14.580 And you saw them a month later, and they were a little bit better.
00:12:17.920 And then a month later, they were a little bit better.
00:12:19.580 And we see the same thing with open source, which led me to technology, ultimately.
So I approach it with that understanding that we're not here just to make one single
00:12:33.540 statement that stands the test of time, that our medium at Twitter is conversation, and
00:12:39.240 conversation evolves.
00:12:40.740 And ideally, it evolves in a way that we all learn from it.
There's not a lot of people in the world today that would walk away from Twitter saying,
oh, I learned something, but that would be my goal.
00:12:53.020 And we need to figure out what element of the service and what element of the product
00:13:01.400 we need to bolster or increase or change in order to do that.
00:13:06.040 So I guess in my role of CEO of Twitter, it's how do I lead this company in the open, realizing
00:13:16.680 that we're going to take a lot of bruises along the way.
00:13:19.600 But in the long term, what we get out of that, ideally, is earning some trust.
00:13:27.020 And we're not there yet, but that's the intention.
00:13:32.860 Well, on the topic of I learned something, actually, this is one of my, this is actually
00:13:36.900 the only idea that I've ever had for improving Twitter, which is to have a, in addition to
00:13:42.760 a like button, this changed my mind button, or I learned something button, so that you
00:13:48.700 can track, I mean, one, it would just kind of instantiate a new norm where people tweeting
00:13:54.980 would aspire to have that effect on people.
00:13:57.020 Like, this is, it's actually about dialogue.
00:13:59.560 It's about debate.
00:14:00.960 So I give that to you.
You can do what you want with it. Actually, I had one other recommendation for you: to de-platform the
00:14:05.780 President of the United States, which I noticed you haven't taken me up on.
00:14:08.120 One of the, one of the ideas we had way back in the day, there was instead of a, we had
00:14:14.460 a, the button was actually called favorite before it was called like.
00:14:19.120 We transitioned to like, I think at one of our most reactive phases within the company.
00:14:24.060 We were drafting from a known behavior that you saw on Facebook and Instagram and whatnot.
00:14:29.600 But we were going to, there was a proposal to change it to thanks, which I like a lot.
I think it kind of gets at some of the things you're trying to express. The degree to which
you're influencing someone's thinking or changing someone's mind is another level.
00:14:49.540 But to build a service that people can express gratitude for things they find valuable more
directly instead of the emptiness of a like button is something that we are thinking a
00:15:00.940 lot about right now.
00:15:01.780 Right.
00:15:02.240 The incentives are where we are in the conversation.
00:15:05.440 We realize that what we need to do is not going to be done by changing policy.
00:15:12.980 What we need to do is look fundamentally at the mechanics of the service that we haven't
00:15:17.840 looked at in 12 years.
The fact that we have one action to follow, and it's following accounts. In the example
of Brexit, for example, if you followed a bunch of accounts that were spouting
00:15:33.620 off reasons to leave, that's all you get.
00:15:37.520 You have no other ability to see another perspective of the conversation unless you did the work to
00:15:42.760 follow the account of someone who was opposed to that view.
00:15:47.420 Whereas we do have the infrastructure in the service right now in the form of search and
00:15:53.140 trends.
00:15:53.960 And if you were to follow the vote leave trend, 95% of the conversation would be reasons to
00:16:00.580 leave, but 5% would be some considerations to make to stay.
00:16:06.020 But we don't make it easy for anyone to do that and therefore no one does it.
So these are exactly the things we're looking at, in terms of: is "like" really the thing
that helps contribution back to the global conversation?
00:16:21.520 My own personal view is that it doesn't.
My own personal view is it's empty, and it's a lot more destructive than we considered
it to be when we said, well, you know, everyone knows how to take this action, so we should put it
on our service as well.
00:16:37.640 As you were talking, it made me think you could have a kind of dashboard that showed people
00:16:43.020 how siloed they were in terms of partisan information.
Like, people may not know that they're getting only one side of a story.
00:16:51.580 Well, we actually saw that in the 2016 elections.
00:16:54.180 We did some research of the connections.
00:16:56.060 We've been spending a lot more time not looking at the content that people are saying, but the
00:17:00.880 behaviors and the connections between accounts and interactions and replies.
00:17:05.220 And one of the things that was very evident during the lead up to the election was just
00:17:12.540 looking at our journalist constituency, which is one of the most important constituencies
00:17:18.280 on Twitter to my mind.
00:17:20.360 The amount of journalists on the left who were following folks on the right end of the
00:17:27.040 spectrum was very, very small.
00:17:29.140 The amount of journalists on the right end of the spectrum following folks on the left
00:17:34.720 was extremely high.
00:17:37.080 That's interesting.
00:17:38.040 Even just that factoid is worth getting out there.
00:17:40.940 There's a good graphic that an MIT lab called Cortico put out that illustrates this effect.
00:17:47.860 And you can immediately see what happened at least in the media sphere in terms of these
00:17:54.500 these filter bubbles and echo chambers that we tend to create.
00:17:58.820 But that is something that I do take a lot of responsibility around.
00:18:04.900 We have definitely helped to create these isolated chambers of thought.
00:18:09.280 And it's because of the mechanics of how our system works.
00:18:12.720 Just the simplest thing of emphasizing the follower count, only allowing the following of
00:18:19.720 an account versus an interest, a topic, or a conversation.
00:18:23.260 These are the things that don't allow any fluidity and evolution.
00:18:28.540 It's very, very rigid.
00:18:29.900 And you have to do a lot of work to get to some of the fluidity that we know Twitter is,
00:18:36.240 but you have to be an expert to understand that it's even possible.
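To make the siloing idea above a little more concrete, here is a minimal illustrative sketch in Python of the kind of "how siloed is my feed?" dashboard Sam suggests. The account names, leanings, and scoring rule are all invented for illustration; the actual Cortico/MIT analysis mentioned in the conversation is far more involved.

# Hypothetical sketch of a "how siloed am I?" score for a followed-accounts list.
from collections import Counter
from typing import Dict, List

def silo_score(followed: List[str], leanings: Dict[str, str]) -> float:
    """Share of followed accounts belonging to the user's dominant side.
    1.0 means a fully one-sided feed; values near 0.5 mean a mixed feed."""
    counts = Counter(leanings.get(account, "unknown") for account in followed)
    counts.pop("unknown", None)   # ignore accounts with no known leaning
    total = sum(counts.values())
    if total == 0:
        return 0.0
    return max(counts.values()) / total

# Made-up example: a feed that is three-quarters one-sided scores 0.75.
leanings = {"@a": "left", "@b": "left", "@c": "left", "@d": "right"}
print(silo_score(["@a", "@b", "@c", "@d"], leanings))  # 0.75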
00:18:41.200 Right.
00:18:41.800 Well, yeah, so you were talking about the different constituencies on it.
00:18:44.380 And that's one thing that makes Twitter unique, that it really seems like the platform where
00:18:49.780 real journalists and real intellectuals and newsmakers, they're relying on it for conversation.
I mean, they're relying on it both as a kind of real-time response to things that are happening
00:19:02.860 in the world and as a way of just divulging things that are happening in the world and
00:19:06.720 a way of sharing their opinions.
00:19:10.880 And in that sense, it seems completely unlike every other social media platform to me.
00:19:17.480 And so I have this love-hate relationship, as many people do with Twitter.
00:19:21.260 I have just a hate-hate relationship with all the other social media platforms.
00:19:24.880 I mean, I've never been tempted to use them.
00:19:27.000 Well, at least we're halfway there.
00:19:27.920 But Twitter, I step away from it, and we can talk about just how you, even how you relate
00:19:35.340 to Twitter psychologically, but the idea of not being on it just seems like a non-starter
00:19:41.420 now because it is, it's almost like a public utility.
00:19:45.160 It really is just the, it is the one place where you can, you're guaranteed to see a response
00:19:51.860 to news events that you have curated, and it can be as good, really, or as informative
00:19:59.380 as you've curated it.
00:20:01.300 What do you think accounts for the adoption of Twitter by those groups?
00:20:06.560 And I mean, it's just integrated into, like, even television news has to use Twitter to
00:20:13.380 sort of leverage the conversation about what they're putting out, and they don't do that
00:20:17.240 with Instagram or Facebook.
00:20:19.280 Is it just the short form?
00:20:21.700 What made Twitter so sticky in the beginning?
00:20:24.020 Was it the 140 characters?
00:20:25.880 I think it's a few things.
00:20:26.820 I think, I don't believe we're a social network.
00:20:30.160 Social things happen on us, but my definition of a social network would be one that is dependent
00:20:36.040 upon the people that you know, you know, the graph of your past or your current career
00:20:41.920 or your future aspirations in terms of who you want to work with or who you want to be
00:20:47.640 with or whatnot.
00:20:48.220 And we don't benefit from the address book in your phone.
00:20:51.040 We benefit from more of an interest-based network.
00:20:55.440 We benefit because you're interested in something.
00:20:58.080 And because of that, there's no deliberate join or leave of any one particular community.
00:21:07.920 Simply talking about a topic puts you in it.
00:21:11.040 And the whole dynamic of Twitter enables that.
00:21:15.860 And that's extremely powerful, but it's also extremely complex for people.
00:21:23.160 And I think one of the reasons why journalists took to it so quickly is because it serves
00:21:32.020 as this, it's certainly a marketplace of ideas.
00:21:35.580 It certainly has, you know, people have similar expectations as they would a public square where
00:21:41.380 ideas are discussed and evolved and debated.
00:21:45.700 So it takes on a lot of characteristics of that because of the dynamic of it,
00:21:50.100 because of the real-time nature, because of the public nature.
00:21:52.060 But I think it serves as this in-between-the-articles function.
And, you know, we had journalists write an article, broadcast it with Twitter, and then get into
00:22:05.060 conversation to get more perspectives, get different ideas, make corrections, make clarifications.
00:22:10.800 But then we also noticed something really interesting is that it really unlocked the journalists from
00:22:16.200 their publication.
00:22:17.260 So I've watched in the nearly 13 years, journalists that I follow go from a smaller blog to a BuzzFeed
00:22:26.520 to a New York Times to another institution.
00:22:29.240 And it became interesting to just follow them as a person rather than the publication that
00:22:36.440 they work for.
00:22:36.980 And I think that felt very freeing to a number of the journalists I've talked to about it.
00:22:43.900 It wasn't about the fact that I'm at the New York Times.
00:22:47.080 It was the fact that I'm doing great investigative journalism, and I have a direct connection with
00:22:52.660 my readers and my sources, and maybe even sources that I didn't know were going to be sources
00:23:01.560 because of the openness, because of the public nature of the service.
00:23:05.560 So I think that was a big part.
00:23:08.400 The constraint has had other ramifications.
00:23:11.980 We were really big with comedians.
00:23:14.720 That was a big wave, I think, because of the rhythmic nature of the constraint.
00:23:20.220 Really big with the hip-hop community for the exact same reasons.
We don't see as many poets this day and age, but the constraint would have been great for
anyone with a poetic bent. But it also, to the negative, created more of a headline, outrage, fast-take
00:23:39.340 kind of approach and culture.
00:23:42.260 And the expansion to 280 has helped with that.
We haven't seen an increase in length for organic tweets: when you send a tweet out just
as a broadcast, people typically don't go over the original 140-character constraint,
00:23:58.220 but when they reply to someone else, they do.
00:24:00.640 And that's where the 280 really matters, is because it allows for a little bit more nuance.
00:24:04.980 And those are the sorts of things we're looking at.
00:24:07.120 The journalists, I believe, were using it as a way to exist in between their work, and
00:24:15.480 also to have conversations with their peers about what's interesting.
00:24:20.100 And there's some positives and negatives for that.
00:24:22.600 What's the philosophy around not letting people edit tweets?
00:24:26.400 Now that I have you here, I'm just going to download all my customer service complaints.
00:24:30.300 When I type a typo and discover it six hours later, why can't I correct that typo?
It's going to sound like a really boring answer, but I'm going to give you the context for it.
00:24:40.600 So we were born on SMS.
00:24:43.500 We were born on text messaging.
00:24:45.280 And you could view Twitter as, what if you could text with the world?
00:24:50.460 What if you could have a text conversation with the entire world?
00:24:53.800 With a text, you can't correct.
00:24:56.540 Once it's sent, it's sent.
00:24:57.760 It's gone.
00:24:58.300 And you build on top of it.
00:25:00.380 You evolve it.
00:25:01.920 You carry on the conversation.
00:25:03.260 We obviously were not limited by that, but we built our system so that when you send
00:25:09.940 a tweet, it immediately starts fanning it out.
00:25:13.060 So as soon as you send that, a lot of the potential damage is done.
00:25:17.460 So for us to introduce that edit, and these are things that we're looking at.
00:25:22.300 These are things that we're considering and whatnot.
00:25:25.260 But for us to introduce edit for a common use case of, I made a mistake.
00:25:29.060 I need to fix a link because I sent out the wrong one.
00:25:33.400 It adds a delay into the system.
00:25:35.480 And that's good in some context.
00:25:38.380 For a lot of the things that you tweet about, it's probably what you want.
00:25:42.800 But there's all these Twitters.
00:25:46.900 There's your Twitter, which you've built by following who you follow.
00:25:51.280 There's politics Twitter, which is a very, very different experience this day and age.
00:25:56.340 There's NBA Twitter, which is super exciting, but very real time.
00:26:01.240 And people use it while they're watching the game, and it becomes the roar of the crowd.
00:26:04.360 So even a 30-second delay in a tweet is meaningful.
00:26:09.360 So that's a consideration we need to make.
00:26:10.920 We need to make another consideration for another use case people want,
00:26:14.460 in that you might tweet something.
00:26:16.240 You want to go back to it a week later and correct something.
00:26:20.340 But meanwhile, people have retweeted it.
00:26:22.360 And it might be a point of view that you've taken on, and they've retweeted that point of view.
00:26:25.480 And then you decide to do something a little bit devious, and you change the point of view.
So they have then retweeted something whose message you've completely changed.
So that requires a changelog or some notification that this tweet has changed substantially,
and the author might now be saying something that you don't agree with anymore.
00:26:43.140 Right.
00:26:43.500 It's easy to say how people could game that.
00:26:45.340 You could have somebody who tweets something very sticky and innocuous,
00:26:49.600 and then they flip it to the next neo-Nazi meme that they want spread.
00:26:56.200 Exactly.
00:26:56.820 And then the final use case we're looking at is clarifications.
00:27:01.180 And that is this current moment where people are digging up tweets from 10 years ago or 5 years ago
and canceling the original tweeter and canceling their career or canceling various aspects of their life.
00:27:18.320 And we don't offer an ability for people to go back and say,
00:27:22.820 well, let me clarify what I meant.
00:27:24.800 And we do believe that's important, and we do believe we can help address it, but it just takes some work.
00:27:33.540 But the reason why it's taken us so long is because the majority of our systems are built in this real-time mindset with a real-time fan-out.
And we just want to be very deliberate about how we're solving these use cases and not just stop at "we need an edit button."
00:27:49.160 What are people actually trying to do, and let's solve that problem.
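As an aside, the two constraints Jack describes (tweets fan out to follower timelines the moment they are sent, and any future edit would need a visible changelog so retweeters can see the text has changed) can be sketched roughly as follows. This is a toy Python model, not Twitter's actual architecture; all names here are hypothetical.

from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class Tweet:
    tweet_id: int
    author: str
    versions: List[str] = field(default_factory=list)  # full edit history, oldest first

    @property
    def current_text(self) -> str:
        return self.versions[-1]

    @property
    def was_edited(self) -> bool:
        return len(self.versions) > 1

class TweetService:
    """Fan-out on write: follower feeds receive the tweet id at send time."""
    def __init__(self) -> None:
        self.tweets: Dict[int, Tweet] = {}
        self.feeds: Dict[str, List[int]] = {}

    def send(self, author: str, text: str, followers: List[str]) -> Tweet:
        tweet = Tweet(tweet_id=len(self.tweets) + 1, author=author, versions=[text])
        self.tweets[tweet.tweet_id] = tweet
        for follower in followers:  # immediate fan-out; a later edit arrives after delivery
            self.feeds.setdefault(follower, []).append(tweet.tweet_id)
        return tweet

    def edit(self, tweet_id: int, new_text: str) -> None:
        # Append a new version instead of overwriting, so anyone who retweeted
        # the original can be shown that the message has changed substantially.
        self.tweets[tweet_id].versions.append(new_text)

# Usage sketch:
service = TweetService()
t = service.send("@jack", "just setting up my twttr", followers=["@sam"])
service.edit(t.tweet_id, "just setting up my twitter")
print(t.was_edited, t.versions)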
00:27:53.340 Okay, so let's push into some of the areas of controversy here because it seems to me you have an extremely hard job,
00:28:00.540 and so it's hard to imagine how you can actually get it right,
00:28:05.560 and actually do it so well that you won't continuously have this ambient level of criticism about how you're doing it.
00:28:13.440 And the job is to figure out how to get a handle on the toxicity on your platform.
00:28:20.460 And this has so many forms, one could scarcely list them all,
00:28:25.100 but from trolling to harassing to conspiracy theories and misinformation and lies
00:28:32.900 to doxing to what is generally called hate speech,
00:28:38.280 but it is speech that is, in the political context, protected by the First Amendment,
00:28:42.820 at least in the United States, but you have a global platform subject to different laws in different countries.
00:28:49.400 How are you trying to deal with this problem?
00:28:53.080 And I mean, feel free to grab any specific strand of that.
00:28:56.200 I'll start by saying that the problem is more amplified in particular parts of Twitter.
00:29:05.400 It is definitely the case that it is rampant in politics, Twitter.
00:29:11.480 And it comes with a lot of patterns which we're now starting to see be more consistent.
00:29:21.040 Excellent. So first and foremost, just to take it up a few notches,
00:29:25.680 we were asked a question some time ago.
00:29:29.840 What if you could measure the health of conversation?
00:29:33.340 Could you measure the health of conversation?
00:29:35.040 In the same way that you can measure the health of the human body?
00:29:39.300 And we thought that was a very intriguing question
00:29:41.980 because we've all had conversations where we felt it to be just completely toxic.
00:29:47.340 And the result of that is ideally we walk away from it.
00:29:51.040 And we've also had conversations that feel empowering,
00:29:54.020 that we learn something from, and we want to stay in it.
00:29:56.460 And we actually see this digitally as well.
00:29:58.640 We see people walk away from conversations on Twitter.
00:30:01.660 And we see people stay in conversations and persist them on Twitter.
00:30:06.120 And we're to the point where we can actually see it in our numbers and measure it.
00:30:12.040 So we went a little bit deeper with that.
00:30:13.780 And this must be algorithmic, right?
00:30:16.180 We're not talking about individuals tracking.
00:30:17.800 It's not algorithmic, but then checked by people as well,
00:30:21.600 just to verify our models are working.
00:30:25.780 We took it a step further.
00:30:27.380 And so what is health?
00:30:29.800 Health has indicators.
00:30:32.300 Like your body has an indicator of health, which is your temperature.
00:30:36.500 And your temperature indicates whether your system more or less is in balance.
If it's above 98.6, then something is wrong.
00:30:44.480 And we need to figure out what the measurement tools are to figure out what that measurement is,
00:30:50.020 what that metric is, which is, in this case, the thermometer.
00:30:53.400 And then, you know, we go down the line.
00:30:55.480 And as we develop solutions, we can see what effect they have on it.
00:31:01.340 So we've been thinking about this problem in terms of what we're calling conversational health.
00:31:06.780 And we're at the phase right now where we're trying to figure out the right indicators of conversational health.
00:31:13.040 And we have four placeholders.
00:31:15.600 The first is shared attention.
00:31:18.860 So what percentage of the conversation is attentive to the same thing versus disparate?
00:31:25.340 The second is shared reality.
00:31:27.260 So this is not determining what facts are facts, but what percentage of the conversation are sharing the same facts.
00:31:36.020 The third is receptivity.
00:31:37.920 So this is where we measure toxicity and people's desire to walk away from something.
00:31:43.380 And the fourth is variety of perspective.
00:31:45.200 And what we want to do is get readings on all of these things and then understanding that we're not going to optimize for one.
00:31:55.320 We want to try to keep everything in balance.
00:32:01.020 And by increasing one, it probably has a negative effect on another.
00:32:04.880 So you could increase the variety of perspective, but decrease the shared reality in doing so.
00:32:11.560 So step one is getting a sense of what the current state is through measurement.
00:32:19.700 And a lot of that we intend to do through algorithms, measuring how people talk.
00:32:27.680 And then, of course, humans pairing with that to make decisions around solutions.
00:32:31.900 And, you know, in the same way that, like, you might be sick and I will offer you, you know, this bottle of water and also offer you a glass of wine.
00:32:43.220 Based on all of our experience, if you reach for the water and you drink the water, there's more probability that you limit the amount of time that your system is out of balance and you're not healthy.
00:32:53.960 If you choose the wine, you'll probably increase the time it takes.
00:32:57.500 So how would we think about giving people more options to at least drive towards more conversational health?
00:33:07.680 So that's the abstract level.
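For readers who want the four placeholder indicators pinned down a little more concretely, here is a minimal Python sketch, assuming each indicator has already been measured as a value between 0 and 1. The fixed floor and the simple balance check are illustrative assumptions; Twitter has not published how these metrics are computed or combined.

from dataclasses import dataclass

@dataclass
class ConversationalHealth:
    shared_attention: float        # fraction of the conversation attending to the same thing
    shared_reality: float          # fraction sharing the same facts (not judging which facts are true)
    receptivity: float             # willingness to stay in the conversation rather than walk away
    variety_of_perspective: float  # range of viewpoints present

    def indicators(self) -> tuple:
        return (self.shared_attention, self.shared_reality,
                self.receptivity, self.variety_of_perspective)

    def in_balance(self, floor: float = 0.3) -> bool:
        # Per the discussion above, the goal is not to maximize any single indicator
        # (raising variety of perspective can lower shared reality); a crude stand-in
        # is to require that none of the four falls below a minimum threshold.
        return all(value >= floor for value in self.indicators())

# Example: high variety of perspective but collapsed shared reality is flagged as out of balance.
print(ConversationalHealth(0.6, 0.2, 0.5, 0.9).in_balance())  # False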
00:33:10.920 At a tangible, tactical level, we're looking at being, we're looking at how people interact.
00:33:17.840 And you can subscribe now at samharris.org.
00:33:47.840 Thank you.