Making Sense - Sam Harris - September 24, 2020


#218 — Welcome to the Cult Factory


Episode Stats

Length

47 minutes

Words per Minute

179.768

Word Count

8,601

Sentence Count

368

Misogynist Sentences

2

Hate Speech Sentences

6


Summary

Tristan Harris is back on the podcast to talk about The Social Dilemma, a new Netflix documentary that explores the growing problem of social media and the fracturing of society. In this episode, Sam and Tristan discuss why we can't seem to converge on a shared understanding of what's happening so much of the time, how the advertising-driven attention economy undermines human weaknesses, and what it might take to get a grip on the problem. Sam also announces another Zoom call for subscribers on October 7th and notes some upcoming changes on the Waking Up app. Full episodes of the podcast are available to subscribers at samharris.org, and free accounts are granted to anyone who can't afford a subscription.


Transcript

00:00:00.000 Welcome to the Making Sense Podcast.
00:00:08.340 This is Sam Harris.
00:00:10.400 Just a note to say that if you're hearing this, you are not currently on our subscriber
00:00:14.280 feed and will only be hearing partial episodes of the podcast.
00:00:18.340 If you'd like access to full episodes, you'll need to subscribe at samharris.org.
00:00:22.980 There you'll find our private RSS feed to add to your favorite podcatcher, along with
00:00:27.580 other subscriber-only content.
00:00:30.000 And as always, I never want money to be the reason why someone can't listen to the podcast.
00:00:34.980 So if you can't afford a subscription, there's an option at samharris.org to request a free
00:00:39.480 account, and we grant 100% of those requests.
00:00:42.720 No questions asked.
00:00:47.460 Welcome to the Making Sense Podcast.
00:00:49.780 This is Sam Harris.
00:00:52.020 Okay, very brief housekeeping today.
00:00:55.700 Just a couple of announcements.
00:00:57.760 First, I will be doing another Zoom call for subscribers, and that will be on October 7th.
00:01:07.400 I'm not sure if that's going to be an open-ended Q&A, or whether the questions will be focused
00:01:12.620 on a theme.
00:01:14.160 I'll decide that in the next few days.
00:01:15.800 But anyway, the last one was fun, and hopefully the fun will continue.
00:01:20.760 So I will see you on October 7th, and you should be on my mailing list if you want those details.
00:01:27.600 Also, there are a few exciting changes happening over on the Waking Up side of things.
00:01:32.800 So pay attention over there if you're an app user.
00:01:37.760 And I think that's it.
00:01:41.060 Okay.
00:01:42.660 Well, today I'm speaking with Tristan Harris.
00:01:45.840 Tristan has been on the podcast before, and he is one of the central figures in a new documentary,
00:01:52.440 which is available on Netflix now.
00:01:55.460 And that film is The Social Dilemma, which discusses the growing problem of social media
00:02:04.800 and the fracturing of society, which is our theme today.
00:02:10.420 So as you'll hear, I highly recommend that you watch this film.
00:02:14.820 But I think you'll also get a lot from this conversation.
00:02:17.660 I mean, if you're looking out at the world and wondering why things seem so crazy out there,
00:02:25.280 social media is very likely the reason.
00:02:28.480 Or it's the reason that is aggregating so many other reasons.
00:02:33.200 It's the reason why we can't converge on a shared understanding of what's happening so much of the time.
00:02:40.180 We can't agree about whether specific events attest to an epidemic of racism in our society
00:02:48.980 or whether these events are caused by some other derangement in our thinking
00:02:53.740 or just bad incentives or bad luck.
00:02:56.860 We can't agree about what's actually happening.
00:03:00.360 And amazingly, we are about to hold a presidential election that it seems our democracy might not even survive.
00:03:13.480 Really, it seems valid to worry whether we might be tipped into chaos by merely holding a presidential election.
00:03:22.740 It's fairly amazing that we are in this spot.
00:03:26.220 And social media is largely the reason.
00:03:31.860 It's not entirely the reason.
00:03:33.740 A lot of this falls on Trump.
00:03:35.740 Some of it falls on the far left.
00:03:38.560 But the fact that we can't stay sane as a society right now,
00:03:46.240 that is largely due to the fact that we are simply drowning in misinformation.
00:03:52.280 Anyway, that is the topic of today's conversation.
00:03:58.240 And I was very happy to get Tristan back on the podcast.
00:04:02.440 Apologies for the uneven sound.
00:04:06.060 Pre-COVID, we were bringing everyone into studios where they could be professionally recorded.
00:04:12.460 Now we're shipping people Zoom devices and microphones.
00:04:17.280 But occasionally, the technology fails.
00:04:22.180 And we have to rely on the Skype signal.
00:04:24.880 So, what you're hearing today is Skype.
00:04:27.400 It's actually pretty good for Skype.
00:04:29.880 But apologies if any of the audio sounds subpar.
00:04:33.780 And now I bring you Tristan Harris.
00:04:41.860 I am here with Tristan Harris.
00:04:44.300 Tristan, it's great to get you back on the podcast.
00:04:46.000 It's really good to be back, Sam.
00:04:48.240 It's been a while since the first time I was on here.
00:04:50.460 Yeah.
00:04:51.260 We will cover similar ground.
00:04:53.420 But a lot has happened since we last spoke.
00:04:56.420 And, to my eye, everything has gotten worse.
00:05:00.980 So, there's more damage to analyze and try to prevent in the future.
00:05:07.000 But before we jump in, remind people who you are and how you come at these things.
00:05:12.020 What's your brief bio that's relevant to this conversation?
00:05:16.480 Yeah.
00:05:17.340 Well, just to say briefly, I guess one of the reasons why we're talking now and most relevant
00:05:21.440 to my recent biography is the new Netflix documentary that just came out called The Social Dilemma.
00:05:27.480 Yeah.
00:05:27.700 You know, in which all these technology insiders are speaking about the Frankenstein that they've created.
00:05:32.440 We'll get into that later.
00:05:33.620 Prior to that, I was a design ethicist at Google, coming in through Google's acquisition of a technology company I had started called Apture.
00:05:41.620 And after being at the company for a little while, I migrated into a role of thinking about how you ethically steer 2 billion people's attention when you hold the collective human psyche in your hands.
00:05:52.240 And then prior to that, as is also discussed in the film, I was at Stanford studying computer science and human-computer interaction,
00:05:58.700 specifically at a lab called the Persuasive Technology Lab, which I'm sure we'll get into, and which relates to a sort of lifelong interest in how the human mind is vulnerable to psychological influence.
00:06:10.900 And I've had a fascination with those topics, from cults to sleight-of-hand magic to mentalism and heroes like Derren Brown, who's a mutual friend of ours, and how that plays into the things that we're seeing with technology.
00:06:22.180 Yeah, so I just want to reiterate that this film, The Social Dilemma, is on Netflix now, and yeah, that's the proximate cause of this conversation.
00:06:30.940 And it really is, it's great.
00:06:32.900 It really covers the issue in a compelling way.
00:06:36.940 So I highly recommend people go see that.
00:06:39.860 They don't have to go anywhere, obviously.
00:06:41.360 Just open Netflix, and there's no irony there.
00:06:54.240 I would count Netflix as, I'm sure, an offender in some way, but their business model really is distinct from much of what we're going to talk about.
00:07:06.440 I mean, they're clearly gaming people's attention, because they want to reduce churn and they want people on the platform, deriving as much value from the platform as possible.
00:07:16.680 But there is something different going on over there with respect to not being part of the ad economy and the attention economy in quite the same way.
00:07:16.680 That's a distinction we could draw later on.
00:07:18.860 But is there a bright line between proper subscription services like that and what we're going to talk about?
00:07:27.040 Yeah, I mean, I think the core question we're here to talk about is in which ways, and where, technology's incentives are aligned with the public good.
00:07:34.860 And I think the problem that brings us here today is where technology's incentives are misaligned with the public good through the business model of advertising and through models like user-generated content.
00:07:45.740 Clearly, because we live in a finite attention economy where there's only so much human attention, we are managing a commons, a collective environment.
00:07:53.920 And Netflix, like any other actor, including politicians, including conferences, including you or me or this podcast or my podcast, is competing for the same finite resource.
00:08:05.460 And so there's a difference, I think, in how different business models engage in an attention economy. When the things you produce are going to reach exponential numbers of people, exponential broadcasts in the case of Netflix but also in the case of these other companies, there's a difference when a sense of ethics or responsibility or privacy or controls for children is added into that equation.
00:08:29.360 And I'm sure we'll get more into those topics.
00:08:32.020 Right. Okay, so let's take it from the top here.
00:08:36.200 What's wrong with social media at this point?
00:08:39.540 If you could boil it down to the elevator pitch answer, what is the problem that we're going to unspool over the next hour or so?
00:08:50.400 Well, it's funny because the film actually opens with that prompt, the blank stares of many technology insiders, including myself, because I think it's so hard to define exactly what this problem is.
00:09:00.280 There's clearly a problem of incentives, but beneath that, there's a problem of what those incentives are doing and where the exact harms show up.
00:09:08.460 And the way that we frame it in the film and in a big presentation we gave at the SF Jazz Center back in April 2019 to a bunch of the top technologists and people in the industry was to say that while we've all been looking out for the moment when AI would overwhelm human strengths and when we would get the singularity, when would AI take our jobs?
00:09:27.460 When would it be smarter than humans?
00:09:29.300 We missed this much, much earlier point when technology didn't overwhelm human strengths, but it undermined human weaknesses.
00:09:36.340 And you can actually frame the cacophony of grievances and scandals and problems that we've seen in the tech industry from distraction to addiction to polarization to bullying to harassment to the breakdown of truth, all in terms of progressively hacking more and more of human vulnerabilities and weaknesses.
00:09:54.540 So if we take it from the top, you know, our brain's short-term memory system has seven plus or minus two things that we can hold.
00:10:02.040 When technology starts to overwhelm our short-term and working memory, we feel that as a problem called distraction.
00:10:08.080 Oh my gosh, I can't remember what I was doing.
00:10:09.760 I came here to open an email.
00:10:10.900 I came here to go to Facebook to look something up, but now I got sucked down into something else.
00:10:14.260 That's a problem of overwhelming the human limit and weakness of just our working memory.
00:10:18.780 When it overwhelms our dopamine systems and our reward systems, that we feel that as a problem called addiction.
00:10:25.980 When it taps into and exploits our reliance on stopping cues that at some point I will stop talking and that's a cue for you to keep going.
00:10:33.640 When technology doesn't stop talking and it just gives you the infinite bottomless bowl, we feel that as a problem called addiction or addictive use.
00:10:39.820 When technology exploits our need for social approval by giving us more and more of it, we feel that as a problem called teen depression, because suddenly children are dosed with social approval every few minutes and are hungry for more likes, comparing themselves in terms of the currency of likes.
00:10:55.360 And when technology hacks the limits of our heuristics for determining what is true, for example, that that Twitter profile who just commented on your tweet five seconds ago, that photo looked pretty real.
00:11:05.480 They've got a bio that seems pretty real.
00:11:06.980 They've got 10,000 followers.
00:11:08.500 We only have a few cues that we can use to discern what is real and bots and deepfakes, and I'm sure we'll get into GPT-3, actually overwhelm that human weakness.
00:11:17.820 So we don't even know what's true.
00:11:19.500 So I think the main thing that we really want people to get is through a series of misaligned incentives, which we'll further get into, technology has overwhelmed and undermined human weaknesses.
00:11:29.620 And many of the problems that we're seeing as separate are actually the same.
00:11:33.020 And just one more thing on this analogy: collectively, there's this digital fallout of addiction, teen depression, suicides, polarization, the breakdown of truth.
00:11:42.120 We think of this as a collective digital fallout, or a kind of climate change of culture. Much like the extractive oil economy, we have been living in an extractive race for attention, and there's only so much of it; when it starts running out,
00:11:55.620 we have to start fracking your attention by splitting it into multiple streams.
00:11:59.100 I want you watching an iPad and a phone and the television at the same time because that lets me triple the size of the attention economy.
00:12:05.860 But that extractive race for attention creates this global climate change of culture.
00:12:09.960 And much like climate change, it happens slowly, it happens gradually, it happens chronically.
00:12:15.060 It's not this sudden immediate threat.
00:12:16.780 It's this slow erosion of the social fabric.
00:12:19.700 And that collectively we called in that presentation human downgrading, but you can call it whatever you want.
00:12:24.040 The point is that, you know, if you think back to the climate change movement, before there was a cohesive understanding linking emissions to climate change,
00:12:33.440 we had some people working on polar bears, some people working on the coral reefs, some people working on species loss in the Amazon.
00:12:40.460 And it wasn't until we had an encompassing view of how all these problems get worse that we started to get change.
00:12:46.220 And so we're really hoping that this film can act as a kind of catalyst for a global response to this really destructive thing that's happened to society.
00:12:54.400 Okay, so let me play devil's advocate for a moment using some of the elements you've already put into play,
00:13:01.960 because you and I are going to impressively agree throughout this conversation on the nature of the problem.
00:13:07.100 But I'm channeling a skeptic here, and it's actually not that hard for me to empathize with a skeptic,
00:13:14.740 because as you point out, it really takes a fair amount of work to pry the scales from people's eyes on this point.
00:13:22.720 And the nature of the problem, though it really is everywhere to be seen, is surprisingly elusive, right?
00:13:29.940 So if you reference something like, you know, a spike in teen depression and self-harm and suicide,
00:13:38.460 there's no one who's going to pretend not to care about that.
00:13:42.740 And then it really is just the question of, you know, what's the causality here?
00:13:45.780 And is it really a matter of exposure to social media that is driving it?
00:13:49.920 And I don't think people are especially skeptical of that, and that's a discrete problem that I think most people would easily understand and be concerned about.
00:13:59.500 But the more general problem for all of us is harder to keep in view.
00:14:06.020 And so when you talk about things, again, these are things you've already conceded in a way.
00:14:11.040 So attention has been a finite resource always, and everyone has always been competing for it.
00:14:18.460 So if you're going to publish a book, you are part of this race for people's attention.
00:14:23.260 If you were going to release something on the radio or television, it was always a matter of trying to grab people's attention.
00:14:29.860 And as you say, we're trying to do it right now with this podcast.
00:14:31.980 So when considered through that lens, it's hard to see what is fundamentally new here, right?
00:14:40.940 So yes, this is zero-sum.
00:14:43.540 And then the question is, is it good content or not?
00:14:46.640 I think people want to say, right?
00:14:48.960 This is just a matter of interfacing in some way with human desire and human curiosity.
00:14:55.800 And you're either doing that successfully or not.
00:14:58.960 And what's so bad about really succeeding, you know, just fundamentally succeeding in a way that,
00:15:05.180 yeah, I mean, you can call it addiction, but really it's just what people find captivating.
00:15:09.520 It's what people want to do.
00:15:10.760 They want to grant their attention to the next video that is absolutely enthralling.
00:15:16.700 But how is that different from, you know, leafing through the pages of, you know,
00:15:20.740 a hard copy of Vanity Fair in the year 1987 and feeling that you really want to read the next article
00:15:28.580 rather than work or do whatever else you thought you were going to do with your afternoon?
00:15:33.020 So there's that.
00:15:34.220 And then there's the fact that advertising is involved, and really is the foundation
00:15:43.680 of everything we're going to talk about, so what's so bad about that?
00:15:46.460 So really, it's a story of ads just getting better.
00:15:51.500 You know, I don't have to see ads for Tampax anymore, right?
00:15:55.120 I go online and I see ads for things that I probably want or nearly want because I abandoned
00:16:02.120 them in my Zappos shopping cart, right?
00:16:04.440 So what's wrong with that?
00:16:05.940 And I think most people are stuck in that place.
00:16:09.860 Like, we have to do a lot of work to bring them into the place in the conversation
00:16:14.020 where the emergency becomes salient.
00:16:17.820 And so let's start there.
00:16:19.940 Gosh, there's so much good stuff to unpack here.
00:16:22.160 So on the attention economy, obviously, we've always had it.
00:16:25.520 We've had television competing for attention, radio, and we've had evolutions of the attention
00:16:29.260 economy before competition between books, competition between newspapers, competition
00:16:32.960 between television to more engaging television to more channels of television.
00:16:37.000 So in many ways, this isn't new.
00:16:38.800 But I think what we really need to look at is what was mediating, where that attention
00:16:43.500 went to.
00:16:44.160 Mediating is a big word.
00:16:45.660 Smartphones, we check our smartphones, you know, 100 times or something like that per day.
00:16:50.700 They are intimately woven into the fabric of our daily lives, and ever more so because,
00:16:55.420 if we've pre-established addiction, or just this addictive checking that we have, then in any
00:16:59.240 moment of anxiety, we turn to our phone to look at it.
00:17:01.820 So it's intimately woven into where the attention starting place will come from.
00:17:05.960 It's also taken over our fundamental infrastructure for our basic verbs.
00:17:11.880 Like if I want to talk to you or talk to someone else, my phone has become the primary
00:17:15.720 vehicle for many, many verbs in my life, whether it's ordering food or speaking
00:17:20.900 to someone or, you know, figuring out where to go on a map.
00:17:24.720 We are increasingly reliant on the central node of our smartphone to be a router for where
00:17:30.340 all of our attention goes.
00:17:31.340 So that's the first part: this intimately woven nature, and the fact that
00:17:35.980 it's part of the social infrastructure we rely on.
00:17:38.820 We can't avoid it.
00:17:39.860 And part of what makes technology today inhumane is that we're reliant on infrastructure that's
00:17:44.140 not safe, or contaminated, for many reasons that we'll get into later.
00:17:47.680 A second reason this is different is the degree of asymmetry between, let's say, that newspaper
00:17:53.000 editor or journalist who is writing that enticing article to get you to turn to the next page,
00:17:57.380 versus when you watch a YouTube video and you think, yeah, this
00:18:01.540 time I'm just going to watch one video and then I've got to go back to work.
00:18:04.340 And you wake up from a trance, you know, two hours later and you say, man, what happened
00:18:08.680 to me?
00:18:08.940 I should have had more self-control.
00:18:11.000 What that misses is that there's literally, you know, Google's billions of dollars
00:18:15.260 of supercomputing infrastructure on the other side of that slab of glass in your hand, pointed
00:18:20.140 at your brain doing predictive analytics on what would be the perfect next video to keep
00:18:25.560 you here.
00:18:25.860 And the same is true on Facebook.
00:18:27.180 You think, OK, I've sort of been scrolling through this thing for a while, but I'm just
00:18:29.980 going to swipe up one more time and then I'm done.
00:18:33.120 Each time you swipe up with your finger, you know, you're activating a Twitter or a Facebook
00:18:37.720 or a TikTok supercomputer that's doing predictive analytics, which has billions of data points
00:18:42.800 on exactly the thing that'll keep you here.
00:18:44.920 And I think it's important to expand this metaphor in a way that you've talked about,
00:18:49.020 I think, on your show before, about just the increasing power and computational power
00:18:52.800 of AI.
00:18:53.480 When you think about a supercomputer pointed at your brain trying to figure out what's the
00:18:58.080 perfect next thing to show you, that's on one side of the screen.
00:19:00.960 On the other side of the screen is my prefrontal cortex, which evolved millions of years
00:19:04.120 ago and is doing the best job it can at goal articulation, goal retention and memory, and sort
00:19:09.400 of staying on task, self-discipline, et cetera.
00:19:11.240 So who's going to win in that battle?
00:19:13.880 Well, a good metaphor for this is, let's say you or I were to play Gary Kasparov at chess.
00:19:19.360 Like, why would you or I lose?
00:19:21.520 It's because, you know, there I am on the chessboard and I'm thinking, OK, if I do this,
00:19:25.260 he'll do this.
00:19:25.800 But if I do this, he'll do this.
00:19:27.200 And I'm playing out a few new moves ahead on the chessboard.
00:19:30.000 But when Garry looks at that same chessboard, he's playing out a million more moves ahead
00:19:34.660 than I can.
00:19:35.420 Right.
00:19:35.600 And that's why Garry is going to win and beat you and me every single time.
00:19:39.520 But when Garry, the human, is playing chess against the best supercomputer in the world,
00:19:43.900 no matter how many million moves ahead Garry can see, the supercomputer can see billions
00:19:48.920 of moves ahead.
00:19:50.220 And when it beats Garry, who is the best human chess player of all time, it's beaten
00:19:54.820 the human brain at chess, because that was kind of the best one that we had.
00:19:58.300 And so when you look at the degree of asymmetry that we now have, when you're sitting there
00:20:02.560 innocuously saying, OK, I'm just going to watch one video.
00:20:05.420 And then I'm out.
00:20:06.580 We have to recognize that we have an exponential degree of asymmetry and they know us and our
00:20:11.660 weaknesses better than we know ourselves, to borrow also from a mutual friend, Yuval
00:20:15.500 Harari.
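To make Tristan's point about predictive analytics concrete, here is a minimal, purely illustrative sketch of the kind of "pick the next video" logic he is describing: a model scores candidate videos by predicted watch time, and the feed serves whichever scores highest. The video names, scores, and structure are invented for illustration and are not drawn from any real platform's code.

    # Illustrative sketch: rank candidate videos by a model's predicted engagement
    # and serve the argmax. All names and numbers below are hypothetical.
    from dataclasses import dataclass

    @dataclass
    class Candidate:
        video_id: str
        predicted_watch_minutes: float  # output of a trained engagement model

    def pick_next(candidates):
        # The objective is engagement, not the viewer's stated goal,
        # so the highest predicted watch time wins every round.
        return max(candidates, key=lambda c: c.predicted_watch_minutes)

    feed = [
        Candidate("calm-explainer", 3.2),
        Candidate("outrage-clip", 11.7),
        Candidate("conspiracy-deep-dive", 14.5),
    ]
    print(pick_next(feed).video_id)  # prints "conspiracy-deep-dive"

The asymmetry Tristan describes comes from the fact that the scoring model is trained on billions of data points, while the viewer on the other side of the glass is relying on willpower alone.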
00:20:16.760 So I guess I still think the nature of the problem will seem debatable even at this point.
00:20:24.460 Because, again, you're talking about successfully gaining attention, making various forms of content
00:20:31.900 more captivating, stickier.
00:20:35.540 People are losing time, perhaps, that they didn't know they were going to give over to
00:20:41.620 their devices.
00:20:43.100 But they were doing that with their televisions anyway.
00:20:45.740 I mean, the statistics long before we had smartphones, the statistics on watching television
00:20:51.680 were appalling.
00:20:53.400 I forget what they were.
00:20:54.080 There was something like the average television was on seven hours a day in the home.
00:20:58.920 So the picture was of people in a kind of Aldous Huxley-like dystopia just plugged in to the
00:21:07.080 boob tube and being fed bad commercials and therefore being monetized in some way that
00:21:14.120 strikes people as not fundamentally different from what's happening now.
00:21:18.660 Yes, there was less to choose from, you know, there were three different types of laundry
00:21:24.200 detergent, and it was not a matter of a really fine-grained manipulation of people's behavior.
00:21:32.080 But it was still, if you wanted, from the perspective of what seems optimal, it still had a character
00:21:39.700 of propagandizing people, you know, with certain messages that seem less than optimal.
00:21:46.140 I'm sure you could talk about teens or just people in general having, you know, body dysmorphia
00:21:53.160 around ideal presentations of human beauty that were, you know, unrealistic, you know,
00:21:58.980 whether Photoshop was involved at that point or not.
00:22:01.320 I mean, it was just good lighting and good makeup and, you know, selection effects that
00:22:05.840 make people feel obliged to aspire to irrational standards of beauty.
00:22:10.960 All of these problems that we tend to reference in a conversation like this seemed present.
00:22:16.920 I think the thing that strikes me as fundamentally new, and this is brought out in the film by
00:22:25.780 several people, relates to the issue of misinformation and the siloing of information, which really
00:22:35.120 does strike me as genuinely new, and there are a few analogies here that I find especially arresting.
00:22:42.040 I mean, one thing that Jaron Lanier said, he says it in the film, and he said it on this podcast a year or so
00:22:49.140 ago, which I think frames it really well, is: just imagine if Wikipedia presented you with
00:22:57.660 information in a way that was completely dependent on your search history, all the data on you that had
00:23:04.280 been collected that shows your biases and your preferences and the ways in which your attention
00:23:08.600 can be gamed, so that when each of us went to Wikipedia, not only was there no guarantee that
00:23:15.040 we'd be seeing precisely the same facts, rather there was a guarantee that we wouldn't be, right?
00:23:20.840 That we're in this sort of, this shattered epistemology now, and we built this machine.
00:23:27.300 So the very machinery we're using to deliver information, what is almost
00:23:33.860 the only source of information for most people now, is a machine that is designed to partially
00:23:41.300 inform people, misinform people, spread conspiracy theories and lies faster than facts, spread outrage
00:23:49.320 faster than disinterested, nuanced analysis of stories.
00:23:54.840 So it's like we have designed an apparatus whose purpose is to fragment our worldview and
00:24:04.280 to make it impossible for us to fuse our cognitive horizons, so that if you and I start out in
00:24:09.620 a different place, we can never converge in the middle of this psychological experiment.
00:24:14.060 And that's the thing that, it strikes me, for which there is no analog in, you know,
00:24:19.560 all previous moments of culture.
00:24:22.180 Yeah, that's 100% right.
00:24:23.740 And I mean, if we jump to the chase about what is most concerning, it is the breakdown of a
00:24:28.680 shared reality and the breakdown, therefore, of our capacity to have conversations.
00:24:32.960 And, you know, you said it, that if we don't have conversation, we have violence.
00:24:36.520 And when you shatter the epistemic basis of how do we know what we know, and I've been
00:24:41.660 living literally in a different reality, a different Truman show, as Roger McNamee would
00:24:45.900 say, for the last 10 years, and we have to keep in mind, we're about 10 years into this
00:24:50.520 radicalization, polarization process, where each of us had been fed, you know, really a
00:24:55.960 more extreme view of reality for quite a long time, that what I really want people to do
00:24:59.680 isn't just to say, is technology addictive or these small questions?
00:25:02.560 It's really to rewind the tape and to ask, you know, how has my mind been fundamentally
00:25:07.200 warped?
00:25:07.800 And so just to go back to the points you made a second ago, you know, so what, you know,
00:25:12.020 YouTube is giving us information.
00:25:14.160 Well, first on that chess match I mentioned of, you know, are we going to win?
00:25:16.900 Are they going to win?
00:25:17.820 70% of the billion hours a day that people spend on YouTube is actually driven by the
00:25:22.840 recommendation system, by what the recommendation system is choosing for us.
00:25:26.400 Just imagine a TV channel where you're not choosing 70% of the time.
00:25:29.540 Then the question becomes, as you said, well, what is the default programming of that channel?
00:25:34.140 Is it, you know, Walter Cronkite and some kind of semi-reliable communal sensemaking, as
00:25:38.580 our friend Eric would say, or is it actually giving us more and more extreme views of reality?
00:25:42.840 So, three examples of this. Several years ago, if you were a teenager and looked at a diet
00:25:46.700 video on YouTube, several of the videos on the right-hand side would be thinspo
00:25:51.320 anorexia videos, because those things were better at keeping people's attention.
00:25:55.220 If you looked at, you know, 9/11 videos, it would give you Alex Jones,
00:26:00.600 InfoWars, 9/11 conspiracy theories.
00:26:02.960 YouTube recommended Alex Jones conspiracy theories 15 billion times in the right-hand sidebar,
00:26:09.060 which is more than the traffic of the New York Times, Fox News, MSNBC, the Guardian,
00:26:13.920 et cetera, combined.
00:26:15.340 So the scale of what has actually transpired here is so enormous that I think it's really
00:26:20.800 hard for people to get their heads around, because each of us only sees our own Truman
00:26:24.880 show.
00:26:25.220 So the fact that I'm saying these stats, you might say, well, I've never seen a dieting
00:26:28.440 video or anorexia video, or someone else might say, I've never seen those conspiracy theories.
00:26:32.280 It's because it fed you some different rabbit hole.
00:26:34.620 You know, Guillaume Chaslot, who's the YouTube recommendations engineer in the film, talks
00:26:38.720 about, in an interview we did with him on our podcast, how the algorithm found
00:26:43.240 out that he liked seeing these videos of plane landings.
00:26:45.920 And it's this weird, addictive corner of YouTube where people like to see plane landings or the
00:26:50.260 example of flat earth conspiracy theories, which were recommended hundreds of millions of times.
00:26:54.240 And, you know, because we've been doing this work, Sam, for such a long time, and I've
00:26:57.480 talked to so many people, you know, I hear from teachers and parents who say, you know,
00:27:00.920 suddenly all these kids are coming into my classroom and they're saying the Holocaust
00:27:03.840 didn't happen, or they're saying the earth is flat.
00:27:06.360 And it's like, where are they getting these ideas, especially in a time of coronavirus where
00:27:10.020 parents are forced to sit their kids in front of the new television, the new digital
00:27:13.500 pacifier, which is really just YouTube.
00:27:15.640 You know, they're basically at the whims of whatever that automated system is showing them.
00:27:20.180 And of course, the reason economically why this happened is because the only way that
00:27:25.000 you can broadcast to 3 billion people in every language is you don't pay any human editors,
00:27:29.620 right?
00:27:29.880 You take out all of those expensive people who sat at the, you know, New York Times or
00:27:34.840 Washington Post editorial department or PBS editorial department saying what's good for
00:27:38.520 kids in terms of Saturday morning or Sesame Street.
00:27:41.420 And you say, let's have a machine decide what's good for people.
00:27:44.300 And the machine cannot know the difference between what we'll watch versus what we actually
00:27:49.860 really want.
00:27:50.600 And the easiest example there is, if I'm driving down the 5 freeway in LA,
00:27:55.660 and my eyes go off to the side and I see a car crash, and everybody's eyes
00:28:00.140 go to the side and they look at the car crash, then according to YouTube the world must really want car crashes.
00:28:04.120 And the next thing you know, there's a self-reinforcing feedback loop: they feed us more
00:28:07.700 car crashes, and we keep looking at the car crashes.
00:28:09.640 So they feed us more and more.
00:28:10.500 That's exactly what's happened over the last 10 years with conspiracy theories.
00:28:14.960 And one of the best predictors of whether you will believe in a new conspiracy is whether
00:28:18.680 you already believe in one.
00:28:20.220 And YouTube and Facebook have never made it easier to sort of open the doorways into
00:28:25.020 a more paranoid style of thinking.
00:28:27.100 And just one last thing before handing it back is, you know, I think this is not to vilify
00:28:31.460 all conspiracy thinking.
00:28:32.700 You know, some conspiracies are real; some notion of, you know, what Epstein did,
00:28:37.400 you know, running a child sex ring, is all real.
00:28:40.500 But we need a more nuanced way to see this, because when you're put into a surround
00:28:44.460 sound rabbit hole where everything is a conspiracy theory, everything that's ever happened over
00:28:48.640 the last 50 years is part of some master plan.
00:28:50.820 And there's actually this secret cabal that controls everything.
00:28:53.780 And Bill Gates and 5G and coronavirus, you know, this is where the thing goes off the
00:28:58.280 rails.
00:28:58.540 And I think this really became apparent to people once they were stuck at home where you're
00:29:03.360 not actually going out into the world.
00:29:04.540 You're not talking to as many neighbors.
00:29:05.960 And so the primary meaning making and sense making system that we are using to navigate
00:29:10.480 reality are these social media products.
00:29:13.140 And I think that has exacerbated the kind of craziness we've seen, you know, over the
00:29:16.800 last six months.
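The "car crash" dynamic Tristan describes above can be shown with a tiny, hypothetical simulation: a system that serves content in proportion to past engagement, and upweights whatever gets watched, will drift toward the most attention-grabbing category even though nobody chose that outcome. The categories, weights, and watch probabilities here are invented for illustration only.

    # Hypothetical simulation of an engagement feedback loop: the system serves
    # items in proportion to learned weights and boosts whatever gets watched.
    import random

    random.seed(0)
    exposure = {"car_crashes": 1.0, "local_news": 1.0, "science": 1.0}
    watch_prob = {"car_crashes": 0.9, "local_news": 0.4, "science": 0.3}  # invented

    for _ in range(1000):
        # Serve a category in proportion to its current weight.
        item = random.choices(list(exposure), weights=list(exposure.values()))[0]
        # If the user watches, the system learns "people want this" and shows more.
        if random.random() < watch_prob[item]:
            exposure[item] *= 1.01

    total = sum(exposure.values())
    print({k: round(v / total, 2) for k, v in exposure.items()})
    # Car crashes end up dominating the served mix, without anyone deciding that.

The point of the sketch is that no editor ever chooses "more car crashes"; the loop of measuring what we watch and serving more of it produces that result on its own.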
00:29:18.660 Yeah.
00:29:18.780 Well, you're really talking about the formation of cults.
00:29:22.000 And I know you've thought a lot about cults, and what we have here is a kind of cult
00:29:28.940 factory or, you know, a cult-industrial complex that we have built inadvertently.
00:29:35.140 And again, what the inadvertence is, is really interesting because it does, it relates directly
00:29:41.740 to the business model.
00:29:43.260 It's because we have decided that the only way to pay for the internet or the primary way
00:29:49.420 to pay for the internet is with ads.
00:29:52.280 And when we'll get into the mechanics of this, that is the thing that has dictated everything
00:29:57.380 else we're talking about.
00:29:59.040 And it really is incredible to think about, because we have created
00:30:03.100 a system where indisputably some of the smartest people on earth,
00:30:08.980 some of our brightest minds, are using the most powerful technology we've ever built
00:30:16.880 not to cure cancer or mitigate climate change or respond to a very real and pressing problem
00:30:23.620 like a, an emerging pandemic.
00:30:25.780 They're spending their time trying to get better at gaming human attention more effectively
00:30:31.880 to sell random products and even random conspiracy theories, right?
00:30:37.600 In fact, they're doing all of this not merely in a mode of failing to address other real
00:30:45.060 problems, like mitigating climate change or responding to a pandemic.
00:30:49.060 The consequences of what they're doing are making it harder to respond to those real
00:30:53.680 problems.
00:30:54.220 I mean, we have, you know, climate change and pandemics are now impossible to talk about
00:30:59.160 as a result of what's happening on social media.
00:31:01.620 And this is a direct result of how social media is being paid for, or rather how it
00:31:09.240 has decided to make money.
00:31:11.740 And, you know, as you say, it's making it impossible for us to understand one another
00:31:17.820 because people are not seeing the same things.
00:31:21.660 I mean, like I, on a daily basis, have this experience of looking at people out in the world,
00:31:27.620 you know, on my own social media feed, or just reading news accounts of what somebody is
00:31:32.820 into.
00:31:33.280 I mean, let's say somebody is into QAnon, right?
00:31:35.340 And cult is not too strong a word, this cult of indeterminate size but massively well
00:31:42.640 subscribed at this point, of people who believe that not only is child sexual abuse a real problem
00:31:48.760 out there in the world, as more or less everyone believes, but they believe that there are uncountable
00:31:54.340 numbers of high profile, well-connected people, you know, from the Clintons on down who are part
00:32:00.780 of a cannibalistic cult of child sexual slavery, you know, where they extract the bodily essences
00:32:07.160 of children so as to prolong their lives, right?
00:32:09.580 I mean, it's just, it's as crazy as crazy gets.
00:32:12.180 And so when I, as someone who's outside this information stream, view this behavior, people
00:32:20.340 look frankly insane to me, right?
00:32:23.200 And some of these people have to be crazy, right?
00:32:25.320 This has to be acting like a bug light for crazy people, at least of some sort.
00:32:30.940 But most of the people are presumably normal people who are just drinking from a fire hose
00:32:37.400 of misinformation and just different information from the information I'm seeing.
00:32:42.340 And so their behavior is actually inexplicable to me.
00:32:46.420 And there's so many versions of this now.
00:32:49.020 I don't think it's too much to say that we're driving ourselves crazy.
00:32:52.660 We're creating a culture that is not compatible with basic sanity.
00:32:59.280 I mean, we're amplifying incommensurable delusions everywhere all at once.
00:33:05.860 And we've created a system where true information, you know, real facts and, you know, valid, you
00:33:12.580 know, skeptical analysis of what's going on isn't up to the task of dampening down the spread
00:33:20.100 of lies.
00:33:20.600 And I mean, maybe there's some other variable here that accounts for it.
00:33:24.520 But it's amazing to me how much of this is born simply of the choice of a business model.
00:33:33.260 Well, I think this is, to me, the most important aspect of what the film hopefully will do is
00:33:39.140 right now we're living in the shattered prism of a shared reality where we're each trapped
00:33:44.900 in a separate shard.
00:33:45.840 And like you said, when you look over at someone else and say, how can they believe those crazy
00:33:50.120 things?
00:33:50.520 How can they be so stupid?
00:33:51.900 Aren't they seeing the same information that I'm seeing?
00:33:54.580 And the answer is, they're not seeing the same information that you're seeing.
00:33:57.940 They've been living literally in a completely different feed of information than you have.
00:34:03.060 And that's actually one of the other, I think, psychological, not so much vulnerabilities, but we did not
00:34:07.500 evolve to assume that every person you would see physically around you would, inside of their
00:34:12.740 own mind, be actually living in a completely different virtual reality than the one that
00:34:16.780 you live in.
00:34:17.620 So nothing from an evolutionary perspective would enable us to have empathy with the fact
00:34:22.040 that each of us have our own little virtual reality in our own minds, and that each of
00:34:26.440 them could be so dramatically, not just a little bit, but so dramatically different.
00:34:30.500 Because another aspect you mentioned when you brought up cults at the beginning of what
00:34:33.440 you said was the power of groupthink and the power of an echo chamber, where, you know,
00:34:39.360 many of the things that are going on in conspiracy theory groups on
00:34:42.620 Facebook. I mean, the Plandemic video actually spread through a massive network of QAnon
00:34:47.280 groups.
00:34:47.820 There's actually been a capturing of the new spirituality and sort of psychedelics-type
00:34:52.960 community into the QAnon world, interestingly, which are now...
00:34:56.320 That's what these people need, acid.
00:34:57.580 Yeah, that doesn't sound like a good addition to an already mad world.
00:35:03.460 But I think if we zoom out, it's like, the question is, who's in control of human history
00:35:07.920 right now?
00:35:08.520 Are human beings authoring our own choices?
00:35:11.160 Or has the fact that we've ceded the information that feeds into three billion people's brains
00:35:16.440 meant that we have actually ceded control to machines, because the machines control the
00:35:20.840 information that all three billion of us are getting?
00:35:23.980 It's become the primary way that we make sense of the world.
00:35:26.100 And to jump ahead and mind read some of the skeptics out there, some people saying, well,
00:35:30.500 hold on a second, weren't there filter bubbles and narrow partisan echo chambers with Fox
00:35:34.820 News and MSNBC and people sticking with those channels?
00:35:37.920 Yes, that's true.
00:35:38.820 But I would ask people the question, where are the editorial departments of those television
00:35:42.940 channels getting their news from?
00:35:44.740 Well, they're just living on Twitter.
00:35:46.280 And Twitter's algorithms are recommending, again, that same partisan echo chamber back to
00:35:51.160 you.
00:35:51.280 You had Renee DiResta on your podcast, who's a dear friend and amazing colleague,
00:35:56.200 talking about how radicalization spreads on social media.
00:35:59.640 And she worked back in the State Department in 2015, where they noticed that if you followed
00:36:04.000 one ISIS terrorist on Twitter, the suggested user system would say, oh, there's suggested
00:36:08.780 people you might want to follow.
00:36:10.080 And it gives you 10 more suggested ISIS terrorists for you to follow.
00:36:13.620 Likewise, if you were a new mom, as she was several years ago, and you joined some new
00:36:17.940 mom groups, specifically groups for like making your own baby food, kind of a do-it-yourself
00:36:22.060 organic moms movement.
00:36:24.560 Well, Facebook's algorithm said, well, hold on, what are other suggested groups we might
00:36:28.260 show for you that tend to correlate with users in this mom group that keeps people really
00:36:32.240 engaged?
00:36:33.060 And one of the top recommendations was the anti-vaccine conspiracy theory groups.
00:36:36.840 And when you join one of those, it says, well, those groups tend to be also in these
00:36:40.340 QAnon groups and the chemtrails groups and the flat earth groups.
00:36:43.060 And so you see very quickly how these tiny little changes add up. As Jaron says
00:36:48.660 in the beginning of the film, you know, the business model is just changing your beliefs
00:36:52.120 and identity by 1%, and across the entire world, 1% is a lot.
00:36:56.640 It's like climate change, quite literally, right?
00:36:58.160 Where you only have to change the temperature a tiny bit and change the basis of what people
00:37:02.640 are believing.
00:37:03.540 And it changes the rest of reality.
00:37:05.520 Because as you know from confirmation bias, when you have a hammer, everything looks like
00:37:09.200 a nail and technology is laying the foundation of hammers that are looking for specific kinds
00:37:14.100 of nails.
00:37:14.700 Once you see the world through a paranoid, conspiratorial lens, you're looking for evidence
00:37:19.580 that confirms that belief.
00:37:20.880 And that's happening on all sides.
00:37:22.600 It's really a thing that's happened to all of us.
00:37:24.940 This is why my biggest hope really is in the global impact of the film, and this is not a
00:37:29.420 marketing push, it's really a social impact push.
00:37:31.860 I genuinely am concerned that there may be no other way to put Humpty Dumpty back together
00:37:36.780 again than to show the world what we have created, that we need a new shared reality
00:37:41.180 about the breakdown of our shared reality.
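Renee DiResta's "suggested groups" example, which Tristan recounts above, can be sketched in a few lines: if recommendations are driven purely by co-membership statistics, with no notion of whether the destination community is healthy, a user can be walked from a benign group toward fringe ones in a handful of clicks. The group names and overlap numbers below are invented to mirror the example, not real platform data.

    # Hypothetical sketch of co-membership-driven group suggestions.
    # Each entry maps a group to invented "members also joined" scores.
    co_membership = {
        "diy-baby-food": {"anti-vaccine": 0.4, "cloth-diapering": 0.3},
        "anti-vaccine":  {"qanon": 0.4, "chemtrails": 0.35, "flat-earth": 0.3},
        "qanon":         {"flat-earth": 0.5},
    }

    def suggest(group, k=2):
        # Recommend whatever most often co-occurs with the user's current group,
        # ranked only by overlap, never by quality or accuracy.
        neighbors = co_membership.get(group, {})
        return sorted(neighbors, key=neighbors.get, reverse=True)[:k]

    path, current = ["diy-baby-food"], "diy-baby-food"
    for _ in range(3):
        suggestions = suggest(current)
        if not suggestions:
            break
        current = suggestions[0]
        path.append(current)

    print(" -> ".join(path))  # diy-baby-food -> anti-vaccine -> qanon -> flat-earth

Each hop is locally reasonable by the engagement metric, and that is exactly the problem: the objective never asks where the chain of suggestions ends up.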
00:37:44.940 There are many aspects to the ad model.
00:37:47.400 And I think people can get it; it doesn't take much work to convince people, as I hope
00:37:54.400 we have begun to, that the shattering of shared reality is a problem.
00:38:01.400 It's at minimum a political problem.
00:38:03.600 I mean, whether it's a social problem for you, you know, out in the world or in your
00:38:08.600 primary relationships, to see the kind of hyper-partisanship we see now, and
00:38:14.980 just the inability to converge on an account of basic facts that could mitigate that partisanship.
00:38:21.600 I think people feel that that is a kind of assault on democracy.
00:38:25.580 And then when you add the piece that bad actors like the Russians or the Chinese or anyone can
00:38:33.460 decide to deliberately game that system, I mean, just the knowledge that, you know, Russia
00:38:37.380 is actively spreading, you know, Black Lives Matter information and pseudo-information so
00:38:43.700 as to heighten the anguish and polarization on that topic in America.
00:38:49.500 I mean, that just, the fact that we built the tools by which they can do that and they can
00:38:53.880 do it surreptitiously, right?
00:38:55.120 We don't see who's seeing these ads, right?
00:38:57.140 You don't see the 50,000 people who were targeted in a specific state for a
00:39:02.720 specific reason.
00:39:03.660 That is new and sinister.
00:39:06.000 And I think people can understand that.
00:39:08.700 But when we're talking about the problem of sharing information, or using our information
00:39:15.720 in these ways, I think we should get clear about what's happening here, because
00:39:19.880 this is a distinction several people make in the film.
00:39:23.740 It's not that these platforms sell our data, right?
00:39:28.020 They don't really sell our data.
00:39:29.840 They gather the data, they analyze the data and what they sell are more and more accurate
00:39:35.140 predictions of our behavior to advertisers.
00:39:38.000 Right.
00:39:38.780 And as that gets more refined, you really have something as close as we've ever come
00:39:45.440 to advertising being a kind of sure thing, right?
00:39:49.480 Where it really, you know, it really works.
00:39:52.500 And even there, I think most people won't necessarily care about that, because
00:39:59.400 if you tell them, listen, that the thing you really thought you wanted and went out and
00:40:05.640 bought, you were played by the company, the company placed an ad with Facebook and Facebook
00:40:11.100 delivered it to you because you were the perfect target of that ad.
00:40:15.460 I think the person can, at the end of the day, own all of that process and say, and just
00:40:22.040 subsume it with their satisfaction at having bought the thing they, they now actually want,
00:40:27.960 right?
00:40:28.100 Like, so yeah, I actually, but I want, I wanted a new Prius, right?
00:40:32.680 I mean, that's, it was time.
00:40:33.940 I needed a new car, right?
00:40:34.860 Like there's some, whether it's confabulatory or not, there's some way in which they don't
00:40:39.980 necessarily feel violated.
00:40:41.280 And I think people think they care about privacy, but we don't really seem
00:40:47.120 to care about privacy all that much.
00:40:48.860 I mean, we care about convenience and we care about money.
00:40:51.380 I mean, at bottom, nobody wants to pay for these things.
00:40:55.500 No one wants to pay for Facebook.
00:40:56.840 They don't want to pay for Twitter.
00:40:58.200 They don't want to pay for most of what happens on the internet.
00:41:01.220 And they're happy to be enrolled in this psychological experiment so that they don't have to pay for
00:41:07.220 anything.
00:41:07.760 And the dysfunction of all of that is what we're trying to get across here.
00:41:13.400 But I'm always amazed that when you focus on one part of it, other parts of this monstrosity begin
00:41:19.560 to disappear.
00:41:20.560 You know, it's like, it's very hard to keep what is wrong with this in view every moment
00:41:25.980 all at once.
00:41:26.780 And so maybe for the moment, let's just focus on, you know, information and privacy
00:41:31.900 and the ad model, and just how we should think about it.
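Before Tristan responds, the point Sam makes above about platforms selling predictions rather than raw data can be made concrete with a small, hypothetical sketch: the advertiser never sees the underlying behavioral data; it only buys delivery to whoever the platform's model predicts is most likely to respond. The model, features, and weights below are invented for illustration.

    # Hypothetical sketch: the product is the prediction, not the data.
    import math

    def predicted_response(user_features, weights, bias=-2.0):
        # Toy logistic model standing in for the platform's estimate of
        # P(response | user, ad); real systems are far more elaborate.
        z = bias + sum(weights.get(f, 0.0) * v for f, v in user_features.items())
        return 1 / (1 + math.exp(-z))

    ad_weights = {"searched_hybrid_cars": 1.8, "abandoned_cart": 2.6, "age_30s": 0.4}

    users = {
        "user_a": {"searched_hybrid_cars": 1, "age_30s": 1},
        "user_b": {"abandoned_cart": 1},
        "user_c": {},
    }

    # The advertiser only learns who gets the ad, never the behavior behind it.
    scores = {u: round(predicted_response(f, ad_weights), 3) for u, f in users.items()}
    print(scores, "->", max(scores, key=scores.get))

This is why the refinement Sam describes feels like a sure thing: as the predictions improve, the ad is shown only where it is already likely to work.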
00:41:36.660 Well, when we talk about the advertising model, you know, people tend to think about the good
00:41:45.260 faith uses, like you're talking about, you know, a Prius or a pair of shoes, which dismisses
00:41:50.020 the geopolitical World War III information warfare that's happening right now.
00:41:50.020 Because, you know, a line I say often is, you know, while we've been obsessed with protecting
00:41:54.560 our physical borders as a country, we've left the digital border wide open.
00:41:59.040 I mean, if Russia or China tried to fly a cruise missile or a bomber, you know, plane into the
00:42:03.280 United States, they'd be blasted out of the sky by the Pentagon.
00:42:05.860 But when they try to fly an information bomb into the United States in our virtual infrastructure
00:42:10.820 of Facebook, they're met by a white glove that says, yes, exactly which zip code and
00:42:14.940 which African-American sub-district would you like to target?
00:42:17.620 And that, that is the core problem.
00:42:20.560 We are completely unprotected when it comes to the virtual infrastructure.
00:42:24.260 So if you look at the roads and the air and the, you know, telephone lines
00:42:29.020 that we use here in this country, they're completely air-gapped from, you know, Russia or China.
00:42:34.500 But when most of the activity happening in our country happens in a virtual digital online
00:42:39.220 environment, you know, as Marc Andreessen says, software is eating the world, meaning software
00:42:43.140 and the digital world are consuming more and more of the physical world and the physical
00:42:47.260 ways that we used to get around and the physical conversations we used to have.
00:42:50.440 That digital environment is basically the big five tech companies.
00:42:54.140 It's all happening through the landscape of YouTube, TikTok, Facebook, et cetera.
00:42:59.140 And, you know, how does an empire fall?
00:43:01.120 You know, you use the power of an empire against itself.
00:43:04.020 You know, after World War II, you know, we had all these nukes and the big powers couldn't
00:43:07.540 do conventional wars with each other.
00:43:09.600 So they had to use subtler methods: plausible deniability, proxy wars, waging
00:43:13.880 economic warfare, diplomatic warfare.
00:43:16.040 But if you're Russia or Iran or Turkey, you know, and you don't want to see the U.S.
00:43:20.120 in a position of global dominance, would you do, you know, a forward facing attack on the
00:43:24.940 country with all the nukes?
00:43:26.140 You know, obviously not.
00:43:27.340 But would you take the already existing tensions of that country and turn the enemy against
00:43:31.520 itself?
00:43:31.920 That's what Sun Tzu would say to do.
00:43:33.920 You know, that's what Chinese military strategy would say to do.
00:43:35.800 And Facebook just makes that a trillion times easier.
00:43:39.260 So, you know, if I was China, I would want extreme right and extreme left groups to proliferate
00:43:43.260 and fight each other.
00:43:44.340 And, you know, we know that this is basically happening and this has been stoking up groups
00:43:48.680 on all sides.
00:43:49.600 You know, I can go into your country and create an army of bots that look indistinguishable
00:43:53.760 from regular people.
00:43:55.000 If I'm China, I'm running TikTok and I can, you know, manipulate the political discourse in
00:43:58.680 your country with the fact that I have 300 million Americans, you know, on my service.
00:44:02.200 It might even be bigger than that, if I'm remembering correctly.
00:44:04.240 So I think, you know, the point about the advertising model isn't just that it enables these good-faith
00:44:08.820 uses.
00:44:09.180 I think people have to recognize the amount of manipulated and deceptive activities that
00:44:13.340 are almost, like you said, untraceable.
00:44:15.420 I mean, the fact that I'm saying all this to you and the listeners out there would sound
00:44:18.660 like a conspiracy theory until you know the researchers who are tracking these things.
00:44:22.100 Because you're just looking at your own feed. I'm living in California.
00:44:26.480 I'm not actually part of a targeted group.
00:44:28.580 So I don't really see these things.
00:44:30.000 And anybody who is targeted is actually invisible to me.
00:44:33.360 So again, our psychological vulnerabilities here, technology is not allowing us to empathize
00:44:37.920 with people who are closest to being harmed by these systems.
00:44:42.380 Yeah.
00:44:42.480 Okay.
00:44:42.640 So I think people can get the central fear here, which is that it seems at best difficult,
00:44:51.080 more likely impossible, to run a healthy democracy on bad information.
00:45:05.140 I mean, even if we can do it for a few years, we probably can't do it for a century.
00:45:05.140 Something has to change here.
00:45:06.560 We can't be feeding everyone lies or half-truths, different lies and different half-truths, all
00:45:14.000 at once, 24 hours a day, year after year, and hope to have a healthy society, right?
00:45:21.120 So that's a discernible piece of this problem that I think virtually everyone will understand.
00:45:29.120 And then when you add the emotional valence of all these lies and half-truths, people
00:45:35.880 get that there's a problem amplifying outrage, right?
00:45:40.980 I mean, the fact that the thing that is most captivating to us is the feeling of in-group outrage
00:45:48.580 pointed outward toward the out-group for whom we have contempt growing into hatred.
00:45:55.820 That's the place we are so much of the time on social media.
00:46:00.120 That runs the gears of this machinery faster than any other emotion.
00:46:05.640 And, you know, if that changes tomorrow, if it turns out that, you know, sheer
00:46:10.580 terror is better than outrage, well, then the algorithm will find that and it'll be amplifying
00:46:17.680 terror.
00:46:18.180 But the thing that you have to be sure of is that it's contained in the very word, you
00:46:23.240 know, a dispassionate take on current events is never going to be the thing that gets this
00:46:29.940 machinery running hottest.
00:46:32.480 And so I think people can get that.
00:46:35.940 But when we talk about possible remedies for this problem, then I really think it's hard
00:46:42.500 to see a path forward.
00:46:44.000 So I mean, I think there are ways to come at this.
00:46:46.880 One is the distinct thing.
00:46:48.140 If you'd like to continue listening to this podcast, you'll need to subscribe at samharris.org.
00:46:54.980 You'll get access to all full length episodes of the Making Sense podcast and to other subscriber
00:47:00.060 only content, including bonus episodes and AMAs and the conversations I've been having
00:47:05.240 on the Waking Up app.
00:47:06.940 The Making Sense podcast is ad free and relies entirely on listener support.
00:47:11.200 And you can subscribe now at samharris.org.
00:47:13.740 Thank you.