Making Sense - Sam Harris - March 27, 2019


#152 — The Trouble with Facebook


Episode Stats

Length

50 minutes

Words per Minute

166

Word Count

8,318

Sentence Count

503

Misogynist Sentences

4

Hate Speech Sentences

4


Summary

Roger McNamee has been a Silicon Valley investor for 35 years. He has co-founded successful venture funds, including Elevation, where U2's Bono is one of his co-founders. He holds a BA from Yale and an MBA from the Tuck School of Business at Dartmouth. But of relevance today is that he was an advisor to Mark Zuckerberg very early on and helped recruit Sheryl Sandberg to Facebook. And he is now a very energetic critic of the company, and of many of these platforms more broadly: Google, Amazon, etc. We focus on Facebook in particular, but this conversation is a very deep look at all that is going wrong with digital media and how it is making it harder and harder to make sense to one another. It's a growing problem that I've discussed many times on the podcast, but today's episode is an unusually deep dive. I got connected to Roger through Tristan Harris, who has been dubbed the conscience of Silicon Valley, and Roger has also been allied with another past guest, Renee DiResta, who gave us a fairly harrowing tour of the Russian influence on our lives through social media and other hacking efforts.


Transcript

00:00:00.000 Welcome to the Making Sense Podcast.
00:00:08.820 This is Sam Harris.
00:00:10.880 Just a note to say that if you're hearing this, you are not currently on our subscriber
00:00:14.680 feed and will only be hearing the first part of this conversation.
00:00:18.420 In order to access full episodes of the Making Sense Podcast, you'll need to subscribe at
00:00:22.720 samharris.org.
00:00:24.060 There you'll find our private RSS feed to add to your favorite podcatcher, along with
00:00:28.360 other subscriber-only content.
00:00:30.620 We don't run ads on the podcast, and therefore it's made possible entirely through the support
00:00:34.640 of our subscribers.
00:00:35.880 So if you enjoy what we're doing here, please consider becoming one.
00:00:46.760 Welcome to the Making Sense Podcast.
00:00:49.040 This is Sam Harris.
00:00:51.260 Okay, very short housekeeping here.
00:00:54.140 Many things happening in the news.
00:00:55.860 The Mueller Report just came in.
00:00:59.520 I think I'll do a podcast on this when there's real clarity around it.
00:01:03.840 I'll get some suitable scholar on.
00:01:06.960 So I will defer that for the moment and just introduce today's guest.
00:01:13.200 Today I'm speaking with Roger McNamee.
00:01:16.640 Roger has been a Silicon Valley investor for 35 years.
00:01:19.980 He has co-founded successful venture funds, including Elevation, where he's partnered with
00:01:26.300 U2's Bono as a co-founder.
00:01:29.960 He holds a BA from Yale and an MBA from the Tuck Business School at Dartmouth.
00:01:35.880 But of relevance today is that he was an advisor to Mark Zuckerberg very early on and helped
00:01:42.580 recruit Sheryl Sandberg to Facebook.
00:01:44.880 And he is now a very energetic critic of the company and of many of these platforms, Google, Amazon, Facebook, etc.
00:01:55.000 We focus on Facebook in particular.
00:01:57.240 We talk about Google to some degree.
00:01:58.940 But this conversation is a very deep look at all that is going wrong with digital media
00:02:06.140 and how it is subverting democracy, making it harder and harder to make sense to one another.
00:02:13.300 It's a growing problem that I've discussed many times on the podcast, but today's episode is an unusually deep dive.
00:02:20.640 So now, without further delay, I bring you Roger McNamee.
00:02:31.520 I am here with Roger McNamee.
00:02:33.560 Roger, thanks for coming on the podcast.
00:02:35.640 Oh, Sam, what an honor to be here.
00:02:37.540 So I got connected to you through Tristan Harris, who's been on the podcast and who many people know
00:02:45.480 has been dubbed the conscience of Silicon Valley.
00:02:49.320 But I also realize another podcast guest who I also got through Tristan is another one of your partners in crime,
00:02:56.360 Renee DiResta, who gave us a fairly harrowing tour of the Russian influence into our lives through social media
00:03:05.020 and other hacking efforts.
00:03:07.260 So you know both of those people, and they really have been allied with you in your efforts to deal with the problem
00:03:16.160 that we're about to talk about, which is just what is happening on our social media platforms
00:03:22.020 with bad incentives and arguably unethical business models so as to all too reliably corrupt our public conversation
00:03:32.580 and very likely undermine our democracy.
00:03:36.240 So let's just start with your background in tech, and how is it that you come to have an opinion
00:03:42.340 and knowledge to back it up on this particular problem?
00:03:47.280 Yeah, so Sam, I began my career in the tech world professionally in 1982.
00:03:53.420 And when I was going back to college in 1978, I dropped out for a period of time.
00:03:58.920 My brother had given me a Texas Instruments Speak & Spell, you know, the toy for teaching kids how to spell.
00:04:04.560 And it was brand new that year, and he hands it to me in Christmastime 1978 and says,
00:04:11.160 you know, if they can make this thing talk with a display and keyboard,
00:04:16.100 you're going to be able to carry around all your personal information in a device you can hold in your hand.
00:04:21.080 And it probably won't take that long.
00:04:22.400 So this is one year after the Apple II, three years before the IBM PC, and I think roughly 17 or 18 years before the Palm Pilot.
00:04:30.340 He planted that seed in my head, and I couldn't get rid of it.
00:04:35.060 And I spent four years trying to figure out how to become an engineer, discovered I was just terrible at it.
00:04:40.740 And so I got a job being a research analyst covering the technology industry, and I arrived in Silicon Valley just before the personal computer industry started.
00:04:50.720 And that was one of those moments of just pure dumb luck that can make a career and a lifetime, and in my case, it did both.
00:05:00.420 So I start there in 1982.
00:05:03.280 I follow the technology industry for a long, long period of time.
00:05:06.480 And I do this, like Zelig, I just wound up in the right place at the right time, a lot of different moments.
00:05:12.400 Beginning in the mutual fund business in Baltimore at T. Rowe Price, but covering tech, traveling around with the computer industry as it formed.
00:05:20.200 Then starting a firm inside Kleiner Perkins Caufield & Byers, the venture capital firm, in 1991.
00:05:24.860 So I was actually in their office when the Internet thing happened.
00:05:30.160 So the day Marc Andreessen brought Netscape in, the day that Jeff Bezos brought in Amazon, the day that Larry and Sergey brought in Google, those were all things that I got to observe.
00:05:41.200 I wasn't the person who did them, but I was there when it happened.
00:05:44.480 And that was, if you're an analyst, that's a perfect situation.
00:05:48.180 And so in 2006, I had been in the business 24 years, and I get a phone call from the chief privacy officer at Facebook saying, my boss has got a crisis, and he needs to talk to somebody independent.
00:06:03.040 Can you help?
00:06:04.140 And so Mark came by my office that day, and he was 22.
00:06:07.920 The company was only two years old.
00:06:09.300 It's about a year after the end of the storyline from The Social Network.
00:06:13.460 The company is only available to high school students and college students with an authenticated email address, and there's no news feed yet.
00:06:23.460 It's really early on.
00:06:25.160 And he comes into my office, and I say to him, Mark, you and I don't know each other.
00:06:30.000 I'm 50.
00:06:30.840 You're 22.
00:06:32.880 I need to give you two minutes of context for why I'm taking this meeting.
00:06:36.840 And I said, if it hasn't already happened, either Microsoft or Yahoo is going to offer a billion dollars for Facebook.
00:06:46.940 Keep in mind, the company had had 9 million in revenues before that, so a billion was huge.
00:06:51.260 And I said, everybody you know, your mother and father, your board of directors, your management team, everybody's going to tell you to take the money.
00:06:57.660 They're all going to tell you you can do it again, that with 650 million bucks at age 22, you can change the world.
00:07:02.500 And I just want you to know, I believe that Facebook, because of authenticated identity and control of privacy, is going to be the first really successful social product, and that you can build a social network that will be more important than Google is today.
00:07:16.760 So keep in mind, it's 2006.
00:07:18.500 Google's already very successful, but obviously nothing like what it is today.
00:07:21.940 And I said, they will tell you you can do it again, but in my experience, nobody ever does.
00:07:27.780 And so I just want you to know, I think what you have here is unique, it's cool, and I hope you'll pursue it.
00:07:33.180 What followed Sam was the most painful five minutes of my entire life.
00:07:38.800 You have to imagine a room that is totally soundproofed because it was a video game lounge inside our office.
00:07:45.940 And this young man is sitting there pantomiming thinker poses.
00:07:49.940 At the first one-minute mark, I'm thinking, wow, he's really listening.
00:07:54.880 This is like, you know, he's showing me respect.
00:07:58.500 The two-minute mark, I'm going, this is really weird.
00:08:01.880 At three minutes, I'm starting to dig holes in the furniture.
00:08:06.420 At four minutes, I'm literally ready to scream.
00:08:08.500 And then finally, he relaxes.
00:08:10.200 And he goes, you won't believe this, but I'm here because the thing you just described, that's what just happened.
00:08:15.660 That is why I'm here.
00:08:16.820 And so that began a three-year period where somehow I was one of his advisors.
00:08:23.380 And my experience with him, Sam, was, it was fantastic.
00:08:26.960 He was the perfect mentee in the sense that he reached out to me on issues where he was open to ideas.
00:08:33.420 He always followed through.
00:08:35.300 I never saw any of the antisocial behavior that was in the movie.
00:08:39.340 You know, I didn't have a social relationship with him.
00:08:42.040 It was purely business.
00:08:43.880 And, but for three years, it was really rich.
00:08:46.700 And I saw him almost every week.
00:08:49.800 And the key thing that I did in addition to helping him get through the first problem, because he didn't want to sell the company when he came into my office.
00:08:57.340 But he was really afraid of disappointing everybody.
00:08:59.300 And I helped him figure out how to do that.
00:09:01.940 And then he needed to switch out his management team.
00:09:04.420 So I helped him do that.
00:09:05.420 And the key person I helped bring in was Sheryl Sandberg.
00:09:08.760 And so you have to imagine the context for this thing is I'm a lifelong technology optimist.
00:09:14.340 And I grew up in the era, I'm the same age as Bill Gates and Steve Jobs.
00:09:19.500 So I grew up in the era where technology was something that made people's lives better and that we were all committed to changing the world in kind of a sort of hippie libertarian value system.
00:09:31.580 And Mark appeared to me to be different from the other entrepreneurs.
00:09:35.800 You know, I was not a fan of the PayPal mafia's approach.
00:09:38.980 And I had consciously turned down some things where I really was philosophically out of line with the management teams.
00:09:46.780 And, you know, I look at Peter Thiel and Elon Musk and Reid Hoffman as incredibly brilliant people who had ideas that transformed tech and transformed the world.
00:10:00.740 But philosophically, I come from a different place.
00:10:03.500 And so I wasn't so comfortable with them.
00:10:06.700 But Mark seemed to be different.
00:10:08.140 And Sheryl, I thought, was different.
00:10:09.760 And, you know, so what wound up happening is I retired from the investment business because it turned out that I guess I'd gotten past my philosophical sell-by date,
00:10:20.140 that I was seeing too many businesses with strategies that I just couldn't sign up for, things that I knew would be successful, things like Uber and Spotify,
00:10:27.660 where, you know, they delivered a lot of value to the customer, but only by causing some harm to other people in the chain.
00:10:37.920 And I wasn't good with that.
00:10:40.260 And sadly, I wasn't paying close attention to Facebook.
00:10:43.100 I stopped being a mentor to Mark in 2009.
00:10:46.460 So I wasn't around when the business model formed in 2011, 12, and 13.
00:10:52.280 And I did a crappy analytical job.
00:10:55.480 I just missed the development of the persuasive technology and the manipulative actions that really came to dominate things.
00:11:06.880 So in 2016, I'm retired from the business.
00:11:11.020 I'm still a fanboy.
00:11:12.300 I really love Facebook.
00:11:13.540 But all of a sudden, I start to see a series of things that tell me there's something really wrong.
00:11:18.680 And that's what got me going.
00:11:21.000 So between January 2016 and October, I saw election issues in the Democratic primary and in Brexit,
00:11:29.040 where it was clear that Facebook had an influence that was really negative because it gave an advantage to inflammatory and hostile messages.
00:11:40.360 And then I saw civil rights violations, a corporation that used the Facebook ad tools to scrape data on anybody who expressed interest in Black Lives Matter.
00:11:49.840 And they sold that to police departments in violation of the Fourth Amendment.
00:11:53.280 And then Housing and Urban Development, the government agency cited Facebook for ad tools that allowed violations of the Fair Housing Act,
00:12:00.460 the very thing that Facebook just settled the civil litigation on in the past week.
00:12:06.740 And so you have civil rights violations.
00:12:09.120 You see election things.
00:12:10.200 And I'm freaked out.
00:12:11.520 And I write an op-ed for Recode.
00:12:14.740 And instead of publishing, I sent it to Mark Zuckerberg and Sheryl Sandberg on October 30th of 2016, so nine days before the election.
00:12:23.280 And it basically says, I'm really, really concerned that Facebook's business model and algorithms allow bad actors to harm innocent people.
00:12:35.960 It's a two-page, single-spaced essay.
00:12:39.460 It was meant to be an op-ed, so it's more emotional than I wish.
00:12:43.120 If I'd had a chance to do it again, I would have rewritten it for them.
00:12:45.800 But I wanted to get it in their hands because I was really afraid the company was the victim of essentially well-intended strategies producing unintended consequences.
00:12:57.360 And that's what led to it.
00:12:59.380 And they got right back to me.
00:13:01.080 Both of them did.
00:13:01.780 They were incredibly polite but also dismissive.
00:13:04.000 They treated it like a public relations problem.
00:13:05.800 But they hand me off to Dan Rose, who was one of the most senior people at Facebook and a good friend of mine.
00:13:12.500 And they said, well, Dan will work with you.
00:13:14.420 And he's just saying to me, Roger, we're a platform, right?
00:13:18.520 The law says we're not responsible for what third parties do because we're not a media company.
00:13:23.700 And so Dan and I talk numerous times.
00:13:26.740 And then the election happens.
00:13:28.140 And I just go completely ape.
00:13:31.740 And I'm literally the morning after the election, I'm screaming at him that the Russians have tipped the election using Facebook.
00:13:41.240 And he's going, no, no, we're cool because Section 230 of the Communications Decency Act says we're a platform.
00:13:47.620 We're not responsible for third parties.
00:13:48.980 I'm going, dude, you're in a trust business.
00:13:51.960 I mean, I'm an investor.
00:13:53.340 I'm your friend.
00:13:54.200 I'm not trying to be hostile here.
00:13:55.680 I'm trying to save you from, like, killing this business that you got to do what Johnson & Johnson did when that guy put poison in bottles of Tylenol in 1982 in Chicago, which is they took every bottle of Tylenol off the shelf until they could invent and deploy tamper-proof packaging.
00:14:12.780 They defended their customers.
00:14:14.880 Even though they didn't put the poison in, they weren't technically responsible.
00:14:18.000 And I thought Facebook could convert a potential disaster into a winning situation by opening up to the investigators and working with the people who used the product to understand what had happened.
00:14:32.240 And for three months, I begged them to do this.
00:14:35.880 And finally, I realized they were just never going to take it seriously.
00:14:40.100 And that's when I went looking for, you know, like, I didn't have any data.
00:14:44.440 I mean, Sam, you know how hard this is when you're talking to really, really smart technical people.
00:14:49.160 You got to have a lot of data.
00:14:50.440 And all I had was 35 years of spider sense.
00:14:53.720 And I went shopping for friends.
00:14:56.980 And that's when I met Tristan Harris.
00:14:58.580 And that changed everything because I was looking at this as an issue of civil rights and an issue of democracy.
00:15:05.520 And Tristan's on 60 Minutes and he's talking about brain hacking and the use of manipulative techniques, persuasive technology to manipulate attention and create habits that become addictions.
00:15:19.300 And then how that makes people vulnerable and how filter bubbles can be used to create enormous economic value, but at the same time, increase polarization and undermine democracy.
00:15:30.940 And I had a chance to interview him on Bloomberg a couple days after the 60 Minutes thing.
00:15:38.920 And I call him up immediately after the show's done and go, dude, do you need a wingman?
00:15:43.600 Because I'm convinced he's like the messiah of this thing.
00:15:46.940 He's the guy who gets it.
00:15:48.440 And I thought, well, maybe I can help him get the message out.
00:15:51.740 And so that's how we came together.
00:15:53.160 So that was April 2017.
00:15:56.060 And we literally both dropped everything we were doing and committed ourselves to seeing if we could stimulate a conversation.
00:16:03.540 And it was really clear we were going to focus on public health because I was certain that Tristan's idea was the root cause of the problem.
00:16:13.460 And so that's what we went out to do.
00:16:15.920 And the hilarious thing was, he may have told you this, but it began with going to the TED conference.
00:16:22.900 Eli Pariser, the man who identified filter bubbles and wrote the amazing book about that, got Tristan onto the schedule of the TED conference two weeks before the conference itself.
00:16:34.140 It was amazing what he did.
00:16:36.260 Actually, it was Chris Anderson got in touch with me having heard Tristan on this podcast a few weeks before the TED conference.
00:16:43.580 And that was also part of the story.
00:16:45.980 Oh, outstanding.
00:16:47.060 Well, thank you for that.
00:16:48.080 Okay.
00:16:48.320 So I did not know that piece of it.
00:16:50.340 No, it was super gratifying to see that effect because I wanted Tristan's voice amplified.
00:16:57.140 Okay.
00:16:57.360 Well, so then we owe it to you.
00:16:58.820 So I look at this as, so it may, that's really funny because it, then that's perfect.
00:17:04.920 That explains a lot of things.
00:17:06.240 So anyway, we go to the TED conference, right?
00:17:08.100 We're thinking there's a thousand people there.
00:17:10.100 We're going to make this thing a big story overnight, right?
00:17:13.200 We're going to solve this two weeks from the day we meet.
00:17:15.720 We go to TED, right?
00:17:16.860 He gives us an impassioned thing.
00:17:19.140 And you've seen the, you've seen the TED talk.
00:17:21.140 Yeah.
00:17:21.480 Yeah.
00:17:21.720 Yeah.
00:17:21.840 And, you know, we go around to collect business cards.
00:17:24.580 I think we came out of there with two.
00:17:26.560 Right.
00:17:26.820 You're, you're, you're talking to people whose job depends on not understanding what he's talking about.
00:17:31.580 Yeah.
00:17:31.780 Oh my God.
00:17:32.380 It's just, it's exactly right.
00:17:33.980 And so we're just like completely traumatized because we don't know anybody who's not in tech.
00:17:38.960 And, and that's when a miracle occurred.
00:17:43.060 So when Tristan was on 60 Minutes, the woman who did his makeup happened to be someone whose regular gig was doing the makeup for Arianna Huffington.
00:17:53.620 And she called up Arianna, for whom she'd worked for a decade, and said, Arianna, I've never asked you to do this, but you need to meet this young man.
00:18:02.560 And so she sets up for Tristan to meet Arianna.
00:18:07.620 So the two of us go to New York and Arianna takes Tristan under her wing, gets him onto Bill Maher and introduces him to a gazillion other people.
00:18:16.560 And, you know, so all of a sudden we go from not having any relationship at all.
00:18:20.620 And then this purely beautiful woman, Brenda from, who did, did Tristan's makeup, gets him on there.
00:18:27.460 And she recurs in the story throughout because she did his makeup on Bill Maher.
00:18:32.100 She did mine when I was on Bill Maher.
00:18:34.540 Yeah, mine too.
00:18:34.800 And it's like, you know, and it's like, you just, it's like you sit there and you go, you know, it's the butterfly's wings.
00:18:42.660 Yeah.
00:18:42.860 Right. And she was the butterfly. And while Tristan's meeting with Arianna for the first time, I get an email from Jonathan Taplin, who wrote the book, Move Fast and Break Things.
00:18:56.940 And Jonathan was a friend who had the first insight about the antitrust issues on Google, Facebook and Amazon and wrote a book about it in early 2017 that had really helped frame my ideas.
00:19:13.240 And he sends me a card for an aide to Senator Mark Warner.
00:19:18.780 And if you recall, in May of 2017, the only committee of Congress where the Democrats and Republicans were working together was the Senate Intelligence Committee, of which Mark Warner was the vice chair.
00:19:31.040 So to get a card for somebody who was policy aide to him was a huge deal.
00:19:38.100 And so I called him up and I said, have you guys, I know, I know your, your oversight mission is intelligence agencies, but is there anybody in Washington who's going to protect the 2018 elections from interference over social media?
00:19:52.080 And, you know, you know, it was clearly outside their jurisdiction.
00:19:57.500 Anyway, he brings us to Washington to meet Warner because he goes, you're right.
00:20:02.640 If it's not us, it's not going to happen.
00:20:04.680 So we've got to find some way to get to it.
00:20:06.280 You need to meet Warner.
00:20:07.360 And it took a couple of months to set up.
00:20:09.260 And in between, we get a contact from Senator Elizabeth Warren, who has a hypothesis about, about the tech group that is really profoundly insightful, where the question she asks is, isn't this basically the same problem as the banks had in 2008, that you have one side, the powerful banks in that case, had perfect information, and their clients only had the information the banks were willing to give them?
00:20:37.000 And she had this insight that Facebook and Google and, to a lesser extent, Amazon were doing exactly the same thing, that they were maintaining markets of perfect information on one side and starving the other side.
00:20:49.200 So they were essentially distorting capitalism, really undermining the notion of capitalism, which requires at least some uncertainty on both sides to have a market.
00:20:57.360 And, you know, using that in a monopolistic way, which, I mean, I was gobsmacked.
00:21:03.980 I've been in the investment business for 35 years.
00:21:06.340 I know a lot about antitrust.
00:21:08.120 I was a first party to the Microsoft antitrust case and to the AT&T breakup.
00:21:12.440 So I really got to watch both of those up close.
00:21:15.540 I'm a huge believer in using antitrust in tech.
00:21:18.860 And here is a senator who has this whole thing figured out in 2017.
00:21:25.100 And, you know, so that's the start of our day.
00:21:26.980 And then we go and meet Warner.
00:21:28.120 And Warner immediately gets the need to do something about to protect the elections.
00:21:32.600 And he goes, what should we do?
00:21:34.660 And Tristan, this is how genius he is, Tristan, without blinking an eye, goes, oh, we need to hold a hearing.
00:21:41.380 You need to make Zuckerberg explain why he isn't responsible for the outcome of the 2016 election.
00:21:47.680 Well, listen, I want to drill down there.
00:21:51.120 I want to fast forward at some point to those hearings because those hearings were, I think, famously unproductive, at least.
00:21:58.740 The public's perception of them is that.
00:22:01.880 But so let's articulate what the problem is here with the business model of Facebook in particular.
00:22:09.800 And this extends to Google.
00:22:11.820 I mean, I think Facebook has a uniquely culpable story here.
00:22:15.840 And the ethics around this are interesting because you knew these guys.
00:22:20.120 You knew Zuckerberg.
00:22:20.980 You knew Sandberg.
00:22:21.920 You had a reason to believe that they would appreciate their ethical obligations once this became evident that there was a problem.
00:22:30.500 And the problem, as I understand it, is this.
00:22:33.900 I should remind people that we're talking about your book, Zucked, which is about Facebook in particular.
00:22:42.320 But it covers the general footprint of this problem of bad incentives and a business model trafficking in user data.
00:22:51.440 And generically, the issue here is that misinformation spreads more effectively than facts.
00:22:58.200 The more lurid story is more clickable than the more nuanced one.
00:23:05.640 And you add to that the emotional component that outrage increases people's time on any social media site.
00:23:13.920 And this leads to an amplification of tribalism and partisanship and conspiracy theory.
00:23:23.100 And all of these things are more profitable than a healthy conversation about facts.
00:23:29.780 They simply are more profitable given the business model.
00:23:34.040 And one could have always said that this dynamic vitiates other media too.
00:23:39.600 I mean, this is true for newspapers.
00:23:41.140 It's true for television.
00:23:42.520 It's just true that if it bleeds, it leads on some level.
00:23:46.000 But this is integrated into Facebook's business model to an unusual degree.
00:23:51.880 And yet, to hear you tell the story of your advising of Zuckerberg and your—I don't think you said it here,
00:23:59.620 but it's in the book that you actually introduced Sandberg to him and facilitated that marriage.
00:24:05.420 That was at a time where the ethical problem of this business model wasn't so obvious, to hear you tell.
00:24:14.220 I mean, were they having to confront this back in 2007 or not?
00:24:18.960 Well, they were certainly not confronting it in any way that I was aware of.
00:24:23.460 To be clear, in the early days of Facebook, they had one objective only, which was to grow the audience.
00:24:29.460 There was really no effort made during the period I was engaged with them to build the business model.
00:24:35.940 Sheryl's arrival was about putting in place the things to create the business model, but there was a great deal of uncertainty.
00:24:42.400 In fact, Mark was initially very hesitant to hire Sheryl because he didn't believe that Google's model would apply or work at Facebook.
00:24:53.080 And it turned out he was correct about that.
00:24:54.620 So my perception of the model—I love the way you just described that.
00:25:00.560 You know, the thing that I always try to explain to people is that when you think about filter bubbles and you think about when it bleeds, it leads.
00:25:11.320 That whole notion's been with us for 150 years.
00:25:15.200 But before Google and Facebook, it was always in a broadcast model.
00:25:19.500 So when I was a kid, everybody my age saw the Kennedy funeral, the Beatles on Ed Sullivan, and the moon landing.
00:25:28.700 And we all saw it together.
00:25:30.100 And the filter bubble brought people together because we had a shared set of facts.
00:25:35.660 And the complaint at the time was conformity, right?
00:25:40.140 Because we all saw exactly the same thing.
00:25:42.840 With Facebook and Google, they create this world of, in Facebook's case, across all their platforms, 3 billion Truman shows where each person gets their own world, their own set of facts with constant reinforcement,
00:26:02.520 where they lure you onto the site with rewards, right, whether it's notifications or likes, to build a habit.
00:26:10.420 And for many people, that turns into an addiction.
00:26:13.240 I always ask people.
00:26:14.100 People say, oh, I'm not addicted.
00:26:15.120 And I go, okay, great.
00:26:16.360 When do you check your phone first thing in the morning?
00:26:18.660 Is it before you pee or while you're peeing?
00:26:22.060 Because everybody I know is one or the other.
00:26:24.500 And, you know, we're all addicted to some degree.
00:26:27.240 And then once you're coming back regularly, they have to keep you engaged.
00:26:31.120 And this is the stuff that was not happening until roughly 2011, which was this notion of, you know, before 2011, what they had to keep people engaged was Zynga, right?
00:26:41.700 They had social games.
00:26:44.280 That was the big driver of usage time before 2011.
00:26:48.140 But what they realized was that appealing to outrage and fear was much more successful than appealing to happiness because one person's joy is another person's jealousy.
00:26:59.940 Whereas if you're afraid or outraged, you share stuff in order to make other people also afraid or outraged because that just makes you feel better.
00:27:09.120 And Tristan had this whole thing figured out.
00:27:11.840 And, you know, we obviously shared that in Washington.
00:27:14.880 And that was, you know, an important stimulus.
00:27:18.080 But when I think about the problem, there's all that's one piece of it, which is the the manipulation of people's attention for profit and the natural divisiveness of using fear and outrage and filter bubbles that isolate people.
00:27:34.380 That, you know, if you start out anti-vax curious and they can get you into an anti-vax group, within a year you're going to be in the streets fighting vaccination.
00:27:46.960 It's just how it works.
00:27:48.620 That constant reinforcement makes your positions more rigid and makes them more extreme.
00:27:53.020 And we cannot help that.
00:27:55.500 It's about the fundamentals.
00:27:55.500 It's not a question of character or whatever.
00:27:58.140 It's about the most basic evolutionary wiring.
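To make the dynamic Roger describes concrete, consider how a feed ranked purely on predicted engagement behaves. Nothing in the objective needs to mention outrage; outrage wins simply because it reliably generates more reactions and shares than nuance does. The sketch below is purely illustrative: the post fields, weights, and numbers are invented for the example and are not any platform's actual ranking model.

```python
# Illustrative toy only: an engagement-ranked feed.
# Fields and weights are hypothetical, not any real platform's model.
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    predicted_clicks: float     # chance the user clicks or dwells
    predicted_reactions: float  # chance of a like, angry, or comment
    predicted_shares: float     # chance the user reshares it

def engagement_score(post: Post) -> float:
    # A pure engagement objective: no term asks whether the post is
    # accurate, civil, or good for the reader, only whether they react.
    return (1.0 * post.predicted_clicks
            + 2.0 * post.predicted_reactions
            + 3.0 * post.predicted_shares)

def rank_feed(posts: list[Post]) -> list[Post]:
    return sorted(posts, key=engagement_score, reverse=True)

feed = rank_feed([
    Post("Nuanced explainer on vaccine safety data", 0.20, 0.05, 0.01),
    Post("THEY are hiding what's in your kid's vaccine!", 0.35, 0.30, 0.20),
])
print([p.text for p in feed])  # the inflammatory post ranks first
```

Under that objective the inflammatory post ranks first even though the code never "chooses" outrage; that structural point is the one Tristan Harris, and Roger here, keep returning to.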
00:28:02.500 I just want to cover this ground again, not to be pedantic.
00:28:05.880 But I do have this sense that there are many people who are skeptical that this is really even a problem or that there's something fundamentally new about this.
00:28:14.880 So I just want to just cover a little bit of that ground again.
00:28:17.080 You've used this phrase filter bubble a bunch of times.
00:28:20.620 If I recall, that actually came from Eli Pariser's TED Talk, where many of many of us were first made aware of this problem.
00:28:28.960 He might have mentioned Facebook, but I remember him putting it in terms of Google searches, where if you do a Google search for vaccines and I do one, we are not going to get the same search results.
00:28:40.560 Your search history and all the other things you've done online are getting fed into an algorithm that is now dictating what Google decides to show you in any query.
00:28:53.780 And the problem here is that, and I think it was Tristan who, no, either Tristan or Jaron Lanier, you might correct me here.
00:29:01.900 One of them said, just imagine if when any one of us consulted Wikipedia, we got different facts, you know, however subtly curated to appeal to our proclivities on any topic we researched there.
00:29:16.980 And there could be no guarantee that you and I would be seeing the same facts.
00:29:20.540 That's essentially the situation we're in on social media and Google, and this is obviously the majority of anyone's consumption of information at this point.
00:29:33.080 Exactly. And so if we take that as one part of the problem, so when Eli first talked about filter bubbles, he used both Google and Facebook and showed these examples and how essentially these companies were pretending to be neutral when in fact they were not, and they were not honest about it.
00:29:56.020 So, you know, the Harvard scholar Shoshana Zuboff has a new book called The Age of Surveillance Capitalism, and there are some things in there where she spent a dozen years studying Google's business and gathering data about it.
00:30:15.440 And in my book, which I wrote at the same time she was writing hers, so I was totally unaware of her work, I hypothesize a bunch of things, and Shoshana has data, so she's like, in my opinion, a god.
00:30:26.580 But the core thing that Google did, and here's how the flow worked, because without this, what Facebook did would have been less harmful.
00:30:34.520 But when you talk about the people who are skeptical of harm, when you see the Google piece, then the two of them together make it really clear.
00:30:41.140 So Google begins like a traditional marketer, they have one product, it's 2002, the product is search, they're gathering data from their users in order to improve the quality of the product for those users, and they have an insight, which is that they only need a couple percent of the data they're gathering to improve the search engine.
00:30:58.980 So they decide to figure out, is there any signal in the other 98%, and the insight is traditionally, I think, credited to Hal Varian, an economist at Google, that there was in fact predictive signal.
00:31:18.800 So they could basically do behavioral prediction based on this stream of excess data that they were capturing from search results.
00:31:26.180 And the signal wasn't hugely strong, because it was just from search.
00:31:29.860 So they had the insight, we need to find out the identity of people.
00:31:33.920 And then they did something incredibly bold.
00:31:36.580 They create Gmail, which would have given them identity, which you could tie to the search queries, and therefore you'd know purchase intent and whose purchase intent it was.
00:31:46.560 But the really bold thing they did was they decided they were going to scan every message.
00:31:52.280 And they put ads into the Gmail, ostensibly to pay for it.
00:31:57.780 But I think that was actually just duck food.
00:31:59.760 This is a hypothesis of mine, that they knew people would squawk at the ads and force them to be removed.
00:32:05.400 But once they were removed, people would stop complaining, and Google would still be scanning all the messages.
00:32:10.700 So essentially, if you're looking for data for behavioral prediction, it's hard to get a better product than email for telling you what people are thinking.
00:32:21.460 And for whatever reason, people who signed up for Gmail went along with this.
00:32:26.880 So suddenly Google got this massive treasure trove of data about what people are going to do with their name and their search results to tie it to actual purchase intent.
00:32:36.460 Then they decide, well, we need to know where they are.
00:32:38.900 So they create Gmail.
00:32:40.320 Or sorry, they create Google Maps.
00:32:41.940 And so now they know where everybody is.
00:32:43.500 And then they realize, wait a minute, there's all these open spaces.
00:32:46.420 We can turn them into data and monetize them.
00:32:49.140 So they start driving cars up and down the street to create Street View.
00:32:54.560 And they do this without permission.
00:32:56.420 But nobody really pushes back very hard.
00:32:58.440 There are a few complaints.
00:32:59.460 Germany got very uppity about it.
00:33:01.040 And there was a big stink over there.
00:33:02.560 But in the U.S., people sort of went along with it.
00:33:04.540 And then they realize, well, wouldn't it be cool if we also took pictures of everybody's house from the top?
00:33:08.680 So they do satellite view.
00:33:10.480 And then they create Google Glass so they can get up close.
00:33:13.580 And that doesn't work.
00:33:14.560 People blow that up.
00:33:15.440 So the guy leaves, creates Niantic.
00:33:18.160 And so they do Pokemon Go.
00:33:20.280 And they do all the APIs.
00:33:21.800 So they get all this.
00:33:22.940 You know, people think they're playing a game, but they're really gathering data for Google.
00:33:26.620 And when you put all these pieces together, you realize, oh, my gosh, the business model initially was about improving the targeting of the ads.
00:33:33.620 But then they have the genius insight that with filter bubbles and with recommendation engines, they can take that market of behavioral prediction and increase the probability of a good outcome by steering people towards the outcomes that the predictions have suggested.
00:33:53.320 And so that's how they use filter bubbles.
00:33:55.680 That's how they use.
00:33:57.260 And so the way to think about it is if you're a marketer today, Google and Facebook have all of your customers behind a paywall.
00:34:07.760 But you can do this Faustian deal with these guys, which is you can get perfect information on these people as long as you're willing to do it on their terms.
00:34:17.740 Now, the other side of that trade, if you're a consumer, the data you're getting is coming primarily from Google or Facebook, right?
00:34:25.400 It's being controlled by them.
00:34:26.560 So if you take the emotional component of what Facebook has been doing and that whole thing with, you know, manipulation of attention and the notion of creating habits that become addictions and that inflaming of lizard brain emotions like outrage and fear and the use of disinformation and conspiracy theories to essentially get past people's civility.
00:34:52.600 Civility is a mask and you want to strip people of that and get to their underlying reality because that's where all the behavioral prediction value is.
00:35:01.220 And then you overlay onto that what Google was doing and you realize, oh, my God, these people have created digital avatars for each and every one of us.
00:35:10.060 And they've got this choke collar on it and a leash and they control our digital avatars.
00:35:16.320 We do not control them, and they control them simply because they went into a place where there's this unclaimed asset called data.
00:35:27.060 And they claimed ownership of all of it and we let them get away with it.
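One way to picture the pipeline Roger is describing: data from first-party products and third-party brokers gets merged into a per-person profile (the "digital avatar"), a model predicts behavior from that profile, and the recommendation layer then steers toward whatever the prediction says is most monetizable. The sketch below is a deliberately simplified, hypothetical illustration of that flow; the data sources, field names, and scoring are invented, not any company's real schema or model.

```python
# Hypothetical sketch of the "avatar -> prediction -> steering" flow
# described above. No real platform's data schema or model is implied.
from collections import defaultdict

def build_avatar(first_party: dict, third_party: dict) -> dict:
    """Merge data streams into one profile, regardless of where they came from."""
    avatar = defaultdict(list)
    for source in (first_party, third_party):
        for key, values in source.items():
            avatar[key].extend(values)
    return dict(avatar)

def predict_intent(avatar: dict) -> dict:
    """Toy behavioral prediction: count topic mentions across all streams."""
    scores = defaultdict(int)
    for values in avatar.values():
        for topic in values:
            scores[topic] += 1
    return dict(scores)

def steer(avatar: dict, inventory: dict) -> str:
    """Pick the item (ad, post, offer) whose topic the profile is most primed for."""
    intent = predict_intent(avatar)
    return max(inventory, key=lambda item: intent.get(inventory[item], 0))

avatar = build_avatar(
    first_party={"searches": ["mortgage", "knee pain"], "emails": ["mortgage"]},
    third_party={"purchases": ["ibuprofen"], "location": ["orthopedic clinic"]},
)
print(steer(avatar, {"refi_offer": "mortgage", "knee_brace_ad": "knee pain"}))
# -> "refi_offer": the profile, not the person, decided what gets shown
```

The design point of the toy is that the person being profiled never sees the merged avatar or the prediction; only the buyers of the steering do, which is the "perfect information on one side" asymmetry Roger keeps returning to.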
00:35:32.100 So on the one hand, you're talking about companies.
00:35:35.020 Let's just focus on Google and Facebook here.
00:35:37.160 I'm sure Twitter is involved as well, but I can't figure out how Twitter is functioning.
00:35:42.940 Microsoft and Amazon are the other guys who really do this.
00:35:45.700 Right.
00:35:46.280 OK, well, let's just let's just focus on the products you've already described here.
00:35:51.040 So Google rolls out Gmail and Maps and the user perception of this and search before them.
00:36:00.320 The user perception is this is adding immense value to our lives.
00:36:05.860 I mean, just to be able to navigate in a city based on, you know, accurate mapping data and to understand, you know, what streets to avoid because the traffic is so bad.
00:36:14.560 And this is what technology should be doing for us.
00:36:17.420 And, you know, Gmail, I was never a fan of of the idea of Gmail until I started getting spammed.
00:36:24.840 I don't know who put me on the on the devil's list.
00:36:26.920 But there was I woke up one day and I was literally getting ninety nine to one spam to real email and no spam detector could deal with it.
00:36:37.000 And I ran my email through Google servers and, you know, all the spam magically disappeared forever.
00:36:42.960 So I was immensely grateful for this.
00:36:45.580 And there are many other instances of this where if you're a user of Facebook, which I'm really not, I can imagine you like the fact that Facebook is serving you stuff that you find interesting.
00:36:59.320 But the general principle here is that everything that these platforms do that is good for a user or seems good for a user is really doubly good for advertisers.
00:37:12.800 Otherwise, they wouldn't do it.
00:37:14.180 That is the bottom line and what's so perverse about the incentives built into the business model.
00:37:19.740 Yeah. So the way I handicap it is this way.
00:37:23.740 If all they were doing was capturing the data that you put into the system in the form of the routes for your going to the office and back or the emails that you send or the photos or posts that you put on Facebook, everything I think would be fine.
00:37:41.800 The problem is there is also a third leg of the stool, which is the third-party market in our most personal data.
00:37:53.180 So this is our credit card transactions data, which is sold by Equifax, Experian, and TransUnion.
00:38:00.780 It is our location sold by our cellular carrier, but also captured through the APIs that Google has with companies like Uber.
00:38:09.040 It is wellness and health data captured from apps and devices that are outside the protection of HIPAA, the Health Insurance Portability and Accountability Act.
00:38:21.880 And it is also our browsing history, which can be freely acquired.
00:38:28.040 And, you know, to me, we've never asked the question, well, wait a minute, why is it legal for companies to do commerce in our most private data?
00:38:39.580 We've actually never agreed to that, right?
00:38:41.140 There's nothing that I can find in a credit card transaction that gives those people the right to sell that data.
00:38:46.920 They've simply asserted and no one has said no.
00:38:48.900 And we live in this really deregulated economic environment where the government, you know, in contrast to a normally highly functioning capitalist system, where the government would set the rules and then enforce them uniformly on everybody.
00:39:03.560 Well, it must be in their terms of service that nobody ever reads, right?
00:39:06.400 It's got to be in the fine print somewhere.
00:39:07.900 Well, hang on.
00:39:08.540 I don't have a business relationship with Experian or Equifax.
00:39:12.200 Right.
00:39:12.640 My relationship is with Visa.
00:39:14.660 Visa just runs the technology and Equifax actually handles the transaction processing.
00:39:20.180 I don't have a relationship with them.
00:39:22.860 Okay.
00:39:23.240 And so most of these guys have something buried in the terms of service, but I think on that one, I don't even know where it would show up, right?
00:39:31.600 And, you know, I can't imagine why it would be in Visa's interest to have that happen.
00:39:36.900 Also, they just have a monopoly.
00:39:38.720 You can't opt out of using credit cards or at least you can't do that easily.
00:39:42.220 Exactly.
00:39:42.580 Exactly.
00:39:43.460 And so my point is, if you take those three problems, the emotional component, the essential data capture and, you know, the claim of ownership.
00:39:54.100 So it's almost like they're acting like a government exercising a right of eminent domain, right?
00:40:02.520 They're claiming, okay, well, this data has touched our servers, therefore we own it forever and we can do whatever we want with it.
00:40:07.620 And then you've got these third parties who simply will trade your digital life to anybody who wants it.
00:40:14.960 So in that scenario, you wind up with this thing where the gatekeepers, in this case, Google, Facebook, Amazon, and maybe to a lesser degree Microsoft, can offer perfect information to marketers in exchange for access to all of their consumer customers.
00:40:32.620 And the consumers are in this extraordinary position of having their access to information limited by those people.
00:40:39.780 And my point here is not that Google or Facebook do not provide good value.
00:40:45.460 I think they provide tremendous value.
00:40:47.580 What I believe is true is that the value they're receiving in return is not only disproportionate, it's that they have the ability to influence our choices in ways that we are not aware of.
00:41:03.320 And they're taking our agency away. They do a lot of first-party gathering, that would be the Google apps, that would be the Facebook apps, and then they acquire data wherever it's available.
00:41:12.320 So they create this digital, high-resolution digital avatar for each and every one of us, and they sell access to that avatar.
00:41:22.540 That's the perfect information.
00:41:23.620 And so they're selling access for cash, right?
00:41:26.820 So they're getting paid.
00:41:27.740 That's why they're so immensely profitable, right?
00:41:30.500 And my simple observation is if you want to understand the nature of the relationship is ask yourself how much more value you get from Google Maps or Gmail today than you got, say, two years ago.
00:41:42.880 And then look at Google's average revenue per user over those two years and see how much more value they got from you.
00:41:51.880 And here's where the moral problem gets really dicey is there is no opt-out.
00:41:57.920 We all say, hey, my data's out there.
00:41:59.760 I don't care, and I'm an honest person.
00:42:02.340 And I sit there and go, that would be true if the only impact of the data was on you.
00:42:07.300 But I don't use Gmail.
00:42:08.560 And if I send an email to somebody in a Gmail account, it is being scanned by Google, and it is going into their behavioral prediction on lots of people, including me.
00:42:18.740 And I have no voice in that.
00:42:21.420 And there are hundreds of examples just like that all over the economy.
00:42:27.440 And so if you sit there and think that phase one was, again, improving the quality of the ad targeting, which is the thing you liked inside Facebook,
00:42:34.540 and phase two is using recommendation engines and filter bubbles and things like that to steer you to desired outcomes,
00:42:42.540 you're sitting there saying, ooh, I maybe don't like that quite so much.
00:42:46.080 Here's phase three.
00:42:47.960 Anyone who is a subscriber to things like the Financial Times has run into the Google Captcha system where they say, hey, we want to figure out if you're a robot.
00:42:57.500 So look at these photographs and touch all the photographs that have a traffic light or all the ones that have a bus.
00:43:03.840 And I think we've all seen that one degree or another.
00:43:06.120 And those things are getting harder and harder.
00:43:07.520 And we think, okay, well, they're just trying to figure out if we're human or not.
00:43:10.400 And, of course, that's not what you're doing at all.
00:43:13.200 What you're doing is training the artificial intelligence for Google's self-driving cars.
00:43:16.840 That's why it's getting harder because they're getting to corner cases now.
00:43:21.500 They've figured out you're a human because of the movement of your mouse.
00:43:24.440 Now, I assume that they're keeping a log of all of that.
00:43:29.180 And I assume that Amazon does the same thing and Facebook does the same thing,
00:43:33.620 which means that they may already be able to do this.
00:43:37.020 But if not, very soon, when my mouse movement becomes slower than it used to be and gets more wobbly,
00:43:44.720 that may be the first indication that I have a disease like Parkinson's.
00:43:50.180 Now, here's the problem, and this is a deep moral problem.
00:43:54.440 Whichever company captures that, whether it's Facebook, Google, Amazon, is under no obligation to tell me.
00:44:00.600 In fact, they're not even under an obligation to keep it private.
00:44:04.560 They are free, and all the incentives point to them selling it to the highest bidder,
00:44:10.220 which would almost certainly be my insurance company,
00:44:12.900 which would almost certainly raise my rates or cut off my coverage,
00:44:16.460 and I still wouldn't know I'd shown a symptom.
00:44:19.040 And I would simply point out that that same technology could be used in an insurance product
00:44:24.440 that simply said, pay us $10 a year, and if you ever show a symptom of a neurological problem,
00:44:30.420 we're going to let you know.
00:44:31.480 Like, you'll be the only one.
00:44:32.480 We'll be covered by HIPAA, and it will protect your secret and get you to a hospital quickly.
00:44:36.800 And, of course, all of this could be probabilistic, so it might not actually apply to you,
00:44:40.240 but it just has to apply to people like you in the aggregate to be worth trading in this data and acting on it.
00:44:46.840 Exactly.
00:44:47.400 And so the issue that we have here is that in a traditional advertising business,
00:44:51.440 we would say that you're not the customer, you're the product.
00:44:54.820 But in the model of these guys, in what Zuboff calls the surveillance capitalism,
00:45:02.360 you're not even the product.
00:45:04.320 You're the fuel.
00:45:05.100 Well, each one of us is a reservoir of data that they're pumping the data out of.
00:45:10.640 And I simply make the observation that their influence, if we simply look in democracy,
00:45:19.180 their influence on democracy in every country in which they operate is enormous,
00:45:24.840 that their code, their algorithms have so much more influence on our lives than the law does.
00:45:30.540 And yet they're not elected.
00:45:32.920 They're not accountable to anyone.
00:45:35.100 And that, from a democracy point, is a huge problem.
00:45:39.520 You have all the issues on public health where, you know,
00:45:42.720 why is it legal to even capture data, much less exploit it relative to minors under 18?
00:45:49.860 Yet Google has whole businesses in Chromebooks and YouTube that do precisely that.
00:45:56.840 And if you simply look at the imperatives created by the business model they have,
00:46:01.100 they sit there and their first rule of thumb is, well, we'll let any content go on there.
00:46:07.100 And then the users will be responsible for telling us when there's a problem.
00:46:11.940 So if, you know, a madman kills 50 people in New Zealand, the users have to tell us first.
00:46:17.340 And then when that didn't work politically, they said, okay, well, we'll have moderators who will sit and watch stuff.
00:46:25.680 But I would like to point out that all of these things happen after the fact.
00:46:30.120 And the reason they happen after the fact and the reason these guys are so insistent on doing it that way
00:46:34.460 is they want to capture what Zuboff calls the behavioral surplus,
00:46:42.100 the signals that come from the rawest parts of our psychology, right?
00:46:49.400 They want to strip the veneer of civility off us and get to that, you know,
00:46:53.260 what are our real underlying emotions?
00:46:56.660 What are our real biases?
00:46:58.500 And so they're going to fight us every step of the way.
00:47:01.280 I mean, obviously, you know, you had Renee DiResta on here, and Renee's completely brilliant.
00:47:07.960 And one of the things that she taught me is this notion that freedom of speech is not the same as freedom of reach
00:47:17.940 and that the issue here isn't censoring people.
00:47:22.480 The issue we're talking about is avoiding amplification of the most hostile voices in society.
00:47:31.280 And these platforms are platforms designed to provide status as a service.
00:47:38.500 And in that model, you're basically rewarding people for being more outrageous, more angry,
00:47:48.440 more disinformed, if you will, more conspiracy-oriented,
00:47:52.320 and then leaving it to the users, right, to clean up the mess.
00:48:00.100 And I got a problem with that.
00:48:02.100 And I just don't think that there's any amount of value that you can get from Google Maps or Gmail
00:48:09.840 or from Facebook or Instagram that offsets the damage that they're doing right now to society as a whole,
00:48:17.620 that individually we may love these products, and I don't dispute that, but they're causing huge harm.
00:48:22.480 And my basic point here is I believe we can get rid of the harm without having to eliminate what we like about the products.
00:48:28.920 They're going to be a lot less profitable, a lot less profitable, but tough noogies.
00:48:33.520 I mean, you know, companies, corporations are not allowed to destroy civilization just because it's more profitable than building civilization.
00:48:43.960 I want to zoom out for a second.
00:48:45.200 I want to talk about these specific problems more,
00:48:48.920 and I do want to get to the government's response and to possible solutions,
00:48:51.660 but I just want to zoom out for a second and talk about this basic quandary which I'm fairly sympathetic with.
00:49:01.340 I mean, the most charitable view of what these platforms are doing is simply thinking of themselves as platforms.
00:49:09.800 They insist, you know, we're platforms, we're not media companies.
00:49:12.820 And, you know, on its face, that seems like a legitimate distinction
00:49:18.240 which could absolve them of responsibility for the content that appears on their platform.
00:49:25.580 If you'd like to continue listening to this conversation, you'll need to subscribe at samharris.org.
00:49:33.300 Once you do, you'll get access to all full-length episodes of the Making Sense podcast,
00:49:37.380 along with other subscriber-only content,
00:49:39.140 including bonus episodes and AMAs and the conversations I've been having on the Waking Up app.
00:49:45.220 The Making Sense podcast is ad-free and relies entirely on listener support.
00:49:49.700 And you can subscribe now at samharris.org.
00:49:52.460 Thank you.
00:50:02.560 Amen.