Roger McNamee has been a Silicon Valley investor for 35 years. He has co-founded successful venture funds, including Elevation, where his co-founding partners included U2's Bono. He holds a BA from Yale and an MBA from the Tuck School of Business at Dartmouth. But of relevance today is that he was an advisor to Mark Zuckerberg very early on and helped recruit Sheryl Sandberg to Facebook. And he is now a very energetic critic of the company, and of Google, Amazon, and the other platforms. We focus on Facebook in particular, but this conversation is a very deep look at all that is going wrong with digital media and how it is making it harder and harder for us to make sense to one another. It's a growing problem that I've discussed many times on the podcast, but today's episode is an unusually deep dive. I was connected to Roger through Tristan Harris and Renée DiResta, who in an earlier episode gave us a fairly harrowing tour of Russian influence over our lives through social media and other hacking efforts. They have been allies of Roger's in his efforts to deal with this problem, and they really are the conscience of Silicon Valley. This is a really special episode, and I hope you enjoy it. Please consider becoming a supporter of the podcast by becoming a subscriber; it's made possible entirely through the support of our listeners. And now I bring you my conversation with Roger McNamee.
00:04:22.400So this is one year after the Apple II, three years before the IBM PC, and I think roughly 17 or 18 years before the Palm Pilot.
00:04:30.340He planted that seed in my head, and I couldn't get rid of it.
00:04:35.060And I spent four years trying to figure out how to become an engineer, discovered I was just terrible at it.
00:04:40.740And so I got a job being a research analyst covering the technology industry, and I arrived in Silicon Valley just before the personal computer industry started.
00:04:50.720And that was one of those moments of just pure dumb luck that can make a career and a lifetime, and in my case, it did both.
00:05:03.280I've followed the technology industry for a long, long period of time.
00:05:06.480And as I did this, like Zelig, I just wound up in the right place at the right time at a lot of different moments.
00:05:12.400Beginning in the mutual fund business in Baltimore at T. Rowe Price, but covering tech, traveling around with the computer industry as it formed.
00:05:20.200Then starting a firm inside Kleiner Perkins Caufield & Byers, the venture capital firm, in 1991.
00:05:24.860So I was actually in their office when the Internet thing happened.
00:05:30.160So the day Marc Andreessen brought Netscape in, the day that Jeff Bezos brought in Amazon, the day that Larry and Sergey brought in Google, those were all things that I got to observe.
00:05:41.200I wasn't the person who did them, but I was there when it happened.
00:05:44.480And that was, if you're an analyst, that's a perfect situation.
00:05:48.180And so in 2006, I had been in the business 24 years, and I get a phone call from the chief privacy officer at Facebook saying, my boss has got a crisis, and he needs to talk to somebody independent.
00:06:09.300It's about a year after the end of the storyline from The Social Network.
00:06:13.460The company is only available to high school students and college students with an authenticated email address, and there's no news feed yet.
00:06:32.880I need to give you two minutes of context for why I'm taking this meeting.
00:06:36.840And I said, if it hasn't already happened, either Microsoft or Yahoo is going to offer a billion dollars for Facebook.
00:06:46.940Keep in mind, the company had had $9 million in revenues before that, so a billion was huge.
00:06:51.260And I said, everybody you know, your mother and father, your board of directors, your management team, everybody's going to tell you to take the money.
00:06:57.660They're all going to tell you you can do it again, that with 650 million bucks at age 22, you can change the world.
00:07:02.500And I just want you to know, I believe that Facebook, because of authenticated identity and control of privacy, is going to be the first really successful social product, and that you can build a social network that will be more important than Google is today.
00:08:49.800And there was a key thing that I did, in addition to helping him get through the first problem. He didn't want to sell the company when he came into my office,
00:08:57.340but he was really afraid of disappointing everybody,
00:08:59.300and I helped him figure out how to deal with that.
00:09:01.940And then he needed to switch out his management team.
00:09:05.420And the key person I helped bring in was Sheryl Sandberg.
00:09:08.760And so you have to imagine the context for this thing is I'm a lifelong technology optimist.
00:09:14.340And I grew up in the era, I'm the same age as Bill Gates and Steve Jobs.
00:09:19.500So I grew up in the era where technology was something that made people's lives better and that we were all committed to changing the world in kind of a sort of hippie libertarian value system.
00:09:31.580And Mark appeared to me to be different from the other entrepreneurs.
00:09:35.800You know, I was not a fan of the PayPal mafia's approach.
00:09:38.980And I had consciously turned down some things where I really was philosophically out of line with the management teams.
00:09:46.780And, you know, I look at Peter Thiel and Elon Musk and Reid Hoffman as incredibly brilliant people who had ideas that transformed tech and transformed the world.
00:10:00.740But philosophically, I come from a different place.
00:10:03.500And so I wasn't so comfortable with them.
00:10:09.760And, you know, so what wound up happening is I retired from the investment business because it turned out that I guess I'd gotten past my philosophical sell-by date,
00:10:20.140that I was seeing too many businesses with strategies that I just couldn't sign up for, things that I knew would be successful, things like Uber and Spotify,
00:10:27.660where, you know, they delivered a lot of value to the customer, but only by causing some harm to other people in the chain.
00:11:21.000So between January 2016 and October, I saw election issues in the Democratic primary and in Brexit,
00:11:29.040where it was clear that Facebook had an influence that was really negative because it gave an advantage to inflammatory and hostile messages.
00:11:40.360And then I saw civil rights violations, a corporation that used the Facebook ad tools to scrape data on anybody who expressed interest in Black Lives Matter.
00:11:49.840And they sold that to police departments in violation of the Fourth Amendment.
00:11:53.280And then Housing and Urban Development, the government agency cited Facebook for ad tools that allowed violations of the Fair Housing Act,
00:12:00.460the very thing that Facebook just settled the civil litigation on in the past week.
00:12:06.740And so you have civil rights violations.
00:12:14.740And instead of publishing, I sent it to Mark Zuckerberg and Sheryl Sandberg on October 30th of 2016, so nine days before the election.
00:12:23.280And it basically says, I'm really, really concerned that Facebook's business model and algorithms allow bad actors to harm innocent people.
00:12:39.460It was meant to be an op-ed, so it's more emotional than I wish.
00:12:43.120If I'd had a chance to do it again, I would have rewritten it for them.
00:12:45.800But I wanted to get it in their hands because I was really afraid the company was the victim of essentially well-intended strategies producing unintended consequences.
00:13:55.680I'm trying to save you from, like, killing this business. You've got to do what Johnson & Johnson did when somebody put poison in bottles of Tylenol in Chicago in 1982, which is they took every bottle of Tylenol off the shelf until they could invent and deploy tamper-proof packaging.
00:14:14.880Even though they didn't put the poison in and weren't technically responsible, they did it anyway.
00:14:18.000And I thought Facebook could convert a potential disaster into a winning situation by opening up to the investigators and working with the people who used the product to understand what had happened.
00:14:32.240And for three months, I begged them to do this.
00:14:35.880And finally, I realized they were just never going to take it seriously.
00:14:40.100And that's when I went looking for, you know, like, I didn't have any data.
00:14:44.440I mean, Sam, you know how hard this is when you're talking to really, really smart technical people.
00:14:58.580And that changed everything because I was looking at this as an issue of civil rights and an issue of democracy.
00:15:05.520And Tristan's on 60 Minutes and he's talking about brain hacking and the use of manipulative techniques, persuasive technology to manipulate attention and create habits that become addictions.
00:15:19.300And then how that makes people vulnerable and how filter bubbles can be used to create enormous economic value, but at the same time, increase polarization and undermine democracy.
00:15:30.940And I had a chance to interview him on Bloomberg a couple days after the 60 Minutes thing.
00:15:38.920And I call him up immediately after the show's done and go, dude, do you need a wingman?
00:15:43.600Because I'm convinced he's like the messiah of this thing.
00:15:56.060And we literally both dropped everything we were doing and committed ourselves to seeing if we could stimulate a conversation.
00:16:03.540And it was really clear we were going to focus on public health because I was certain that Tristan's idea was the root cause of the problem.
00:16:15.920And the hilarious thing was, he may have told you this, but it began with going to the TED conference.
00:16:22.900Eli Pariser, the man who identified filter bubbles and wrote the amazing book about that, got Tristan onto the schedule of the TED conference two weeks before the conference itself.
00:17:33.980And so we're just like completely traumatized because we don't know anybody who's not in tech.
00:17:38.960And that's when a miracle occurred.
00:17:43.060So when Tristan was on 60 Minutes, the woman who did his makeup happened to be someone whose regular gig was doing the makeup for Arianna Huffington.
00:17:53.620And she called up Arianna, for whom she'd worked for a decade, and said, Arianna, I've never asked you to do this, but you need to meet this young man.
00:18:02.560And so she sets it up for Tristan to meet Arianna.
00:18:07.620So the two of us go to New York and Arianna takes Tristan under her wing, gets him onto Bill Maher, and introduces him to a gazillion other people.
00:18:16.560And, you know, so all of a sudden we go from not having any relationship at all.
00:18:20.620And then this purely beautiful woman, Brenda, who did Tristan's makeup, gets him on there.
00:18:27.460And she recurs in the story throughout because she did his makeup on Bill Maher.
00:18:32.100She did mine when I was on Bill Maher.
00:18:42.860Right. And she was the butterfly. And while Tristan's meeting with Arianna for the first time, I get an email from Jonathan Taplin, who wrote the book, Move Fast and Break Things.
00:18:56.940And Jonathan was a friend who had the first insight about the antitrust issues on Google, Facebook and Amazon and wrote a book about it in early 2017 that had really helped frame my ideas.
00:19:13.240And he sends me a card for an aide to Senator Mark Warner.
00:19:18.780And if you recall, in May of 2017, the only committee of Congress where the Democrats and Republicans were working together was the Senate Intelligence Committee, of which Mark Warner was the vice chair.
00:19:31.040So to get a card for somebody who was policy aide to him was a huge deal.
00:19:38.100And so I called him up and I said, I know your oversight mission is intelligence agencies, but is there anybody in Washington who's going to protect the 2018 elections from interference over social media?
00:19:52.080And, you know, it was clearly outside their jurisdiction.
00:19:57.500Anyway, he brings us to Washington to meet Warner because he goes, you're right.
00:20:02.640If it's not us, it's not going to happen.
00:20:04.680So we've got to find some way to get to it.
00:20:07.360And it took a couple of months to set up.
00:20:09.260And in between, we get a contact from Senator Elizabeth Warren, who has a hypothesis about the tech group that is really profoundly insightful, where the question she asks is, isn't this basically the same problem as the banks had in 2008, that one side, the powerful banks in that case, had perfect information, and their clients only had the information the banks were willing to give them?
00:20:37.000And she had this insight that Facebook and Google and, to a lesser extent, Amazon were doing exactly the same thing, that they were maintaining markets of perfect information on one side and starving the other side.
00:20:49.200So they were essentially distorting capitalism, really undermining the notion of capitalism, which requires at least some uncertainty on both sides to have a market.
00:20:57.360And, you know, using that in a monopolistic way, which, I mean, I was gobsmacked.
00:21:03.980I've been in the investment business for 35 years.
00:23:42.520It's just true that if it bleeds, it leads on some level.
00:23:46.000But this is integrated into Facebook's business model to an unusual degree.
00:23:51.880And yet, to hear you tell the story of your advising of Zuckerberg and your—I don't think you said it here,
00:23:59.620but it's in the book that you actually introduced Sandberg to him and facilitated that marriage.
00:24:05.420That was at a time when the ethical problem of this business model wasn't so obvious, to hear you tell it.
00:24:14.220I mean, were they having to confront this back in 2007 or not?
00:24:18.960Well, they were certainly not confronting it in any way that I was aware of.
00:24:23.460To be clear, in the early days of Facebook, they had one objective only, which was to grow the audience.
00:24:29.460There was really no effort made during the period I was engaged with them to build the business model.
00:24:35.940Sheryl's arrival was about putting in place the things to create the business model, but there was a great deal of uncertainty.
00:24:42.400In fact, Mark was initially very hesitant to hire Sheryl because he didn't believe that Google's model would apply or work at Facebook.
00:24:53.080And it turned out he was correct about that.
00:24:54.620So my perception of the model—I love the way you just described that.
00:25:00.560You know, the thing that I always try to explain to people is that when you think about filter bubbles and you think about if it bleeds, it leads.
00:25:11.320That whole notion's been with us for 150 years.
00:25:15.200But before Google and Facebook, it was always in a broadcast model.
00:25:19.500So when I was a kid, everybody my age saw the Kennedy funeral, the Beatles on Ed Sullivan, and the moon landing.
00:25:30.100And the filter bubble brought people together because we had a shared set of facts.
00:25:35.660And the complaint at the time was conformity, right?
00:25:40.140Because we all saw exactly the same thing.
00:25:42.840With Facebook and Google, they create this world of, in Facebook's case, across all their platforms, 3 billion Truman shows where each person gets their own world, their own set of facts with constant reinforcement,
00:26:02.520where they lure you onto the site with rewards, right, whether it's notifications or likes, to build a habit.
00:26:10.420And for many people, that turns into an addiction.
00:26:16.360When do you check your phone first thing in the morning?
00:26:18.660Is it before you pee or while you're peeing?
00:26:22.060Because everybody I know is one or the other.
00:26:24.500And, you know, we're all addicted to some degree.
00:26:27.240And then once you're coming back regularly, they have to keep you engaged.
00:26:31.120And this is the stuff that was not happening until roughly 2011. Before 2011, what they had to keep people engaged was Zynga, right?
00:26:44.280That was the big driver of usage time before 2011.
00:26:48.140But what they realized was that appealing to outrage and fear was much more successful than appealing to happiness, because one person's joy is another person's jealousy.
00:26:59.940Whereas if you're afraid or outraged, you share stuff in order to make other people also afraid or outraged because that just makes you feel better.
00:27:09.120And Tristan had this whole thing figured out.
00:27:11.840And, you know, we obviously shared that in Washington.
00:27:14.880And that was, you know, an important stimulus.
00:27:18.080But when I think about the problem, all of that is one piece of it, which is the manipulation of people's attention for profit and the natural divisiveness of using fear and outrage and filter bubbles that isolate people.
00:27:34.380So that, you know, if you start out anti-vax curious and they can get you into an anti-vax group, within a year you're going to be in the streets fighting vaccination.
00:27:55.500It's not a question of character or whatever.
00:27:58.140It's about the most basic evolutionary wiring.
00:28:02.500I just want to cover this ground again, not to be pedantic.
00:28:05.880But I do have this sense that there are many people who are skeptical that this is really even a problem or that there's something fundamentally new about this.
00:28:14.880So I just want to just cover a little bit of that ground again.
00:28:17.080You've used this phrase filter bubble a bunch of times.
00:28:20.620If I recall, that actually came from Eli Pariser's TED Talk, where many of us were first made aware of this problem.
00:28:28.960He might have mentioned Facebook, but I remember him putting it in terms of Google searches, where if you do a Google search for vaccines and I do one, we are not going to get the same search results.
00:28:40.560Your search history and all the other things you've done online are getting fed into an algorithm that is now dictating what Google decides to show you in any query.
00:28:53.780And the problem here is that, and I think it was Tristan who, no, either Tristan or Jaron Lanier, you might correct me here.
00:29:01.900One of them said, just imagine if when any one of us consulted Wikipedia, we got different facts, you know, however subtly curated to appeal to our proclivities on any topic we researched there.
00:29:16.980And there could be no guarantee that you and I would be seeing the same facts.
00:29:20.540That's essentially the situation we're in on social media and Google, and this is obviously the majority of anyone's consumption of information at this point.
00:29:33.080Exactly. And so if we take that as one part of the problem, so when Eli first talked about filter bubbles, he used both Google and Facebook and showed these examples and how essentially these companies were pretending to be neutral when in fact they were not, and they were not honest about it.
00:29:56.020So, you know, the Harvard scholar Shoshana Zuboff has a new book called The Age of Surveillance Capitalism, and there are some things in there where she spent a dozen years studying Google's business and gathering data about it.
00:30:15.440And in my book, which I wrote at the same time she was writing hers, so I was totally unaware of her work, I hypothesize a bunch of things, and Shoshana has data, so she's like, in my opinion, a god.
00:30:26.580But the core thing that Google did, and here's how the flow worked, because without this, what Facebook did would have been less harmful.
00:30:34.520But when you talk about the people who are skeptical of harm, when you see the Google piece, then the two of them together make it really clear.
00:30:41.140So Google begins like a traditional marketer, they have one product, it's 2002, the product is search, they're gathering data from their users in order to improve the quality of the product for those users, and they have an insight, which is that they only need a couple percent of the data they're gathering to improve the search engine.
00:30:58.980So they decide to figure out, is there any signal in the other 98%, and the insight is traditionally, I think, credited to Hal Varian, an economist at Google, that there was in fact predictive signal.
00:31:18.800So they could basically do behavioral prediction based on this stream of excess data that they were capturing from search results.
00:31:26.180And the signal wasn't hugely strong, because it was just from search.
00:31:29.860So they had the insight, we need to find out the identity of people.
00:31:33.920And then they did something incredibly bold.
00:31:36.580They create Gmail, which would have given them identity, which you could tie to the search queries, and therefore you'd know purchase intent and whose purchase intent it was.
00:31:46.560But the really bold thing they did was they decided they were going to scan every message.
00:31:52.280And they put ads into the Gmail, ostensibly to pay for it.
00:31:57.780But I think that was actually just duck food.
00:31:59.760This is a hypothesis of mine, that they knew people would squawk at the ads and force them to be removed.
00:32:05.400But once they were removed, people would stop complaining, and Google would still be scanning all the messages.
00:32:10.700So essentially, if you're looking for data for behavioral prediction, it's hard to get a better product than email for telling you what people are thinking.
00:32:21.460And for whatever reason, people who signed up for Gmail went along with this.
00:32:26.880So suddenly Google got this massive treasure trove of data about what people are going to do with their name and their search results to tie it to actual purchase intent.
00:32:36.460Then they decide, well, we need to know where they are.
00:33:22.940You know, people think they're playing a game, but they're really gathering data for Google.
00:33:26.620And when you put all these pieces together, you realize, oh, my gosh, the business model initially was about improving the targeting of the ads.
00:33:33.620But then they have the genius insight that with filter bubbles and with recommendation engines, they can take that market of behavioral prediction and increase the probability of a good outcome by steering people towards the outcomes that the predictions have suggested.
00:33:53.320And so that's how they use filter bubbles.
00:33:57.260And so the way to think about it is if you're a marketer today, Google and Facebook have all of your customers behind a paywall.
00:34:07.760But you can do this Faustian deal with these guys, which is you can get perfect information on these people as long as you're willing to do it on their terms.
00:34:17.740Now, the other side of that trade, if you're a consumer, the data you're getting is coming primarily from Google or Facebook, right?
00:34:26.560So if you take the emotional component of what Facebook has been doing and that whole thing with, you know, manipulation of attention and the notion of creating habits that become addictions and that inflaming of lizard brain emotions like outrage and fear and the use of disinformation and conspiracy theories to essentially get past people's civility.
00:34:52.600Civility is a mask and you want to strip people of that and get to their underlying reality because that's where all the behavioral prediction value is.
00:35:01.220And then you overlay onto that what Google was doing and you realize, oh, my God, these people have created digital avatars for each and every one of us.
00:35:10.060And they've got this choke collar on it and a leash and they control our digital avatars.
00:35:16.320We do not control them, and they control them simply because they went into a place where there was this unclaimed asset called data.
00:35:27.060And they claimed ownership of all of it and we let them get away with it.
00:35:32.100So on the one hand, you're talking about companies.
00:35:35.020Let's just focus on Google and Facebook here.
00:35:37.160I'm sure Twitter is involved as well, but I can't figure out how Twitter is functioning.
00:35:42.940Microsoft and Amazon are the other guys who really do this.
00:35:46.280OK, well, let's just focus on the products you've already described here.
00:35:51.040So Google rolls out Gmail and Maps, and search before them.
00:36:00.320The user perception is this is adding immense value to our lives.
00:36:05.860I mean, just to be able to navigate in a city based on, you know, accurate mapping data and to understand, you know, what streets to avoid because the traffic is so bad.
00:36:14.560And this is what technology should be doing for us.
00:36:17.420And, you know, Gmail, I was never a fan of the idea of Gmail until I started getting spammed.
00:36:24.840I don't know who put me on the devil's list.
00:36:26.920But I woke up one day and I was literally getting ninety-nine to one spam to real email, and no spam detector could deal with it.
00:36:37.000And I ran my email through Google's servers and, you know, all the spam magically disappeared forever.
00:36:45.580And there are many other instances of this where if you're a user of Facebook, which I'm really not, I can imagine you like the fact that Facebook is serving you stuff that you find interesting.
00:36:59.320But the general principle here is that everything that these platforms do that is good for a user or seems good for a user is really doubly good for advertisers.
00:37:14.180That is the bottom line and what's so perverse about the incentives built into the business model.
00:37:19.740Yeah. So the way I handicap it is this way.
00:37:23.740If all they were doing was capturing the data that you put into the system in the form of the routes you take to the office and back, or the emails that you send, or the photos or posts that you put on Facebook, everything, I think, would be fine.
00:37:41.800The problem is there is also a third leg of the stool, which is the third-party market in our most personal data.
00:37:53.180So this is our credit card transactions data, which is sold by Equifax, Experian, and TransUnion.
00:38:00.780It is our location sold by our cellular carrier, but also captured through the APIs that Google has with companies like Uber.
00:38:09.040It is wellness and health data captured from apps and devices that are outside the protection of HIPAA, the Health Insurance Portability and Accountability Act.
00:38:21.880And it is also our browsing history, which can be freely acquired.
00:38:28.040And, you know, to me, we've never asked the question, well, wait a minute, why is it legal for companies to do commerce in our most private data?
00:38:39.580We've actually never agreed to that, right?
00:38:41.140There's nothing that I can find in a credit card transaction that gives those people the right to sell that data.
00:38:46.920They've simply asserted it, and no one has said no.
00:38:48.900And we live in this really deregulated economic environment, in contrast to a normally highly functioning capitalist system, where the government would set the rules and then enforce them uniformly on everybody.
00:39:03.560Well, it must be in their terms of service that nobody ever reads, right?
00:39:06.400It's got to be in the fine print somewhere.
00:39:23.240And so most of these guys have something buried in the terms of service, but I think on that one, I don't even know where it would show up, right?
00:39:31.600And, you know, I can't imagine why it would be in Visa's interest to have that happen.
00:39:43.460And so my point is, if you take those three problems, the emotional component, the essential data capture and, you know, the claim of ownership.
00:39:54.100So it's almost like they're acting like a government exercising a right of eminent domain, right?
00:40:02.520They're claiming, okay, well, this data has touched our servers, therefore we own it forever and we can do whatever we want with it.
00:40:07.620And then you've got these third parties who simply will trade your digital life to anybody who wants it.
00:40:14.960So in that scenario, you wind up with this thing where the gatekeepers, in this case, Google, Facebook, Amazon, and maybe to a lesser degree Microsoft, can offer perfect information to marketers in exchange for access to all of their consumer customers.
00:40:32.620And the consumers are in this extraordinary position of having their access to information limited by those people.
00:40:39.780And my point here is not that Google or Facebook do not provide good value.
00:40:45.460I think they provide tremendous value.
00:40:47.580What I believe is true is that the value they're receiving in return is not only disproportionate, it's that they have the ability to influence our choices in ways that we are not aware of.
00:41:03.320And they're taking our agency away. They do a lot of first-party gathering, that would be the Google apps and the Facebook apps, and then they acquire data wherever it's available.
00:41:12.320So they create this digital, high-resolution digital avatar for each and every one of us, and they sell access to that avatar.
00:41:27.740That's why they're so immensely profitable, right?
00:41:30.500And my simple observation is, if you want to understand the nature of the relationship, ask yourself how much more value you get from Google Maps or Gmail today than you got, say, two years ago.
00:41:42.880And then look at Google's average revenue per user over those two years and see how much more value they got from you.
00:41:51.880And here's where the moral problem gets really dicey: there is no opt-out.
00:42:08.560And if I send an email to somebody in a Gmail account, it is being scanned by Google, and it is going into their behavioral prediction on lots of people, including me.
00:42:21.420And there are hundreds of examples just like that all over the economy.
00:42:27.440And so if you sit there and think that phase one was, again, improving the quality of the ad targeting, which is the thing you liked inside Facebook,
00:42:34.540and phase two is using recommendation engines and filter bubbles and things like that to steer you to desired outcomes,
00:42:42.540you're sitting there saying, ooh, I maybe don't like that quite so much.
00:42:47.960Anyone who is a subscriber to things like the Financial Times has run into the Google CAPTCHA system, where they say, hey, we want to figure out if you're a robot.
00:42:57.500So look at these photographs and touch all the photographs that have a traffic light or all the ones that have a bus.
00:43:03.840And I think we've all seen that to one degree or another.
00:43:06.120And those things are getting harder and harder.
00:43:07.520And we think, okay, well, they're just trying to figure out if we're human or not.
00:43:10.400And, of course, that's not what's really going on at all.
00:43:13.200What you're doing is training the artificial intelligence for Google's self-driving cars.
00:43:16.840That's why it's getting harder because they're getting to corner cases now.
00:43:21.500They've figured out you're a human because of the movement of your mouse.
00:43:24.440Now, I assume that they're keeping a log of all of that.
00:43:29.180And I assume that Amazon does the same thing and Facebook does the same thing,
00:43:33.620which means that they may already be able to do this.
00:43:37.020But if not, very soon, when my mouse movement becomes slower than it used to be and gets more wobbly,
00:43:44.720that may be the first indication that I have a disease like Parkinson's.
00:43:50.180Now, here's the problem, and this is a deep moral problem.
00:43:54.440Whichever company captures that, whether it's Facebook, Google, Amazon, is under no obligation to tell me.
00:44:00.600In fact, they're not even under an obligation to keep it private.
00:44:04.560They are free, and all the incentives point to them selling it to the highest bidder,
00:44:10.220which would almost certainly be my insurance company,
00:44:12.900which would almost certainly raise my rates or cut off my coverage,
00:44:16.460and I still wouldn't know I'd shown a symptom.
00:44:19.040And I would simply point out that that same technology could be used in an insurance product
00:44:24.440that simply said, pay us $10 a year, and if you ever show a symptom of a neurological problem,
00:48:02.100And I just don't think that there's any amount of value that you can get from Google Maps or Gmail
00:48:09.840or from Facebook or Instagram that offsets the damage that they're doing right now to society as a whole,
00:48:17.620that individually we may love these products, and I don't dispute that, but they're causing huge harm.
00:48:22.480And my basic point here is I believe we can get rid of the harm without having to eliminate what we like about the products.
00:48:28.920They're going to be a lot less profitable, a lot less profitable, but tough noogies.
00:48:33.520I mean, you know, companies, corporations are not allowed to destroy civilization just because it's more profitable than building civilization.