Tristan Harris has been called by The Atlantic the closest thing Silicon Valley has to a conscience. He was a design ethicist at Google, and then left the company to start Time Well Spent, a movement whose purpose is to align technology with our deepest interests. Tristan was recently profiled on 60 Minutes. He graduated from Stanford with a degree in computer science, having focused on human-computer interaction, and he's worked at various companies, including Apple, Wikia, and Google. In this episode, I chat with Tristan about what technology is doing to the human mind, and how we can move intentionally toward something healthier. We don't run ads on the podcast, and it is therefore made possible entirely through the support of our subscribers. If you enjoy what we're doing here, please consider supporting the podcast by becoming a subscriber; you'll get access to all full-length episodes and other subscriber-only content. Thanks for listening. -Sam Harris
00:02:51.600But what you're doing is fascinating and incredibly timely, given our dependence on this technology.
00:02:56.980And I think this conversation we're going to have, I'm imagining it's going to be something like a field guide to what technology is doing to the human mind.
00:03:06.800I think we'll talk about how we can decide to move intentionally in that space of possibilities in a way that's healthier for all of us.
00:03:15.360And this is obviously something you're focused on.
00:03:17.040But to bring everyone up to speed, because even I was not up to speed until just a few days ago, what is your background?
00:03:25.240And I've heard you've had some very interesting job titles at Google, perhaps among other places.
00:03:29.980One was the resident product philosopher and design ethicist at Google.
00:03:35.640So, how did Tristan Harris get to be Tristan Harris?
00:03:46.380So, yeah, my role at Google, that was an interesting name.
00:03:50.380So, design ethicist and product philosopher. I was really interested in, essentially, how a small number of people in the tech industry, you know, influence how a billion people think every day, without those people even knowing it.
00:04:04.200If you think about your role as a designer, how do you ethically steer a billion people's thoughts, framings, cognitive frames, behavioral choices, basically the schedule of people's lives?
00:04:16.040And so much of what happens on a screen, even though people feel as if they're making their own choices, will be determined by the design choices of the people at Apple and Google and Facebook.
00:04:24.920So, we'll talk, I'm sure, a lot more about that.
00:04:26.840I guess prior to that, when I was a kid, I was a magician very early.
00:04:30.980And so, I was really interested in the limits of people's minds that they themselves don't see, because that's what magic is all about.
00:04:37.640That there really is a kind of band of attention or short-term memory or ways that people make meaning or causality that you can exploit as a magician.
00:04:48.680And that had me fascinated as a kid, and I did a few little magic shows.
00:04:53.680And then, flash forward, when I was at Stanford, I did computer science, but I also studied as part of a lab called the Persuasive Technology Lab with BJ Fogg,
00:05:03.120which basically taught engineering students this kind of library of persuasive techniques and habit-formation techniques in order to build more engaging products.
00:05:14.160Basically, different ways of taking advantage of people's cognitive biases so that people fill out email forms,
00:05:20.420so that people come back to the product, so that people register a form, so that they fill out their LinkedIn profiles, so that they tag each other in photos.
00:05:27.000And I became aware, when I was at Stanford doing all this, that there was no conversation about the ethics of persuasion.
00:05:35.340And just to ground how impactful that cohort was, in my year in that class in the Persuasive Technology Lab,
00:05:43.480my project partners in that class and very close friends of mine were the founders of Instagram.
00:05:48.840And many other alumni of that year in 2006 actually went on to join the executive ranks at many companies we know, LinkedIn and Facebook.
00:05:59.100And again, never before in history have such a small number of people with this tool set influenced how people think every day by explicitly using these persuasive techniques.
00:06:09.360And so at Google, I just got very interested in how we do that.
00:06:13.680And so you were studying computer science at Stanford?
00:06:15.820Originally computer science, but I dabbled a ton in linguistics and actually symbolic systems.
00:06:35.380So yeah, so persuasion, the connection to magic is interesting.
00:06:38.660There's an inordinate number of magicians and fans of magic in the skeptical community as well, perhaps somewhat due to the influence of James Randi.
00:06:48.620But I mean, magic is really the ultimate act of persuasion.
00:06:53.040You're persuading people of the impossible.
00:06:55.580So you see a significant overlap between the kinds of hacks of people's attention that magicians rely on and our new persuasive technology.
00:07:04.940Yeah, I think, well, I think if you just abstract away what persuasion is, it's the ability to do things to people's minds such that they themselves won't even see how that process took place.
00:07:16.240And I think that parallels your work in a big way in that beliefs do things.
00:07:20.800To have a belief shapes the subsequent experience of what you have.
00:07:24.420I mean, in fact, in magic, there are principles where, you know, you kind of want to start bending reality and creating these aha moments so that you can do a little hypnosis trick later, for example, so that people will be more likely to believe, having gone through a few things that have kind of bent their reality into being more superstitious or more open.
00:07:41.780And there's just so many ways of doing this that most people don't really recognize.
00:07:46.840I wrote an article called How Technology Hijacks Your Mind that ended up going viral to about a million people.
00:07:53.020And it goes through a bunch of these different techniques.
00:07:55.260But yeah, that's not something people mostly think about.
00:07:57.760You also said in the setup for this interview that you have an interest in cults.
00:08:03.060And to what degree have you looked at cults?
00:08:06.440Well, I find cults fascinating because they're kind of like vertically integrated persuasive environments.
00:08:13.660Instead of just persuading someone's behavior or being the design of a supermarket or the design of, you know, a technology product, you are designing the social relationships, the power dynamic between a person standing in front of an audience.
00:08:31.540You can control many more of the variables.
00:08:34.420And so I've done a little bit of sort of undercover investigation of some of these things.
00:09:04.960But one of the interesting things, actually, is the way that people I met in those cults, who eventually left, later talked about their experience and the confusion that you face.
00:09:16.520And I know this is an interest you've had.
00:09:18.080The confusion that you face when you've gotten many benefits from a cult.
00:09:23.840You've actually deprogrammed, let's say, early childhood traumas or identities that you didn't know you were holding or different ways of seeing reality that they helped you, you know, get away from.
00:09:34.780And you get these incredible benefits and you feel more free, but then you also realize that was all part of this larger persuasive game to get you to spend a lot of money on classes or courses or these kinds of things.
00:09:45.000And so there's the confusion that I think people experience in knowing that they got all these benefits, but then also felt manipulated.
00:09:53.180And they don't know in the sort of mind's natural black and white thinking how to reconcile those two facts.
00:09:58.780I actually think there's something parallel there with technology.
00:10:01.200Because, for example, in my previous work on this, a lot of people, if you're criticizing how technology is designed, will say something like, oh, you're saying Facebook's bad, but look, I get all these benefits from Facebook.
00:10:12.660Look at all these great things it does for me.
00:10:14.280And it's because people's minds can't hold on to both truths: that we do derive lots of value from Facebook, and that there are many manipulative design techniques across all these products that are not really on your team to help you live your life.
00:10:30.200And that distinction is very interesting when you start getting into what ethical persuasion is.
00:10:36.420Yeah, it is a bit of a paradox because you can get tremendous benefit from things that are either not well-intentioned or just objectively bad for you or not optimal.
00:10:48.920The ultimate case is you hear from all these people who survived cancer and cancer was the most important thing that ever happened to them.
00:10:56.460So a train wreck can be good for you on some level because your response to it can be good for you.
00:11:02.100You can become stronger in all kinds of ways, even by being mistreated by people.
00:11:07.860But it seems to me that you can always argue that there's probably a better way to get those gains.
00:11:14.800Frankly, this connects with your work on the moral landscape. When you're a designer at Facebook or at Google, because of how frequently people turn to their phone, you're essentially scheduling these little blocks of people's time.
00:11:29.140If I immediately notify you for every Snapchat message, and Snapchat is one of the most abusive, most manipulative of the technology products, when you see a message from a friend in that moment, urgently, that will cause a lot of people to swipe over and not just see that message, but then get sucked into all the other stuff that the app has been saving up for you.
00:11:50.480Right. And that's all very deliberate. And so if you think of it as, let's say you're a designer at Google and you want to be ethical and you're steering people towards these different timelines, you're steering people towards schedule A in which these events will happen or schedule B in which these other events will happen.
00:12:04.520You know, back to your point, should I schedule something that you might find really challenging or difficult, but that later you'll feel is incredibly valuable?
00:12:12.960Do I take into account the peak-end effect, where people will have a peak of an experience and an end? Do I take a lot of their time or a little bit of their time? Should the goal be to minimize how much time people spend on the screen?
00:12:23.580What is the value of screen time and what are people doing that's lasting and fulfilling? And when are you steering people as a designer towards choices that are more shallow or empty?
00:12:34.520You're clearly concerned about time, as we all should be. It's the one non-renewable resource. It's the one thing we can't possibly get back any of, no matter what other resources we marshal.
00:12:46.820And it's clear that our technology, especially smartphone-based technology, is just a kind of bottomless sink of time and attention.
00:12:58.020I guess there's the other element that we're going to want to talk about, which is the consequence of bad information or superficial information and just what it's doing to our minds.
00:13:09.800I mean, the fake news phenomenon being of topical interest. But just the quality of what we're paying attention to is crucial. But the automaticity of this process, the addictiveness of this process, the fact that we're being hooked and we're not aware of how calculated this intrusion into our lives is.
00:13:29.140So this is the thing that's missing, that people don't realize, because the most common narrative, and we hear this all the time, is that technology is neutral and it's just up to us to choose how we want to use it.
00:13:39.680And if people spread fake news or if people start wasting all their time, that's just people's responsibility.
00:13:45.860What this misses is that, because of the attention economy, basically every business, whether it's a meditation app or the New York Times or Facebook or Netflix or YouTube, is competing for attention.
00:13:58.860The way you win is by getting someone's attention and by getting it again tomorrow and by extending it for as long as possible.
00:14:05.580So it becomes this arms race for getting attention. And the best way to get attention is to know how people's minds work so that you can basically push some buttons and get them to not just come, but then to stay as long as possible.
00:14:17.980So there are design techniques like making a product more like a slot machine that has a variable schedule reward.
00:14:24.440So, you know, for example, I know you use Twitter, you know, when you land on Twitter, notice that there's that extra variable time delay between like one and three seconds before that little number shows up.
00:14:34.560You have a return and the page loads. There's this extra delay.
00:14:39.880Yeah. Hold your breath. And then there's a little number that shows up for the notifications.
00:14:43.480And that delay makes it like a slot machine. When you load the page, it's as if you're pulling a lever, and you're waiting, and you don't know how many there are going to be.
00:14:51.840Are there going to be 500 because of some big tweet storm, or is there going to be... doesn't it always say 99?
00:14:57.460Well, not everyone is Sam Harris and has so many.
00:15:00.480No, no. But I mean, isn't that always the maximum? It never says 500, right?
00:15:04.180You know, I don't know, because again, I'm not you. I don't have as many followers.
00:15:08.260Well, I think I can attest to that. I mean, mine is always at 99. So it's no longer salient to me.
00:15:13.180Well, right. Which actually speaks to how addictive variable rewards work, which is the point is it has to be a variable reward.
00:15:18.120So the idea that I push a lever or pull a lever and sometimes I get, you know, two and sometimes I get nothing and sometimes I get, you know, 20.
00:15:26.720And this is the same thing with email.
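To make the "slot machine" mechanic concrete, here is a minimal Python sketch of the two ingredients just described: an artificial suspense delay of one to three seconds, and a variable, unpredictable payout. The delay range and notification counts are illustrative assumptions, not Twitter's actual implementation.

```python
import random
import time

def show_notification_badge():
    """Sketch of the variable-reward pattern described above."""
    # Artificial suspense delay: the extra one-to-three seconds
    # before the little number shows up.
    time.sleep(random.uniform(1.0, 3.0))
    # Variable-ratio reward: sometimes nothing, sometimes 2, sometimes 20.
    count = random.choice([0, 0, 2, 5, 20])
    print(f"You have {count} new notifications")

show_notification_badge()
```

The unpredictability is the point: a fixed payout on a fixed schedule would not produce the compulsive lever-pulling the slot-machine analogy describes.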
00:15:28.520Well, let's talk about what is the interest of the company, because I think most people are only dimly aware.
00:15:35.620I mean, they're certainly aware that these companies make money off of ads very often.
00:15:41.040They sell your data. So your attention is their resource.
00:15:46.060Yep. But take an example. I mean, something like Twitter seemingly can't figure out how to make money yet.
00:15:51.320But Facebook doesn't have that problem. Let's take the clearest case.
00:15:54.360What is Facebook's interest in you as a user?
00:15:59.660Well, obviously, there are many sources of revenue, but whether it's data or anything else,
00:16:06.980it all comes down to advertising and time, because of the link that more of your attention or more of your time equals more money.
00:16:15.440They have an infinite appetite for getting more of your time.
00:16:23.120That's right. And this is literally how the metrics and the dashboards look.
00:16:25.780I mean, they measure what is the current sort of distribution of time on site.
00:16:30.060Time on site and seven-day actives are the currency of the tech industry.
00:16:34.680And so the only other industry that measures users that way is sort of drug dealers, right, where you have the number of active users who log in every single day.
00:16:43.680So that combined with time on site are the key principal metrics.
00:16:47.600And the whole goal is to maximize time on site.
00:16:50.640So Netflix wants to maximize how much time you spend there.
00:16:53.020YouTube wants to maximize time on site.
00:16:54.960They recently celebrated people watching more than a billion hours a month.
00:16:58.720And that was a goal and not because there's anyone who's evil or who, you know, wants to steal people's time.
00:17:05.500But because of the business model of advertising, there is simply no limit on how much attention that they would like from people.
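Since time on site and seven-day actives come up here as the "currency" metrics, a minimal sketch of how they might be computed may help. The session-log format below is a hypothetical illustration, not any company's actual schema.

```python
from collections import defaultdict
from datetime import datetime, timedelta

# Hypothetical session log: (user_id, session_start, session_end).
events = [
    ("alice", datetime(2017, 4, 1, 9, 0), datetime(2017, 4, 1, 9, 25)),
    ("alice", datetime(2017, 4, 3, 21, 0), datetime(2017, 4, 3, 21, 40)),
    ("bob",   datetime(2017, 3, 20, 8, 0), datetime(2017, 3, 20, 8, 5)),
]

def time_on_site(events):
    """Total time each user spent on the product."""
    totals = defaultdict(timedelta)
    for user, start, end in events:
        totals[user] += end - start
    return dict(totals)

def seven_day_actives(events, as_of):
    """Users with at least one session in the last seven days."""
    cutoff = as_of - timedelta(days=7)
    return {user for user, start, _ in events if start >= cutoff}

print(time_on_site(events))
print(seven_day_actives(events, as_of=datetime(2017, 4, 4)))
```

Neither function is sinister by itself; the conversation's point is that product teams are rewarded for pushing these two numbers up without limit.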
00:17:12.260Well, they must be concerned about the rate at which you click through to their ads or are they not?
00:17:19.620And ad rates are depreciating.
00:17:22.100But because they can make money simply by showing you the thing, and there is some link between showing it to you and you clicking, you can imagine that with more and more targeted ads, you are seeing things that are profitable.
00:17:32.720And there's always going to be someone willing to pay for that space.
00:17:36.240But this means that as attention starts to saturate, because we only have so much time, what do you do to even hold on to your position in the attention economy?
00:17:44.580You have to ratchet up how persuasive you are.
00:18:17.640And these companies often hide behind this notion that if you don't like it, you can stop using the product.
00:18:23.140But while they're saying that, I mean, they have teams of thousands of engineers whose job is to deploy these techniques I learned at the Persuasive Technology Lab to get you to spend as much time as possible.
00:18:32.420But just take that one example: let's say YouTube adds autoplaying the next video.
00:19:06.760And the reason, though, that they're doing that, what people miss about this is it's not by accident.
00:19:10.440The web and all of these tools will continue to evolve to be more engaging and to take more time because that is the business model.
00:19:17.800And so you end up in this arms race for essentially who's a better magician, who's a better persuader, who knows these backdoors in people's minds as a way of getting people to spend more time.
00:19:27.280Now, do you see this as intrinsically linked to the advertising model of revenue?
00:19:33.420Or would this also be a problem if it was a subscription model?
00:19:38.260It's a problem in both cases, but advertising exacerbates the problem.
00:19:42.040So you're actually right that, for example, Netflix also maximizes time on site.
00:19:48.400What I heard from someone through some back channels was that the reason they have to do this is that they found that if they don't maximize time on site, they lose out; for example, they have this auto-countdown into watching the next episode.
00:20:02.840Try to figure that out, psychologists among you.
00:20:05.640Well, and this is where it gets down to what is ethical persuasion because that's a one persuasive transaction where they are persuading you to watch the next video.
00:20:13.320But in that case, you're happy about it.
00:20:15.160I guess the reason why I'm happy about it there is that, at least nine times out of 10,
00:20:20.560it is, by definition, something I want to watch, because it's in the same series as the one I'm already watching, right?
00:20:26.480Whereas YouTube is showing me just some random thing that they think is analogous to the thing I just watched.
00:20:31.480And then when you're talking about Facebook, or I guess I've seen this feature on embeds in news stories like in the Atlantic or Vanity Fair, the moment you bring the video into the frame of the browser, it'll start playing.
00:20:44.060I just find that annoying, especially if your goal is to read the text rather than watch the video.
00:20:48.320Yep. But again, there's this because of the game theory of it, when one news website evolves that strategy, you can think of these as kind of organisms that are mutating new persuasive strategies that either work or not at holding on to people's attention.
00:21:00.520And so you have some neutral playing field, and one guy mutates this strategy on the news website of autoplaying that video when you land.
00:21:07.680So now the other news websites, if they want to compete with that, they have to follow. And assuming that CNN has enough market share that it makes a difference, the other ones have to start trending in that direction.
00:21:17.960And this is why the internet has moved from being this neutral feeling resource where you're kind of just accessing things to feeling like there's this gravitational wormhole suck kind of quality that pulls you in.
00:21:30.540And this is what I think is so important.
00:21:31.740You asked, you know, how much of this is due to advertising and how much of it is due to the hyper competition for attention.
00:21:38.900One is we have to be able to decouple the link between how much attention we get from you and how much money we make.
00:21:46.720And we actually did the same thing, for example, in energy markets, where it used to be that the energy companies made more money the more energy you used.
00:21:55.540And so therefore they have an incentive.
00:21:56.820They want you to please leave the lights on.
00:22:00.620We're making so much more money that way.
00:22:02.540But of course, that was a perverse incentive.
00:22:03.960And so this new regulatory commission got established that basically decoupled (it was called decoupling) the link between how much energy you use and how much money they make.
00:22:17.240Well, and there's some ads online, I can't even figure out how they're working or why they're there.
00:22:23.500There are these horrible ads at the bottom of even the most reputable websites like the Atlantic, you'll have these ads.
00:22:31.140I think usually they're framed with, you know, from around the web and it'll be an ad like, you won't believe what these child celebrities look like today.
00:22:40.840Yeah, Taboola and Outbrain, there's a whole actual kind of market of companies that specifically provide these related links at the bottom of news websites.
00:22:49.260But I mean, they're so tawdry and awful.
00:22:51.500I mean, you can go from just, you know, reading literally the best long form journalism and hit just one garish ad after another.
00:23:02.600But the thing that mystifies me is when you click through to these things, I can't see that it ever lands at a product that anyone who was reading that article would conceivably buy.
00:23:13.240I mean, you're just going down the sinkhole into something horrible.
00:23:20.680So I actually know a lot about this, because the way I arrived at Google was that they bought our little startup company for our talent.
00:23:27.800And we didn't do what this sort of market of websites did.
00:23:31.620But we were almost being pushed by publishers who used our technology to do that.
00:23:36.340So one of the reasons I'm so sensitive to this time on site stuff is because I had a little company called Apture, which provided little in-depth background pieces of information without making you leave news websites.
00:23:48.300So you'd be on The Economist and it would talk about Sam Harris and you'd say, who's Sam Harris?
00:23:52.760You'd highlight it and we'd give you sort of a multimedia background or thing and you could interactively explore and go deeper.
00:24:05.180And the reason we sold this, the reason why The Economist wanted it on their website, is because it increased time on site.
00:24:05.180And so I was left in this dilemma where the thing that I got up to do in the morning as a founder was, let's try to help people understand things and learn about things.
00:24:13.060But then the actual metric was, is this increasing our time on site or not?
00:24:17.760And publishers would push us to either increase revenue or increase time on site.
00:24:22.280And so the reason that The Economist and all these other even reputable websites have these bucket of links at the bottom is because they actually make more money from Taboola and Outbrain and a few others.
00:24:32.060Now, time on site seems somewhat insidious as a standard, except if you imagine that the content is intrinsically good.
00:24:43.140Now, I'm someone who's slowly but surely building a meditation app, right?
00:24:47.520So now time on my app will be time spent practicing meditation.
00:24:52.980And so insofar as I think that's an intrinsically good thing for someone to be doing, anything I do in the design of the app so as to make that more attractive to do, and in the best case, irresistible to do, seems justified, right?
00:25:08.720I mean, the truth is, I would like an app in my life that got me to do something that is occasionally hard to do, but I know is worth doing and good for me to do.
00:25:18.840Rather than waste my time on Twitter, something like meditation, something like exercise, eating more wisely.
00:25:25.920I don't know how that can be measured in terms of time, but there are certain kinds of manipulations, speaking personally, of my mind that I would happily sign up for, right?
00:25:38.880So because of the attention economy constantly ratcheting up these persuasive tricks, the price of entry for, say, a new meditation app is you're going to have to try and find a way to sweeten that front door so that that competes with the other front doors that are on someone's screen at the moment when they wake up in the morning.
00:25:57.600And of course, you know, even knowing what I know, and I think many of us don't like to do this, the Twitter and the Facebook and the email ones are just so compelling first thing in the morning, even if that's not what we'd like to be doing.
00:26:08.560And so because all of these different apps are neutrally competing on the same playing field for morning attention, and not on some specific goal, like helping Sam wake up best in the morning, your meditation app, and many meditation apps I personally know of, have to provide these notifications.
00:26:26.840So they start realizing, oh, shoot, Facebook and Twitter are notifying people first thing in the morning to get their attention.
00:26:32.380So if we're going to stand a chance to get in the game, we have to start notifying people.
00:26:36.400Right. And then everyone starts, again, amping up the arms race, and you end up with this classic race to the bottom.
00:26:42.520You don't end up with, you know, a screen you want to wake up to in the morning at all.
00:26:46.140It's not good for anybody, but it all came from this need to basically get there first.
00:26:53.560And so wouldn't we want to change the structure of what you're competing for?
00:26:56.540So it's not just attention at all cost.
00:26:58.100So, yeah, so you have called for what I think you've called a Hippocratic oath for software designers.
00:27:06.820What do you think designers should be doing differently now?
00:27:11.840Well, I think of it less as the Hippocratic oath.
00:27:14.020That's the thing that got captured in the Atlantic article.
00:27:16.700But a different way to think about it is that the attention economy is like this city.
00:27:22.920You know, essentially, Apple and Google and Facebook are the urban planners of this city that a billion people live inside of.
00:27:28.080And we all live inside of it, like a billion people live inside of this attention city.
00:27:33.260And in that city, it's designed entirely for commerce.
00:27:37.120It's maximizing basically attention at all costs.
00:27:40.120And that was fine when we first got started.
00:27:43.300But now this is a city that people live inside of.
00:27:47.300I mean, the amount of time people spend on their phone, they wake up with them, they go to sleep with them, they check them 150 times a day.
00:27:52.920That's actually a real figure, too, right?
00:27:54.220150 times a day is a real figure, for sure, yeah.
00:27:56.580And so now what we'd want to do is organize that city, almost like, you know, Jane Jacobs created this sort of livable cities movement and said, you know, there are things that make a great city great.
00:28:08.660There are things that make a city livable.
00:28:10.560You know, she pointed out Eyes on the Street, you know, Stoops in New York.
00:28:14.360She was talking about Greenwich Village.
00:28:16.000These are things that make a neighborhood feel different, feel more homey, livable, safe.
00:28:21.400These are values people have about what makes a good urban planned city.
00:28:26.000There is no set of values to design this city for attention.
00:28:31.140So far, it's been this Wild West, let each app compete on the same playing field to get attention at all costs.
00:28:38.580So when you ask me, what should app designers do?
00:28:41.800I'm saying it's actually a deeper thing.
00:28:43.740That's like saying, what should the casinos who are all building stuff in the city do differently?
00:28:47.900If a casino is there and the only way for it to even be there is to do all the same manipulative stuff that the other casinos are doing, it's going to go out of business if it doesn't do that.
00:28:57.220So the better question to ask is, how would we reorganize the city by talking to the urban planners, by talking to Apple, Google, and Facebook to change the basic design?
00:29:08.540And one of the zones in the attention economy city would be the morning habits zone.
00:29:13.020So now you just get things competing for what's the best way to help people wake up in the morning, which could also include the phone being off, right?
00:29:20.720The option of the phone being off for a period of time, and telling your friends that you're not up until 10 in the morning or whatever, could be one of the things competing for the morning part of your life in that zone.
00:29:33.300And that would be a better strategy than trying to change meditation app designers to take a Hippocratic Oath to be more responsible when the whole game is just not set up for them to succeed.
00:29:44.680Well, to come back to that question, because it's of personal interest to me, because I do want to design this app in a way that seems ethically impeccable.
00:29:55.880If the thing you're directing people to is something that you think is intrinsically good, and forget about all the competition for mindshare that exists that you spoke about, it's just hard to do anyway.
00:30:09.000I mean, people are reluctant to do it.
00:30:10.500That's why I think an app would be valuable, and I think the existing apps are valuable.
00:30:15.620So if you think that any time on app is time well spent, which I don't think Facebook can say, I don't think Twitter can say, but I think Headspace can say that.
00:30:27.860Whether or not that's true, someone else can decide.
00:30:31.320But I think without any sense of personal hypocrisy, I think they feel that if you're using their app more, that's good for you, right?
00:30:38.700Because they think that it's intrinsically good to meditate.
00:30:41.220And I'm sure any exercise app, you know, or the health app or whatever it is, I'm sure that they all feel the same way about that.
00:30:48.760Take that case, and then let's move on to a case where everyone's motives are more mercenary, and where time on the app means more money for the company, which isn't necessarily the case for some other apps.
00:31:02.260When time on the app is intrinsically good, why not try to get people's attention any way you can?
00:31:09.960Right. Well, so this is where the question of metrics is really important, because in an ideal world, the thing that each app would be measuring would align with the thing that each person using the app actually wants.
00:31:23.200So time well spent would mean, in the case of a meditation app, asking the user. I mean, I'm not saying the app would do this, but if you were to think about it, a user would say, okay, in my life, what would be time well spent for me in the morning, waking up?
00:31:36.120And then imagine that whatever the answer to that question is should determine the rankings in the app store, rewarding the apps that are best at that.
00:31:44.780So that again is more the systemic answer: that systems like the app stores, and the ranking functions that run, say, Google search or the Facebook newsfeed, would want to sort things by what helps people the most, not what's got the most time.
00:32:02.220And the measure of that would be the evaluation of the user. I mean, there's some questionnaire-based rating. Is this working for you?
00:32:11.580Yeah. And in fact, we've done some initial work with this. There's an app called Moment on iOS. Moment tracks how much time you spend in different apps; you send it a screenshot of your battery page on the iPhone, and it captures all that data. They partnered with Time Well Spent to ask people: which apps do you find are most time well spent, which are you most happy about the time you spent in, when you can finally see all the time you spent? And which apps do you most regret?
00:32:39.820And we have the data back that people regret the time that they spend in Facebook, Instagram, Snapchat, and WeChat the most. And so far, the current rankings for most time well spent are things like MyFitnessPal and Podcasts, and there's a bunch of other ones that I forget.
00:32:58.080The irony is that being ranked first in regret is probably as accurate a measure as any of the success of your app.
00:33:08.060Yeah, exactly. And this is why the economy isn't ranking things or aligning things with what we actually want. I mean, everything is a choice architecture, and you're sitting there as a human being picking from a menu, and currently the menu sorts things by what gets the most downloads, the most sales, what gets most talked about, the things that most manipulate your mind.
00:33:29.900And so the whole economy has become this, if you assume marketing is as persuasive as it is on a bigger level, the economy reflects what's best at manipulating people's psychology, not what's actually best in terms of delivered benefits in people's lives.
00:33:42.320And so if you think about this as a deeper systemic question, about how you would want the economy to work, you'd want it to rank things so that the easiest thing to reach for would be the thing that people found to be most time well spent in their lives, for whatever category of life choice they're making at that moment.
00:34:00.760It's all in terms of making choices easier or harder, because you can't escape it: in every single moment there is a menu, and some choices are easy to make, and some choices are hard to make.
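The re-ranking idea Tristan describes can be sketched in a few lines: sort the menu not by raw engagement but by users' own retrospective "was this time well spent?" ratings. The app names and numbers below are hypothetical, in the spirit of the Moment survey mentioned earlier.

```python
# Hypothetical per-app data: average daily minutes, and users' average
# answer to "was this time well spent?" on a 1-5 scale.
apps = [
    {"name": "SocialFeedApp", "minutes_per_day": 95, "well_spent": 1.8},
    {"name": "MyFitnessPal",  "minutes_per_day": 12, "well_spent": 4.4},
    {"name": "Podcasts",      "minutes_per_day": 40, "well_spent": 4.1},
]

# What the attention economy rewards today: raw time captured.
by_engagement = sorted(apps, key=lambda a: a["minutes_per_day"], reverse=True)

# What a "time well spent" economy would reward: users' reflective ratings.
by_well_spent = sorted(apps, key=lambda a: a["well_spent"], reverse=True)

print([a["name"] for a in by_engagement])
print([a["name"] for a in by_well_spent])
```

Same data, same apps; only the sort key changes, which is the systemic lever Tristan is pointing at.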
00:34:09.420It seems to me you run into a problem which behavioral economists know quite well, and this is something that Danny Kahneman has spoken a lot about, that there's a difference between the experiencing self moment-to-moment and the remembered self.
00:34:23.220So when you're giving someone a questionnaire, asking them whether their time on all these apps and websites was well spent, you are talking to the remembered self.
00:34:33.980And Danny and I once argued about this, how to reconcile these two different testimonies, but
00:34:39.420at minimum you can say that they're reliably different, so that if you were experience-sampling people along the way, you know, for every 100 minutes on Facebook, every 10 minutes asking, how happy are you right now, you would get one measure.
00:34:54.680If at the end of the day you ask them, how good a use of your time was that to be on Facebook for 100 minutes, you would get a different measure.
00:35:01.760Sometimes they're the same, but they're very often different, and the question is who to trust.
00:35:07.740Where are the data that you're going to use to assess whether people are spending their time well?
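The experiencing-self versus remembered-self gap Sam describes can be made concrete with a toy comparison: sample in-the-moment happiness every 10 minutes during a 100-minute session, then compare it to a single end-of-day rating. All numbers here are hypothetical.

```python
import statistics

# Experiencing self: hypothetical happiness ratings (1-10) sampled every
# 10 minutes across a 100-minute Facebook session.
in_the_moment = [7, 8, 7, 6, 7, 8, 7, 6, 7, 7]

# Remembered self: one hypothetical end-of-day answer to
# "how good a use of your time was that?"
end_of_day = 3

print(f"Experiencing self (average): {statistics.mean(in_the_moment):.1f}/10")
print(f"Remembered self:             {end_of_day}/10")
# The two measures reliably diverge; which one a metric listens to
# determines whose testimony the product ends up optimizing for.
```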
00:35:13.300Well, I mean, the problem right now is that all of the metrics just relate to the current present self version, right?
00:35:20.720Everything is only measuring what gets most clicked or what gets most shared.
00:35:25.460So back to fake news, just because something is shared the most doesn't mean it's the most true.
00:35:31.660Just because something gets clicked the most doesn't mean it's the best.
00:35:34.680Just because something is talked about the most doesn't mean that it's real or true, right?
00:35:39.420The second that Facebook took away its human editorial team for Facebook Trends (they fired that whole team, so it's just an AI picking what the most popular news stories are),
00:35:51.540within 24 hours it was gamed, and the top story was a false story about Megyn Kelly and Fox News.
00:35:56.520And so right now, getting into AI on all of these topics: AIs essentially have a pair of eyes, or sensors, that are trying to pick from these impulsive or immediate signals, and they don't have a way of being in the loop or in conversation with our more reflective selves.
00:36:13.060They can only talk to our present-in-the-moment selves.
00:36:16.020And so you can imagine some kind of weird dystopian future where the entire world is only listening to your present-in-the-moment feelings and thoughts, which are easily gameable by persuasion.
00:36:25.700Although it just is a question of how to reconcile the difference between being pleasantly engaged moment by moment in an activity, and saying at the end of it, I kind of regret spending my time that way.
00:36:42.380There are certain things that are captivating where you're hooked for a reason, right?
00:36:46.940You know, whether it's a video game or whether you're eating french fries or popcorn or something that is just perfectly salted so that you just can't stop, you're binging on something because in that moment it feels good, and then retrospectively, very often you regret that use of time.
00:37:05.920Well, so one frame of this is this sort of shallow versus deep sense.
00:37:09.380That's what you're getting at here: a sense that something can either be full but empty, which we don't really have words for in the English language, or something can be full and fulfilling.
00:37:18.200Things can be very engaging or pleasurable but not fulfilling.
00:37:24.700Yes, and even more specifically regretted.
00:37:27.000And then there's the set of choices that you can make for a timeline if you're, again, scheduling someone else's life for them, as people at Google and Facebook do every day,
00:37:34.160you know, where you can schedule a choice that is full and fulfilling.
00:37:38.820Now, does that mean that we should never put choices on the menu that are full but you regret?
00:37:43.920Like, should we never do that for Google or for Facebook?
00:37:46.960That's one frame, but let me actually flip it around and make it, I think, even more philosophically interesting.
00:37:50.980Let's say that in the future, YouTube is even better at knowing exactly what, in every bone in your body, you've been meaning to watch,
00:37:58.680like the professor or lecture that you've been told is the best lecture in the world,
00:38:03.120or just think about what every bone in your body tells you, in fact, would be full and fulfilling for you.
00:38:07.900And let's imagine this future deep mind-powered version of YouTube is actually putting those perfect choices next on the menu.
00:38:15.660So now it's autoplaying the perfect next thing that is also full and fulfilling.
00:38:21.260There's still something about the way the screen is steering your choices that are not about being in alignment with the life you want to live,
00:38:30.160because it's not in alignment with the time dimension now.
00:38:33.280So now it's sort of blowing open or blowing past boundaries.
00:38:36.500You have to bring your own boundaries.
00:41:24.000Nobody thinks they want that, but we're all acting like that's exactly what we want.
00:41:28.200Well, and I think this is where the language gets interesting, because when we talk about what we want, we talk about what we click.
00:41:33.980But in the moment right before you click, I mean, I'm kind of a meditator, too.
00:41:37.880It's like I notice what's going on for me right before I click, and, as you know from your work on free will, how much is that a conscious choice?
00:41:44.780What's really going on phenomenologically in that moment right before the click?
00:41:48.500None of your conscious choices are conscious choices.
00:41:53.120You're the last to know why you're doing the thing you're about to do, and you're very often misinformed about it.
00:41:59.600We can set up experiments where you'll reliably do the thing for reasons that you, when you're forced to articulate them, are completely wrong about.
00:42:07.280Absolutely. And moreover, people, again, when they're about to click on something, don't realize there are a thousand people on the other side of the screen whose job it was to get them to click on that, because that's what Facebook and Snapchat and YouTube are all for.
00:42:23.520But do you think that fact alone would change people's behavior if you could make that transparent?
00:42:29.980It just seems it would be instructive for most people to see the full stream of causes that engineered that moment for them.
00:42:38.060Well, some friends of mine in San Francisco were talking about this: people don't realize it, especially when you start applying some kind of normativity and saying, you know, the newsfeed's really not good.
00:43:01.740If you were to give it a name, if Google has page rank, Facebook is engagement rank.
00:43:05.780Now, let's say, let's take it all the way to the end.
00:43:08.500Let's say you could switch modes as a user, and you could actually switch Facebook to addiction rank.
00:43:14.400Facebook actually has a version of the newsfeed that I'm sure it could deploy: you know, let's just actually tweak the variables
00:43:21.900to show people the things that will addict them the most.
00:43:24.600Or we have outrage rank, which will show you the things that will outrage you the most.
00:43:27.920Or we have NPR rank, which actually shows you the most boring, long comment threads, where you have, like, these long, in-depth conversations.
00:43:36.860So that your whole newsfeed is these long, deep, threaded conversations.
00:43:40.440Or you could have the Bill O'Reilly mode, where you get, as something I know you care about, these sort of attack-dog-style comment threads where people are yelling at each other.
00:43:48.000You can imagine that the newsfeed could be ranked in any one of these ways.
00:43:51.020Actually, this form of choice is already implemented on Flickr, where, when you look for images, you can choose "relevant" or "interesting."
00:44:00.020So you could have that same drop-down menu for any of these other media.
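The switchable-ranking idea, whether engagement rank, outrage rank, NPR rank, or Flickr's relevant/interesting drop-down, amounts to swapping the scoring function over the same content. Here is a minimal sketch with hypothetical post fields and scores, not Facebook's actual signals.

```python
# Hypothetical posts with made-up signal scores.
posts = [
    {"text": "Long, in-depth comment thread", "clicks": 10, "outrage": 0.1, "depth": 0.9},
    {"text": "Inflammatory hot take",         "clicks": 90, "outrage": 0.9, "depth": 0.1},
]

# Each mode is just a different key function over the same posts.
RANKING_MODES = {
    "engagement": lambda p: p["clicks"],   # today's default
    "outrage":    lambda p: p["outrage"],  # what will outrage you the most
    "npr":        lambda p: p["depth"],    # long, deep, threaded conversations
}

def rank_feed(posts, mode):
    return sorted(posts, key=RANKING_MODES[mode], reverse=True)

for mode in RANKING_MODES:
    print(mode, "->", [p["text"] for p in rank_feed(posts, mode)])
```

The content never changes; the "compass" does, which is exactly why the default mode the designer picks matters so much.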
00:44:04.980And this is your point, that people don't see transparently what the goals are of the designers who put that choice in front of them.
00:44:17.220It's not just something for you to use.
00:44:18.840You can, obviously, with enough effort, use Facebook for all sorts of things.
00:44:23.200But the point is, the default compass, or North Star, on the GPS that is Facebook is not steering your life towards,
00:44:32.020hey, help me have the dinner party, you know, that I want to have.
00:44:34.220Or help me get together with my friends on Tuesday.
00:44:37.500Or help me make sure I'm not feeling lonely on a Tuesday night.
00:44:41.060But there is, it seems to me, a necessary kind of paternalism here that we just have to accept.
00:44:49.660Because it seems true that we're living in a world where no one or virtually no one would consciously choose the outrage tab.
00:45:28.500If you'd like to continue listening to this conversation, you'll need to subscribe at SamHarris.org.
00:45:34.800Once you do, you'll get access to all full-length episodes of the Making Sense podcast, along with other subscriber-only content, including bonus episodes and AMAs and the conversations I've been having on the Waking Up app.
00:45:46.180The Making Sense podcast is ad-free and relies entirely on listener support.
00:45:51.600And you can subscribe now at SamHarris.org.