The Joe Rogan Experience - October 30, 2020


Joe Rogan Experience #1558 - Tristan Harris


Episode Stats

Length

2 hours and 21 minutes

Words per Minute

182.04

Word Count

25,719

Sentence Count

1,652

Misogynist Sentences

11

Hate Speech Sentences

21


Summary

In this episode of the Joe Rogan Experience podcast, Joe talks to Tristan Harris, a former Google design ethicist and co-founder of the Center for Humane Technology, about the Netflix documentary The Social Dilemma, which was seen by 38 million households in its first 28 days. Tristan explains how he got into technology ethics: his childhood as a magician, the Stanford Persuasive Technology class, his study of behavioral economics and cults, and his time at Google, where he argued the company holds the collective psyche in its hands. Joe and Tristan discuss why Google removed the "Don't be evil" clause from its code of conduct in 2018, how the attention economy and engagement-driven feeds work, how recommendation algorithms steer people toward extreme and conspiratorial content, the teen mental health crisis, negativity bias and online pile-ons, Apple's privacy protections, and the antitrust case filed against Google.


Transcript

00:00:03.000 The Joe Rogan Experience.
00:00:06.000 Train by day, Joe Rogan Podcast by night, all day.
00:00:12.000 Tristan, how are you?
00:00:13.000 Good.
00:00:13.000 Good to be here.
00:00:14.000 Good to have you here, man.
00:00:16.000 You were just telling me before we went on air the numbers of The Social Dilemma, and they're bonkers.
00:00:22.000 So just say that.
00:00:23.000 Yeah.
00:00:24.000 The Social Dilemma was seen by 38 million households in the first 28 days on Netflix, which I think has broken records.
00:00:33.000 And if you assume, you know, a lot of people are seeing it with their family, because parents are seeing it with their kids, given the issues that are around teen mental health. So if you assume one out of ten families saw it with a few family members, we're in the 40 to 50 million people range, which has just broken records, I think, for Netflix.
00:00:48.000 I think it was the second most popular documentary throughout the month of September, or film throughout the month of September.
00:00:54.000 It was a really well-done documentary, but I think it's one of those documentaries that affirmed a lot of people's worst suspicions about the dangers of social media, and then, on top of that, it sort of alerted them to what they were already experiencing in their own personal life and,
00:01:13.000 like, highlighted it.
00:01:14.000 Yeah, I think that's right.
00:01:15.000 I mean, most people were aware—I think it's a thing everyone's been feeling—that the feeling you have when you use social media isn't that this thing is just a tool or it's on my side.
00:01:26.000 It's an environment based on manipulation, as we say in the film.
00:01:29.000 And that's really what's changed.
00:01:31.000 I remember, you know, I've been working on these issues for something like eight years or something now.
00:01:38.000 Can you please tell people who didn't see the documentary what your background is and how you got into it?
00:01:43.000 Yeah, so I... The film goes back to a set of technology insiders.
00:01:50.000 My background was as a design ethicist at Google.
00:01:53.000 So I first had a startup company that we sold to Google, and I landed there through a talent acquisition.
00:01:59.000 And then, about a year into being at Google, I made a presentation that was about how, essentially, technology was holding the human collective psyche in its hands, that we were really controlling the world's psychology.
00:02:15.000 Because every single time people look at their phone, they are basically experiencing thoughts and scrolling through feeds and believing things about the world.
00:02:22.000 This has become the primary meaning-making machine for the world.
00:02:25.000 And that we as Google had a moral responsibility to, you know, hold the collective psyche in a thoughtful, ethical way and not create this sort of race to the bottom of the brainstem attention economy that we now have.
00:02:40.000 So my background was as a kid.
00:02:42.000 I was a magician.
00:02:43.000 We can get into that.
00:02:45.000 I studied at a class called the Stanford Persuasive Technology class that taught a lot of the engineers in Silicon Valley kind of how the mind works.
00:02:55.000 Some of the co-founders of Instagram were there.
00:02:58.000 And then later studied behavioral economics and how the mind is sort of influenced.
00:03:03.000 I went into cults and started studying how cults work and then arrived at Google through this lens of, you know, technology isn't really just this thing that's in our hands.
00:03:10.000 It's more like this manipulative environment that is tapping into our weaknesses.
00:03:15.000 Everything from the slot machine rewards to the way you get tagged in a photo and it sort of manipulates your social validation and approval, these kinds of things.
00:03:24.000 When you were at Google, did they still have the don't be evil sign up?
00:03:29.000 I don't know if there's actually a physical sign, was there?
00:03:31.000 There was never a physical sign?
00:03:32.000 I thought there was something that they actually had.
00:03:34.000 I think it was, there was this guy, was it Paul, not Paul, what was his last name?
00:03:39.000 He was the inventor, one of the inventors of Gmail, and they had a meeting, and they came up with this mantra.
00:03:43.000 Because they realized the power that they had, and they realized that there was going to be a conflict of interest between advertising on the search results and regular search results.
00:03:51.000 And so we know that, they know that they could abuse that power, and they came up with this mantra, I think, in that meeting in the early days, to don't be, don't be evil.
00:03:59.000 There was a time where they took that mantra down, and I remember reading about it online.
00:04:04.000 They took it off their page, I think.
00:04:06.000 That's what it was.
00:04:07.000 Yeah.
00:04:07.000 And when I read that, I was like, that should be big news.
00:04:11.000 Like, there's no reason to take that down.
00:04:14.000 Why would you take that down?
00:04:16.000 Yeah.
00:04:17.000 Why would you say, well, maybe it could be a little evil.
00:04:20.000 Let's not get crazy.
00:04:21.000 It's a good question.
00:04:22.000 I mean, I wonder what logic would have you remove a statement like that.
00:04:26.000 That seems like a standard statement.
00:04:28.000 Like, it's a great statement.
00:04:29.000 Okay, here it is.
00:04:29.000 Google removes don't be evil clause from its code of conduct.
00:04:32.000 In 2018?
00:04:34.000 Yeah.
00:04:34.000 Yeah.
00:04:35.000 I wonder why.
00:04:36.000 Did they have an explanation?
00:04:38.000 Did it say anything?
00:04:42.000 Don't Be Evil has been a part of the company's corporate code of conduct since 2000. When Google was reorganized under a new parent company, Alphabet, in 2015,
00:04:50.000 Alphabet assumed a slightly adjusted version of the motto.
00:04:53.000 Do the right thing.
00:04:54.000 Do the right thing.
00:04:55.000 Oh, that's a Spike Lee movie, bitch.
00:04:57.000 However, Google retained its original Don't Be Evil language until the past several weeks.
00:05:02.000 The phrase has been deeply incorporated into Google's company culture.
00:05:12.000 I think I remember that, yeah.
00:05:19.000 Wow.
00:05:21.000 I wonder why they decided...
00:05:24.000 Well, I mean, they did change it to do the right thing.
00:05:27.000 I mean, we always used to say that, just to friends, not within Google, but just, you know, instead of saying, don't be evil, just say, let's do some good here, right?
00:05:34.000 That's nice.
00:05:35.000 Let's do some good here.
00:05:36.000 Yeah, think positive.
00:05:38.000 Think doing good instead of don't do bad.
00:05:42.000 Yeah, but the problem is when you say do good, the question is who's good, because you live in a morally plural society, and there's this question of who are you to say what's good for people, and it's much easier to say let's reduce harms than it is to say let's actually do good like this.
00:05:54.000 It says, the updated version of Google's Code of Conduct still retains one reference to the company's unofficial motto.
00:06:00.000 The final line of the document is still, and remember, dot, dot, dot, don't be evil, and if you see something that you think isn't right, speak up.
00:06:11.000 Okay.
00:06:13.000 Well, they still have Don't Be Evil, so maybe it's much ado about nothing.
00:06:17.000 But having that kind of power, we were just before the podcast, we were watching Jack Dorsey speak to members of the Senate in regards to Twitter censoring the Hunter Biden story and censorship of conservatives, but allowing dictators to spread propaganda,
00:06:34.000 dictators from other countries, and why and what this is all about.
00:06:39.000 One of the things that Jack Dorsey has been pretty adamant about is that they really never saw this coming when they started Twitter.
00:06:47.000 And they didn't think that they were ever going to be in this position where they were going to be really the arbiters of free speech for the world.
00:06:56.000 Right.
00:06:56.000 Which is essentially in some ways what they are.
00:06:59.000 I think it's important to roll back the clock for people because it's easy to think...
00:07:03.000 You know, that we just sort of landed here and that they would know that they're going to be influencing the global psychology.
00:07:08.000 But I think we should really reverse engineer for the audience.
00:07:13.000 How did these products work the way that they did?
00:07:15.000 So, like, let's go back to the beginning days of Twitter.
00:07:17.000 I think his first tweet was something like, Yeah.
00:07:36.000 And the real genius of these things was that they weren't just offering this thing you could do, they found ways of keeping people engaged.
00:07:46.000 I think this is important for people to get, that they're not competing for your data or for money, they're competing to keep people using the product.
00:07:56.000 And so when Twitter, for example, invented this persuasive feature of the number of followers that you have.
00:08:02.000 If you remember, like, that was a new thing at the time, right?
00:08:04.000 You log in and you see your profile.
00:08:07.000 Here's the people who you can follow.
00:08:08.000 And then here's the number of followers you have.
00:08:10.000 That created a reason for you to come back every day to see how many followers do I have.
00:08:14.000 So that was part of this race to keep people engaged, as we talk about in the film, like these things are competing for your attention, that if you're not paying for the product, you are the product, but the thing that is the product is your predictable behavior.
00:08:27.000 You're using the product in predictable ways.
00:08:30.000 And I remember a conversation I had with...
00:08:33.000 Someone at Facebook, who's a friend of mine, who said in a coffee shop one day, people think that we, Facebook, are competing with something like Twitter, that one social network is competing with another social network.
00:08:46.000 But really, he said, our biggest competitor is YouTube, because they're not competing for social networks, they're competing for attention.
00:08:53.000 And YouTube is the biggest competitor in the digital space for attention.
00:08:57.000 And that was a real lightbulb moment for me, because you realize that as they're designing these products, they're finding new clever ways to get your attention.
00:09:06.000 That's the real thing that I think is different in the film The Social Dilemma, rather than talking about, you know, censorship and data and privacy and these themes.
00:09:14.000 It's really what is the core influence or impact that the shape of these products have on how we're making meaning of the world when they're steering our psychology.
00:09:23.000 Do you think that it was inevitable that someone manipulates the way people use these things to gather more attention?
00:09:30.000 And do you think that any of this could have been avoided if there was laws against that?
00:09:36.000 If instead of having these algorithms That specifically target things that you're interested in or things that you click on or things that are going to make you engage more.
00:09:46.000 If someone said, listen, you can have these things, you can allow people to communicate with each other, but you can't manipulate their attention span.
00:09:59.000 So we've always had an attention economy, right?
00:10:01.000 And you're competing for it right now.
00:10:03.000 And politicians compete for it.
00:10:04.000 Can you vote for someone you've never paid attention to, never heard about, never heard them say something outrageous?
00:10:09.000 No.
00:10:11.000 So there's always been an attention economy.
00:10:13.000 And so it's hard to say we should regulate who gets attention or how.
00:10:17.000 But it's organic in some ways.
00:10:20.000 Right.
00:10:20.000 Like, this podcast is an organic, I mean, if we're in competition, it's organic.
00:10:25.000 I just put it out there, and if you watch it, you don't, or you don't, I don't, you know, I don't have any say over it, and I'm not manipulating it in any way.
00:10:33.000 Sort of.
00:10:34.000 So, I mean, let's imagine that the podcast apps were different, and they actually, while you're watching, they had, like, the hearts and the stars and the kind of voting up in numbers, and you could, like, send messages back and forth, and Apple Podcasts worked in a way that didn't just reward, you know, the things that you clicked follow on.
00:10:50.000 It actually sort of promoted the stuff that someone said the most outrageous thing.
00:10:55.000 Then you as a podcast creator have an incentive to say the most outrageous thing and then you arrive at the top of the Apple Podcasts or Spotify app.
00:11:03.000 And that's the thing, is that we actually are competing for attention.
00:11:06.000 It felt like it was neutral, and it was relatively neutral.
00:11:10.000 And to progress that story back in time with Twitter competing for attention, let's look at some other things that they did.
00:11:15.000 So they also added this retweet, this instant resharing feature, right?
00:11:19.000 And that made it more addictive, because suddenly we're all playing the fame lottery, right?
00:11:23.000 Like, I could retweet your stuff, and then you get a bunch of hits, and then you could go viral, and you could get a lot of attention.
00:11:28.000 So then instead of the companies competing for attention, now each of us suddenly win the fame lottery over and over and over again, and we're getting attention.
00:11:36.000 And then...
00:11:37.000 Oh, I had another example I was going to think about.
00:11:39.000 I forgot it.
00:11:40.000 What was it?
00:11:42.000 You can jump in if you want.
00:11:44.000 Apple has an interesting way of handling sort of the way they have their algorithm for their podcast app.
00:11:54.000 It's secret.
00:11:55.000 It's weird.
00:11:57.000 But one of the things that it favors is it favors new shows and it favors engagement and new subscribers.
00:12:05.000 So comments, engagement, and new shows.
00:12:09.000 There you go.
00:12:09.000 And that's the same as competing for attention because engagement must mean people like it.
00:12:14.000 And there's going to be a fallacy as we go down that road, but go on.
00:12:17.000 Well, it's interesting because you could say if you have a podcast and your podcast gets like, let's say, 100,000 downloads, a new podcast can come along and it can get 10,000 downloads and it'll be ahead of you in the rankings.
00:12:30.000 And so you could be number three and it could be number two and you're like, well, how is that number two?
00:12:35.000 And it's got 10 times less, but they don't do it that way.
00:12:40.000 And their logic is they don't want the podcast world to be dominated by, you know, New York Times.
00:12:46.000 The big one.
00:12:46.000 Yeah, and whatever's number one and number two and number three forever.
00:12:49.000 We actually just experienced this.
00:12:52.000 We have a podcast called Your Undivided Attention, and since the film came out in that first month, we went from being in the lower 100 or something like that to we shot to the top five.
00:13:01.000 I think we were the number one tech podcast for a while, and so we just experienced this through the fact not that we had the most listeners, but because the trend was so rapid that we sort of jumped to the top.
00:13:11.000 I think it's wise that they do that because eventually it evens out over time.
00:13:17.000 You see some people rock it to the top like, oh my god, we're number three.
00:13:21.000 And you're like, hang on there, fella.
00:13:22.000 Just give it a couple of weeks.
00:13:24.000 And then three weeks later, four weeks later, now they're number 48. They get depressed.
00:13:28.000 That was really where you should have been.
00:13:31.000 But the thing that Apple does that I really like in that is it gives an opportunity for these new shows to be seen.
00:13:39.000 Where they might have gotten just stuck because these rankings and the ratings for a lot of these shows are so consistent and they have such a following already.
00:13:49.000 It's very difficult for these new shows to gather attention.
00:13:52.000 And the problem was that there were some people that gamed the system.
00:13:58.000 And there was companies that could literally move.
00:14:00.000 Like Earl Skakel.
00:14:02.000 Remember Earl became the number one podcast?
00:14:04.000 And no one was listening to it?
00:14:07.000 Earl has money.
00:14:08.000 And he hired some people to game the system.
00:14:12.000 And he was kind of open about it.
00:14:14.000 And laughing about it.
00:14:16.000 Isn't he banned from iTunes now?
00:14:18.000 Or something?
00:14:19.000 I think he got banned because of that.
00:14:22.000 Because it was so obvious he gamed the system.
00:14:24.000 He had like a thousand downloads and he was number one.
00:14:26.000 I mean, the thing is that we're...
00:14:29.000 Apple Podcasts you can think of as like the Federal Reserve or the government of the attention economy because they're setting the rules by which you win, right?
00:14:36.000 They could have set the rules, as you said, to be, you know, who has the most listeners and then you just keep rewarding the kings that already exist versus who is the most trending.
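A minimal sketch of the two ranking rules being contrasted here, for illustration only; the show names, numbers, and the growth-rate formula are invented, since Apple's actual algorithm is not public:

    # Two toy podcast-ranking rules: reward total audience vs. reward recent growth.
    shows = [
        {"name": "Big Incumbent", "total_downloads": 100_000, "downloads_this_week": 5_000},
        {"name": "New Show",      "total_downloads": 10_000,  "downloads_this_week": 8_000},
    ]

    def rank_by_total(show):
        # "Reward the kings that already exist": raw audience size wins.
        return show["total_downloads"]

    def rank_by_trend(show):
        # Reward momentum: what share of a show's lifetime downloads happened this week?
        return show["downloads_this_week"] / show["total_downloads"]

    print([s["name"] for s in sorted(shows, key=rank_by_total, reverse=True)])
    # ['Big Incumbent', 'New Show']
    print([s["name"] for s in sorted(shows, key=rank_by_trend, reverse=True)])
    # ['New Show', 'Big Incumbent']  <- the smaller but fast-growing show ranks first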
00:14:45.000 There's actually a story...
00:14:46.000 A friend of mine told me, I don't know if it's true, although it was a fairly credible source, who said there was a meeting with Steve Jobs when they were making the first podcast app, and that they had made a demo of something where you could see all the things your friends were listening to.
00:15:03.000 So just like making a news feed like we do with Facebook and Twitter, right?
00:15:07.000 And then what he said was, well, why would we do that?
00:15:09.000 If something is important enough, your friend will actually just send you a link and say you should listen to this.
00:15:16.000 Like, why would we automatically just promote random things that your friends are listening to?
00:15:21.000 And again, this is kind of how you get back to social media.
00:15:24.000 How is social media so successful?
00:15:25.000 Because it's much more addictive to see what your friends are doing in a feed, but it doesn't reward what's true or what's meaningful.
00:15:31.000 And this is the thing that people need to get about social media is it's really just rewarding the things that tend to keep people back addictively.
00:15:39.000 The business model is addiction in this race to the bottom of the brainstem for attention.
00:15:42.000 Well, it seems like if we, in hindsight, if hindsight is 20-20, what should have been done or what could have been done had we known where this would...
00:16:00.000 Right.
00:16:13.000 And then the problem is, so this is the thing I was going to say about Twitter, is when one company does the, call it the engagement feed, meaning showing you the things that the most people are clicking on and retweeting, trending, things like that.
00:16:28.000 Let's imagine there's two feeds.
00:16:29.000 So there's the feed that's called the reverse chronological feed, meaning showing in order in time, you know, Joe Rogan posted this two hours ago, and after that, you have the thing that people posted an hour and a half ago, all the way up to 10 seconds ago.
00:16:41.000 That's the reverse chronological.
00:16:43.000 They have a mode like that on Twitter.
00:16:45.000 If you click the sparkle icon, I don't know if you know this, it'll show you just in time, here's what people said, you know, sorted by recency.
00:16:52.000 But then they have this other feed called what people click on, retweet, etc., the most, the people you follow.
00:16:57.000 And it sorts it by what it thinks you'll click on and want the most.
00:17:00.000 Which one of those is more successful at getting your attention?
00:17:03.000 The sort of recency, what they posted recently, versus what they know people are clicking and retweeting on the most.
00:17:10.000 Certainly what they know people are clicking on and retweeting the most.
00:17:13.000 Correct.
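A minimal sketch of the difference between the two feeds described here; the posts and the engagement scores are made up, and a real system would learn those scores from past behavior rather than hard-code them:

    # Toy contrast: reverse-chronological feed vs. engagement-ranked feed.
    posts = [
        {"text": "calm update",         "age_hours": 0.1, "predicted_engagement": 0.02},
        {"text": "friend's photo",      "age_hours": 2.0, "predicted_engagement": 0.10},
        {"text": "outrageous hot take", "age_hours": 9.0, "predicted_engagement": 0.55},
    ]

    # Reverse-chronological (the "sparkle icon" mode): newest first, no model involved.
    chronological = sorted(posts, key=lambda p: p["age_hours"])

    # Engagement-ranked: whatever the model predicts you are most likely to click or retweet.
    engagement_ranked = sorted(posts, key=lambda p: p["predicted_engagement"], reverse=True)

    print([p["text"] for p in chronological])      # calm update first
    print([p["text"] for p in engagement_ranked])  # outrageous hot take first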
00:17:13.000 And so once Twitter does that, let's say Facebook was sitting there with the recency feed, like just showing you here's the people who posted in this time order sequence.
00:17:22.000 They have to also switch to who is like the most relevant stuff, right?
00:17:27.000 The most clicked, retweeted the most.
00:17:29.000 So this is part of this race for attention, that once one actor does something like that, and they algorithmically...
00:18:01.000 And it becomes, again, this game-theoretic race of who's going to do more.
00:18:04.000 Now, if you open up TikTok, TikTok doesn't even wait—I don't know if you know or your kids use TikTok—but when you open up the app, it doesn't even wait for you to click on something.
00:18:13.000 It just actually plays the first video the second you open it, which none of the other apps do, right?
00:18:17.000 And the point of that is that causes you to enter into this engagement stream even faster.
00:18:22.000 So again, this race for attention produces things that are not good for society.
00:18:27.000 And even if you took the whack-a-mole sticker, you took the antitrust case, and you whack Facebook, and you got rid of Facebook, or you whack Google, or you whack YouTube...
00:18:34.000 You're just going to have more actors flooding in doing the same thing.
00:18:38.000 And one other example of this is the time it takes to reach, let's say, 10 million followers.
00:18:46.000 So if you remember back in the—wasn't it Ashton Kutcher who raced for the first million followers?
00:18:50.000 Race with CNN. Race with CNN, right?
00:18:51.000 Yeah.
00:18:52.000 So now, if you think of it, the companies are competing for our attention.
00:18:56.000 If they find out that each of us becoming a celebrity and having a million people we get to reach, if that's the currency of the thing that gets us to come back to get more attention, then they're competing at who can give us that bigger fame lottery hit faster.
00:19:10.000 So let's say 2009 or 2010 when Ashton Kutcher did that.
00:19:14.000 It took him, I don't know how long it took, months for him to get a million?
00:19:18.000 No, I don't remember.
00:19:18.000 It was a little bit though, right?
00:19:20.000 And then TikTok comes along and says, hey, we want to give kids the ability to hit the fame lottery and make it big, hit the jackpot even faster.
00:19:28.000 We want you to be able to go from zero to a million followers in 10 days, right?
00:19:32.000 And so they're competing to make that shorter and shorter and shorter.
00:19:35.000 And I know about this because, you know, speaking from a Silicon Valley perspective, Venture capitalists fund these new social platforms based on how fast they can get to like 100 million users.
00:19:46.000 There was this famous line that like, I forgot what it was, but I think Facebook took like 10 years to get to 100 million users.
00:19:52.000 Instagram took, you know, I don't know, four years, three years or something like that.
00:19:56.000 TikTok can get there even faster.
00:19:57.000 And so it's shortening, shortening, shortening.
00:19:59.000 And that's what people are, that's what we're competing for.
00:20:02.000 It's like who can win the fame lottery faster.
00:20:04.000 But is a world where everyone broadcasts to millions of people...
00:20:27.000 More conspiracy-oriented views of the world, QAnon, Facebook groups, things like that.
00:20:32.000 And we can definitely go into it.
00:20:34.000 There's a lot of legitimate conspiracy theories, so I want to make sure I'm not categorically dismissing stuff.
00:20:39.000 But that's really the point, is that we have landed in a world where the things that we are paying attention to are not necessarily the agenda of topics that we would say, in a reflective world, what we would say is the most important.
00:20:52.000 So there's a lot of conversation about free will and about letting people choose whatever they enjoy viewing and watching and paying attention to.
00:21:07.000 But when you're talking about these incredibly potent algorithms and the incredibly potent addictions that people...
00:21:19.000 The people develop to these things, and we're pretending that people should have the ability to just ignore it and put it away.
00:21:26.000 Right.
00:21:26.000 Use your willpower, Jim.
00:21:27.000 Yeah.
00:21:28.000 That seems...
00:21:29.000 Have your kids use your willpower.
00:21:30.000 I have a folder on my phone called Addict, and it's all caps, and it's at the end of my...
00:21:36.000 You have to scroll through all my other apps to get to it.
00:21:38.000 And so if I want to get to Twitter or Instagram, I've got to go there.
00:21:41.000 The problem is that the app switcher will put it in the most recent.
00:21:44.000 So once you switch apps and you have Twitter in a recent, it'll be right there.
00:21:47.000 So that's the problem.
00:21:48.000 If I want to go...
00:21:49.000 Left, and yeah, if I want to see that, yeah, you can do that.
00:21:52.000 Yeah, it's insanely addictive, and if you can control yourself, it's not that big a deal.
00:22:02.000 But how many people can control themselves?
00:22:04.000 Well, I think the thing we have to hone in on is the asymmetry of power.
00:22:10.000 As I say in the film, it's like...
00:22:11.000 We're bringing this ancient brain hardware, the prefrontal cortex, which is like what you use to do goal-directed action, self-control, willpower, holding back, you know, marshmallow test, don't get the marshmallow now, wait later for the two marshmallows later.
00:22:27.000 All of that is through our prefrontal cortex.
00:22:29.000 And when you're sitting there and you think, okay, I'm going to go watch, I'm going to look at this one thing on Facebook because my friend invited me to this event, or it's this one post I have to look at.
00:22:38.000 And the next thing you know, you find yourself scrolling through the thing for like an hour.
00:22:42.000 And you say, man, that was on me.
00:22:44.000 I should have had more self-control.
00:22:45.000 But there, behind the screen, behind that glass slab, is like a supercomputer pointed at your brain.
00:22:51.000 That is predicting the perfect thing to show you.
00:22:55.000 Next.
00:22:56.000 And you can feel it.
00:22:57.000 This is really important.
00:22:58.000 So if I'm Facebook, when you flick your finger, you think, when you're using Facebook, it's just going to show me the next thing that my friend said.
00:23:06.000 But it's not doing that.
00:23:07.000 When you flick your finger, it actually literally wakes up this sort of supercomputer avatar voodoo doll version of Joe.
00:23:13.000 And the voodoo doll of Joe is, you know, the more clicks you ever made on Facebook is like adding the little hair to the voodoo doll.
00:23:21.000 And the more likes you've ever made adds little clothing to the voodoo doll.
00:23:25.000 And the more, you know, watch time on videos you've ever had adds little, you know, shoes to the voodoo doll.
00:23:31.000 So the voodoo doll is getting more and more accurate the more things you click on.
00:23:34.000 This is in the film The Social Dilemma.
00:23:35.000 Like, if you notice, like, the character, you know, as he's using this thing...
00:23:40.000 It builds a more and more accurate model that the AIs, the three AIs behind the screen, are kind of manipulating.
00:23:45.000 And the idea is it can actually predict and prick the voodoo doll with this video or that post from your friends or this other thing, and it'll figure out the right thing to show you that it knows will keep you there, because it's already seen how that same video or that same post has kept 200 million other voodoo dolls there,
00:24:02.000 because you just look like another voodoo doll.
00:24:04.000 So here's an example.
00:24:05.000 And this works the same on all the platforms.
00:24:07.000 If you were a teen girl and you opened a dieting video on YouTube, 70% of YouTube's watch time comes from the recommendations on the right-hand side, right?
00:24:17.000 So the things that are showing recommended videos next.
00:24:20.000 And it will show you...
00:24:23.000 What did it show the girls who watched the teen dieting video?
00:24:27.000 It showed anorexia videos, because those were better at keeping the teen girls' attention.
00:24:32.000 Not because it said these are good for them, these are helpful for them.
00:24:36.000 It just says these tend to work at keeping their attention.
00:24:39.000 So again, these tend to work if you are already watching diet videos?
00:24:43.000 Yeah.
00:24:43.000 So if you're a 13-year-old girl and you watch a diet video, YouTube wakes up its voodoo doll version of that girl and says, hey, I've got like 100 million other voodoo dolls of 13-year-old girls, right?
00:24:53.000 And they all tend to watch these other videos.
00:24:55.000 I just know that they have this word thinspo.
00:24:58.000 Thinspiration is the name for it.
00:24:59.000 Really?
00:25:00.000 To be inspired for anorexia.
00:25:01.000 Yeah, it's a real thing.
00:25:02.000 YouTube addressed this problem a couple years ago.
00:25:04.000 But when you let the machine run blind, all it's doing is picking stuff that's engaging.
00:25:09.000 Right.
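A minimal sketch of the recommendation loop being described: pick whichever next video has historically kept users with a similar history watching the longest. The titles and watch times are invented; the point is only that the objective is engagement, with no notion of harm or truth:

    # Toy "what keeps this cohort watching" recommender.
    # observed: (video just watched) -> {candidate next video: avg minutes similar users kept watching}
    observed_watch_time = {
        "teen dieting video": {
            "healthy recipes": 3.0,
            "thinspo content": 11.0,        # keeps this cohort on the site longest
            "science of nutrition": 2.0,
        },
        "WWII documentary": {
            "more WWII history": 6.0,
            "Holocaust denial video": 9.5,
        },
    }

    def recommend_next(just_watched: str) -> str:
        candidates = observed_watch_time[just_watched]
        # Optimizes only for predicted watch time, nothing else.
        return max(candidates, key=candidates.get)

    print(recommend_next("teen dieting video"))   # -> "thinspo content"
    print(recommend_next("WWII documentary"))     # -> "Holocaust denial video"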
00:25:09.000 Why did they choose to not let the machine run blind with one thing, like anorexia?
00:25:15.000 Well, so now we're getting into the Twitter censorship conversation and the moderation conversation.
00:25:19.000 So this is why I don't focus on censorship and moderation, because the real issue is if you blur your eyes and zoom way out and say, how does the whole machine tend to operate?
00:25:28.000 Like, no matter what I start with, what is it going to recommend next?
00:25:32.000 So, you know, if you started with, you know, a World War II video, YouTube would recommend a bunch of Holocaust denial videos, right?
00:25:42.000 If you started teen girls with a dieting video, it would recommend these anorexia videos.
00:25:47.000 In Facebook's case, if you joined—there's so many different examples here because Facebook recommends groups to people based on what it thinks is most engaging for you.
00:25:56.000 So if you were a new mom—you had Renee DiResta, my friend, on this podcast— We've done a bunch of work together, and she has this great example of as a new mom, she joined one Facebook group for mothers who do do-it-yourself baby food, like organic baby food.
00:26:10.000 And then Facebook has this sidebar.
00:26:12.000 It says, here's some other groups you might recommend, you might want to join.
00:26:15.000 And what do you think was the most engaging of those?
00:26:17.000 Because Facebook, again, is picking on which group, if I got you to join it, would cause you to spend the most time here, right?
00:26:25.000 So for some do-it-yourself baby food groups, which group do you think it's selected?
00:26:29.000 Probably something about vaccines.
00:26:31.000 Exactly.
00:26:31.000 So anti-vaccines for moms.
00:26:32.000 Yeah.
00:26:34.000 Okay.
00:26:34.000 So then if you join that group, now it does the same run the process again.
00:26:38.000 So then, so now look at Facebook.
00:26:40.000 So it says, Hey, I've got these voodoo dolls.
00:26:41.000 I've got like a hundred million voodoo dolls and they're all, they just joined this anti-vaccine moms group.
00:26:46.000 And then what do they tend to engage with for very long time?
00:26:49.000 If I get them to join these other groups, which of those other groups would show up?
00:26:53.000 I don't know.
00:26:54.000 Chemtrails.
00:26:55.000 Oh, okay.
00:26:56.000 The Pizzagate.
00:26:57.000 Flat Earth?
00:26:57.000 Flat Earth, absolutely.
00:26:59.000 Yep.
00:27:00.000 And YouTube recommended...
00:27:01.000 So I'm interchangeably going from YouTube to Facebook because it's the same dynamic.
00:27:05.000 They're competing for attention.
00:27:06.000 And YouTube recommended Flat Earth conspiracy theories hundreds of millions of times.
00:27:11.000 And so when you're a parent during COVID... And you sit your kids in front of YouTube because you're like, this is the digital pacifier.
00:27:19.000 I've got to let them do their thing.
00:27:20.000 I've got to do work.
00:27:21.000 And then you come back to the dinner table and your kid says, you know, the Holocaust didn't happen and the earth is flat.
00:27:26.000 And people are wondering why.
00:27:28.000 It's because of this.
00:27:29.000 And now, to your point about this sort of moderation thing, we can take the whack-a-mole stick after the public yells, and Renee and I make a bunch of noise or something, in a large community, by the way, of people making noise about this, and they'll say, okay, shoot, you're right.
00:27:43.000 Flat Earth, we've got to deal with that.
00:27:44.000 And so they'll tweak the algorithm.
00:27:45.000 And then people make a bunch of noise about the inspiration videos for anorexia for kids, and they'll deal with that problem.
00:27:52.000 But then they start doing it reactively. But again, if you zoom out, it's just still recommending stuff that's kind of from the crazy town section of YouTube.
00:28:01.000 Is the problem the recommendation?
00:28:03.000 Because I don't mind that people have ridiculous ideas about hollow earth because I think it's humorous.
00:28:10.000 But I'm also a 53-year-old man.
00:28:13.000 Right.
00:28:13.000 I'm not a 12-year-old boy with a limited education that is like, oh my god, the government's lying to us.
00:28:21.000 There's lizard people that live under the earth.
00:28:22.000 Right.
00:28:23.000 But if that's the real argument about these conspiracy theories is that they can influence young people or the easily impressionable or people that maybe don't have a sophisticated sense of vetting out bullshit.
00:28:35.000 Right.
00:28:35.000 Well, and the algorithms aren't making a distinction between who is just laughing at it and who is deeply vulnerable to it.
00:28:42.000 Exactly.
00:28:43.000 And generally, it just says who's vulnerable to it.
00:28:45.000 Because another example, the way I think about this is if you're driving down the highway And, you know, there's Facebook and Google trying to figure out, like, what should I give you based on what tends to keep your attention?
00:28:54.000 If you look at a car crash, and everybody driving on the highway, they look at the car crash.
00:28:58.000 According to Facebook and Google, it's like, the whole world wants car crashes.
00:29:01.000 We just feed them car crashes after car crashes after car crashes.
00:29:04.000 And what the algorithms do, as Guillaume Chaslot in the film says, who's the YouTube whistleblower from the YouTube recommendation system...
00:29:12.000 Is they find the perfect little rabbit hole for you that it knows will keep you there for five hours.
00:29:16.000 And the conspiracy theory, like dark corners of YouTube, were the dark corners that tends to keep people there for five hours.
00:29:23.000 And so you have to realize that we're now something like 10 years in to this vast psychology experiment, where it's been, you know, in every language, in hundreds of countries, right, in hundreds of languages, it's been steering people towards the crazy town.
00:29:36.000 When I say crazy town, I think of, you know, imagine there's a spectrum on YouTube.
00:29:41.000 And there's on one side you have like the calm Walter Cronkite, Carl Sagan, you know, slow, you know, kind of boring, but like educational material or something.
00:29:51.000 And the other side of the spectrum, you have, you know, the craziest stuff you can find.
00:29:56.000 Crazy Town.
00:29:57.000 No matter where you start, you could start in Walter Cronkite or you could start in Crazy Town.
00:30:02.000 But if I'm YouTube and I want you to watch more, am I going to steer you towards the calm stuff or am I going to steer you more towards Crazy Town?
00:30:10.000 Crazy town.
00:30:10.000 Always more towards crazy town.
00:30:11.000 So then you imagine just tilting the floor of humanity just by like three degrees, right?
00:30:16.000 And then you just step back and you let society run its course.
00:30:20.000 As Jaron Lanier says in the film, if you just tilt society by one degree, two degrees, that's the whole world.
00:30:25.000 That's what everyone is thinking and believing.
00:30:29.000 Right.
00:30:35.000 Right.
00:30:38.000 Right.
00:30:45.000 The things we believe about the world.
00:30:47.000 And increasingly, that's based on technology.
00:30:49.000 And we can get into, you know, what's going on in Portland.
00:30:52.000 Well, the only way I know that is I'm looking at my social media feed, and according to that, it looks like the entire city's on fire and it's a war zone.
00:30:58.000 But if you...
00:30:59.000 I called a friend there the other day, and he said, it's a beautiful day.
00:31:03.000 There's actually no violence anywhere near where I am.
00:31:05.000 It's just like these two blocks or something like that.
00:31:07.000 And this is the thing is warping our view of reality.
00:31:10.000 And I think that's what really, for me, The Social Dilemma was really trying to accomplish as a film.
00:31:15.000 And, you know, the director, Jeff Orlowski, was trying to accomplish, is how did this society go crazy everywhere all at once, you know, seemingly?
00:31:24.000 You know, this didn't happen by accident.
00:31:26.000 It happened by design of this business model.
00:31:28.000 When did the business model get implemented?
00:31:31.000 Like, when did they start using these algorithms to recommend things?
00:31:33.000 Because initially, YouTube was just a series of videos, and it didn't have that recommended section.
00:31:39.000 When was that?
00:31:40.000 You know, that's a good question.
00:31:41.000 I mean, I... You know, originally YouTube was just post a video and you can get people to, you know, go to that URL and send it around.
00:31:51.000 They needed to figure out, once the competition for attention got more intense, they needed to figure out, how am I going to keep you there?
00:31:58.000 And so recommending those videos on the right-hand side, I think that was there pretty early, if I remember, actually.
00:32:04.000 Because that was some of the innovation is like keeping people within this YouTube wormhole.
00:32:07.000 And once people were in the YouTube wormhole constantly seeing videos, that was what they could offer the promise to a new video uploader.
00:32:16.000 Hey, if you post it here, you're going to get way more views than if you post it on Vimeo.
00:32:21.000 And that's the thing.
00:32:22.000 If I open up TikTok right now on my phone...
00:32:24.000 Do you have TikTok on your phone?
00:32:26.000 Well, I'm not supposed to, obviously, but more for research purposes.
00:32:30.000 Ah, research.
00:32:31.000 Do you know how to TikTok at all?
00:32:33.000 No.
00:32:33.000 My 12-year-old is obsessed.
00:32:35.000 Oh, really?
00:32:35.000 Oh yeah, she can't even sit around.
00:32:37.000 If she's standing still for five minutes, she just starts like...
00:32:41.000 She starts TikTok-ing.
00:32:44.000 And that's the thing.
00:32:44.000 2012. Oh, so the Mayans were right.
00:32:47.000 Right.
00:32:48.000 2012, the platform announced an update to the discovery system designed to identify the videos people actually want to watch.
00:32:51.000 By prioritizing videos that hold attention throughout, as well as increasing the amount of time a user spends on the platform overall, YouTube could assure advertisers that it was providing a valuable, high-quality experience for people.
00:33:08.000 So, that's the beginning of the end.
00:33:11.000 Yeah, so 2012 on YouTube's timeline, I mean, you know, the Twitter and Facebook world, I think, introduces the retweet and reshare buttons in the 2009 to 2010 kind of time period.
00:33:23.000 So you end up with this world where the things that we're most paying attention to are based on algorithms choosing for us.
00:33:31.000 And so the sort of deeper argument that's in the film that I'm not sure everyone picks up on is...
00:33:39.000 Right.
00:33:57.000 And then everyone votes based on that information.
00:33:59.000 Now you could say, well, hold on.
00:34:01.000 Radio and television were there and were partisan before that.
00:34:04.000 But actually, radio and TV are often getting their news stories from Twitter.
00:34:10.000 And Twitter is recommending things based on these algorithms.
00:34:13.000 So when you control the information that an entire population is getting, you're controlling their choices.
00:34:19.000 I mean, literally in military theory, if I want to screw up your military, I want to control the information that it's getting.
00:34:23.000 I want to confuse the enemy.
00:34:25.000 And that information funnel is the very thing that's been corrupted.
00:34:29.000 And it's like the Flint water supply for our minds.
00:34:31.000 I was talking to a friend yesterday, and she was laughing that there are whole articles written about negative tweets that random people make about a celebrity doing this or that.
00:34:46.000 And she was quoting this article.
00:34:48.000 She's like, look how crazy this is.
00:34:50.000 This is a whole article that's written about someone who decided to say something negative about something some celebrity had done, and then it becomes this huge article, and then the tweets are prominently featured.
00:35:02.000 And then the response to those, I mean, like really arbitrary, like weird.
00:35:07.000 Because it's a values-blind system that just cares about what will get attention.
00:35:10.000 Exactly, and that's what the article was.
00:35:12.000 It was just an attention grab.
00:35:13.000 It's interesting because Prince Harry and Meghan have become very interested in these issues and are actively working on these issues and getting to know them just a little bit.
00:35:22.000 Are they really?
00:35:23.000 Because it affects them personally?
00:35:25.000 Well, it's actually interesting.
00:35:27.000 I mean, I don't want to speak for them, but I think Meghan has been the target of the most vitriolic, hate-oriented stuff on the planet, right?
00:35:33.000 From just the amount of sort of criticism that they get.
00:35:37.000 Really?
00:35:37.000 And scrutiny?
00:35:38.000 Yeah.
00:35:38.000 I mean, newsfeeds filled with hate about just what she looks like, what she says, just constantly.
00:35:44.000 Boy, I'm out of the loop.
00:35:45.000 I've never seen anything.
00:35:46.000 She's pretty.
00:35:47.000 What do they think she looks like?
00:35:48.000 Honestly, I don't follow it myself because I don't fall into these attention traps.
00:35:52.000 I try not to.
00:35:52.000 But people just face the worst vitriol.
00:35:55.000 I mean, this is the thing with teen bullying, right?
00:35:56.000 So...
00:35:57.000 I think they work on these issues because teenagers are now getting a micro version of this thing where each of us are scrutinized.
00:36:03.000 Think about what celebrity status does and how it screws up humans in general.
00:36:10.000 Take an average celebrity.
00:36:11.000 It warps your mind, it warps your psychology, and you get scrutiny.
00:36:15.000 When you suddenly are followed, each person gets thousands of followers; or project forward into the future a few years.
00:36:20.000 Each of us have tens of thousands to hundreds of thousands of people that are following what we say.
00:36:25.000 That's a lot of feedback.
00:36:27.000 And as Jonathan Haidt says in the film, and I know you've had him here, it's made kids much more cautious and less risk-taking and more bullied overall.
00:36:38.000 And there's just huge problems in mental health around this.
00:36:40.000 Yeah, it's really bad for young girls, right?
00:36:43.000 Especially for 10 to 14 years.
00:36:45.000 And I've had quite a few celebrities in here and we've discussed it.
00:36:48.000 I just tell them that you can't read that stuff.
00:36:51.000 Just don't read it.
00:36:52.000 Yeah.
00:36:52.000 Like, there's no good in it.
00:36:54.000 Like, I had a friend, she did a show, she's a comedian, she did a show, and she was talking about this one negative comment that was inaccurate.
00:37:02.000 It said she only did a half an hour and her show sucked.
00:37:05.000 She's like, fuck her and this and that.
00:37:06.000 I go, why are you reading that?
00:37:08.000 She's like, because it's mostly positive.
00:37:09.000 I go, but how come you're not talking about most of it then?
00:37:12.000 We're talking about this one person.
00:37:13.000 This one negative person.
00:37:15.000 We're both laughing about it.
00:37:16.000 She's healthy.
00:37:17.000 She's not completely fucked up by it.
00:37:19.000 But this one person got into her head.
00:37:22.000 I'm like, I'm telling you, the juice is not worth the squeeze.
00:37:26.000 But don't read those things.
00:37:27.000 But this is...
00:37:28.000 Exactly right.
00:37:29.000 And this is based on how our minds work.
00:37:30.000 I mean, our minds literally have something called negativity bias.
00:37:32.000 So if you have 100 comments and 99 are positive and one is negative, where does the average human's mind go?
00:37:39.000 Right.
00:37:40.000 They go to the negative.
00:37:41.000 And it also goes to the negative even when you shut down the screen.
00:37:44.000 Your mind is sitting there looping on that negative comment.
00:37:46.000 And why?
00:37:47.000 Because evolutionarily, it's really important that we look at social approval, negative social approval, because our reputation is at stake in the tribe.
00:38:13.000 Yes.
00:38:15.000 This is the psychological environment that is the default way that kids are growing up now.
00:38:20.000 I actually faced this recently with the film itself because actually the film has gotten just crazy positive acclaim for the most part and there's just a few negative comments and for myself even, right?
00:38:31.000 Here comes a conjunction, but...
00:38:34.000 I was glued to a few negative comments.
00:38:37.000 And then you could click and you would see other people that you know who positively like or respond to those comments.
00:38:43.000 You're like, why did that person say that negative thing?
00:38:45.000 I thought we were friends.
00:38:46.000 That whole kind of psychology.
00:38:47.000 And we're all vulnerable to it.
00:38:49.000 Unless you learn, as you said, to tell your celebrity friends just don't pay attention to it.
00:38:53.000 Even mild stuff I see people fixate on.
00:38:55.000 Even mild disagreement or mild criticism people fixate on.
00:39:01.000 And it's also a problem because you realize that someone's saying this and you're not there and you can't defend yourself.
00:39:07.000 So you have this feeling of helplessness.
00:39:09.000 Like, hey, that's not true.
00:39:11.000 I didn't...
00:39:12.000 And then you don't get it out of your system.
00:39:14.000 You never get to express it.
00:39:16.000 And people can share that false negative stuff.
00:39:19.000 I mean, not all negative stuff is false, but you can assert things and build on the hate fest and start going crazy and saying, this person's a white supremacist or this person's even worse.
00:39:29.000 And that'll spread to thousands and thousands of people, and next thing you know, you check into your feed again at, you know, 8 p.m.
00:39:35.000 that night, and your whole reputation's been destroyed, and you didn't even know what happened to you.
00:39:40.000 And this happened to teenagers, too.
00:39:41.000 I mean, they're anxious.
00:39:42.000 Like, I'll post, you know, a teenager will post a photo.
00:39:44.000 They're high school.
00:39:45.000 They'll make a dumb comment without thinking about it.
00:39:47.000 And then next thing they know, you know, at the end of the day, the parents are all calling because, like, 300 parents saw it and are calling up the parent of that kid.
00:39:55.000 And it's...
00:39:57.000 We talk to teachers a lot in our work at the Center for Humane Technology, and they will say that on Monday morning, this is before COVID, but on Monday morning, they spend the first hour of class having to clear all the drama that happened on social media from the weekend for the kids.
00:40:11.000 Jesus.
00:40:12.000 And again, like this...
00:40:14.000 And these kids are in what age group?
00:40:17.000 This was like 8th, 9th, 10th grade, that kind of thing.
00:40:21.000 And the other problem with these kids is there's not like a long history of people growing up through this kind of influence and successfully navigating it.
00:40:32.000 These are the pioneers.
00:40:34.000 Yeah, and they won't know anything different, which is why we talk about in the film.
00:40:38.000 They're growing up in this environment.
00:40:41.000 And one of the simplest principles of ethics is the ethics of symmetry.
00:40:47.000 Doing unto others as you would do to yourself.
00:40:48.000 And as we say at the end of the film...
00:40:55.000 Right.
00:41:00.000 Right.
00:41:10.000 That's when you know.
00:41:11.000 If you talk to a doctor or a lawyer, a doctor, and you say, you know, would you get this surgery for your own kid?
00:41:16.000 They say, oh no, I would never do that.
00:41:17.000 Like, would you trust that doctor?
00:41:19.000 Right.
00:41:20.000 And it's the same thing for a lawyer.
00:41:21.000 So this is the relationship where we have a relationship of asymmetry and technology is influencing all of us.
00:41:27.000 And we need a system by which, you know, when I was growing up, you know, I grew up on the Macintosh and technology and I was creatively doing programming projects and whatever else.
00:41:27.000 The people who built the technology I was using would have their own kids use the things that I was using because they were creative and they were about tools and empowerment.
00:41:44.000 And that's what's changed.
00:41:45.000 We don't have that anymore because the business model took over.
00:41:48.000 And so instead of having just tools sitting there like hammers waiting to be used to build creative projects or programming to invent things or paintbrushes or whatever, we now have a manipulation-based technology environment where everything you use has this incentive to not only addict you but to have you play the fame lottery,
00:42:04.000 get social feedback, because those are all the things that keep people's attention.
00:42:08.000 Isn't this also a problem with these information technologies being attached to corporations that have this philosophy of unlimited growth?
00:42:16.000 Yes.
00:42:16.000 So no matter how much they make, I applaud Apple because I think they're the only company that takes steps to protect privacy, to block advertisements, to make sure that at least when you Use their Maps application.
00:42:34.000 They're not saving your data and sending it to everybody.
00:42:38.000 And it's one of the reasons why Apple Maps is really not as good as Google Maps.
00:42:43.000 But I use it.
00:42:45.000 And that's one of the reasons why I use it.
00:42:47.000 And when Apple came out recently and they were doing something to block...
00:42:57.000 Your information being sent to other places.
00:43:01.000 And they...
00:43:02.000 I forget, what was the exact thing that...
00:43:05.000 In the new iOS, they released a thing that blocks the tracking identifiers.
00:43:10.000 That's right.
00:43:10.000 And it's not actually out yet.
00:43:11.000 It's going to be out in January or February, I think someone told me.
00:43:14.000 And what that's doing, that's a good example of they're putting a tax on the advertising industry.
00:43:19.000 Because just by saying you can't track people individually, that takes down the value of an advertisement by like 30% or something.
00:43:27.000 Here it is.
00:43:27.000 When I do Safari, I get this whole privacy report thing.
00:43:30.000 It says it's like in the last seven days, it's prevented 125 trackers from profiling me.
00:43:36.000 Yeah, and you can opt out of that if you'd like.
00:43:39.000 If you're like, no, fuck that, track me.
00:43:40.000 Yeah, you can do that if you want to.
00:43:42.000 You can let them send your data.
00:43:43.000 But that seems to me a much more ethical approach, to be able to decide whether or not these companies get your information.
00:43:50.000 I mean, those things are great.
00:43:53.000 The challenge is, imagine you get the privacy equation perfectly right.
00:43:57.000 Look at this.
00:43:57.000 Apple working on its own search engine as Google ties could be cut soon.
00:44:01.000 I started using DuckDuckGo.
00:44:04.000 Yep.
00:44:04.000 For that very reason, just because they don't do anything with it.
00:44:08.000 They give you the information, but they don't take your data and do anything with it.
00:44:14.000 The challenge is, let's say we get all the privacy stuff perfectly, perfectly right, and data production and data controls and all that stuff.
00:44:21.000 In a system that's still based on attention and grabbing attention and harvesting and strip mining our brains...
00:44:29.000 You still get maximum polarization, addiction, mental health problems, isolation, teen depression, suicide, polarization, breakdown of truth, right?
00:44:39.000 So we really focus in our work on those topics because that's the direct influence of the business model on warping society.
00:44:47.000 We need to name this mind warp.
00:44:48.000 We think of it like the climate change of culture.
00:44:50.000 Yeah.
00:45:09.000 Until you have a unified model of how emissions change all of those different phenomena, right?
00:45:14.000 In the social fabric, we have shortening of attention spans.
00:45:17.000 We have more outrage-driven news media.
00:45:20.000 We have more polarization.
00:45:22.000 We have more breakdown of truth.
00:45:24.000 We have more conspiracy-minded thinking.
00:45:26.000 These seem like separate events.
00:45:32.000 I think?
00:45:53.000 I want you watching the TV, the tablet, and the phone at the same time, because now I've tripled the size of the amount of extractable attention that I can get for advertisers.
00:46:02.000 Which means that by fracking for attention and splitting you into more junk attention that's thinner...
00:46:09.000 We can sell that as if it's real.
00:46:10.000 It's like the financial crisis where you're selling thinner and thinner financial assets as if it's real, but it's really just a junk asset.
00:46:17.000 And that's kind of where we are now where it's sort of the junk attention economy because we can shorten attention spans and we're debasing the substrate.
00:46:26.000 That makes up our society.
00:46:27.000 Because everything in a democracy depends on individual sensemaking and meaningful choice, meaningful free will, meaningful independent views.
00:46:34.000 But if that's all basically sold to the highest bidder, that debases the soil from which independent views grow, because all of us are jacked into this sort of matrix of social media manipulation, and that's ruining and degrading our democracy.
00:46:47.000 And that's really...
00:46:48.000 There's many other things that are ruining and degrading our democracy, but that's this sort of invisible force that's upstream that affects every other thing downstream.
00:46:55.000 Because if we can't agree on what's true, for example, you can't solve any problem.
00:46:59.000 I think that's what you talked about in your 10-minute thing on The Social Dilemma that I saw on YouTube.
00:47:04.000 Yeah.
00:47:06.000 Your organization highlights all these issues in an amazing way, and it's very important.
00:47:12.000 But do you have any solutions...
00:47:17.000 It's hard, right?
00:47:18.000 So I just want to say that this is as complex a problem as climate change in the sense that you need to change the business model.
00:47:27.000 I think of it like we're on the fossil fuel economy and we have to switch to some kind of beyond that thing, right?
00:47:32.000 Because so long as the business models of these companies depend on extracting attention, can you expect them to do something different?
00:47:41.000 You can't, but how could you?
00:47:43.000 I mean, there's so much money involved, and now they've accumulated so much wealth that they have an amazing amount of influence.
00:47:52.000 And the asymmetric influence can buy lobbyists, can influence Congress, and prevent things from happening.
00:47:58.000 So this is why it's kind of the last moment.
00:48:00.000 That's right.
00:48:01.000 But, you know, I think we're seeing signs of real change.
00:48:03.000 We have the antitrust case that was just filed against Google in Congress.
00:48:07.000 We're seeing more hearings.
00:48:09.000 What was the basis of that case?
00:48:11.000 You know, to be honest, I was actually in the middle of the Social Dilemma launch when I think that happened, and my home burned down in the recent fires in Santa Rosa, so I actually missed that happening.
00:48:21.000 It's hard to hear that.
00:48:22.000 Yeah, sorry.
00:48:22.000 That was a big thing to drop.
00:48:23.000 But yeah, no, it's awful.
00:48:25.000 There's so much that's been happening in the last six weeks.
00:48:27.000 I was evacuated three times where I lived in California.
00:48:30.000 Oh, really?
00:48:30.000 Yeah, so real close to our house.
00:48:33.000 Justice Department sues monopolist
00:48:35.000 Google for violating antitrust laws.
00:48:38.000 Department files complaint against Google to restore competition in search and search advertising markets.
00:48:43.000 Okay, so it's all about search.
00:48:45.000 Right, this was a case that's about Google using its dominant position to privilege its own search engine in its own products and beyond, which is similar to sort of Microsoft bundling in the Internet Explorer browser.
00:48:59.000 But, you know, this is all...
00:49:01.000 Good progress, but really it misses the kind of fundamental harm of like, these things are warping our society.
00:49:07.000 They're warping how our minds are working.
00:49:08.000 And there's no, you know, congressional action against that, because it's a really hard problem to solve.
00:49:13.000 I think the reason the film is so important to me is that if I look at the growth rate of how fast Facebook has been recommending people into conspiracy groups and kind of polarizing us into separate echo chambers, which we should really break down as well, I think, so people see exactly the mechanics of how that happens.
00:49:31.000 But if you look at the growth rate of all those harms, compared to, you know, how fast has Congress passed anything to deal with it, like basically not at all.
00:49:40.000 They seem a little bit unsophisticated in that regard.
00:49:43.000 It might be an understatement.
00:50:04.000 And Rafi Martina, his staffer, is an amazing human being.
00:50:07.000 He works very hard on these issues.
00:50:08.000 So there are some good folks.
00:50:10.000 But when you look at the broad, like the hearing yesterday, it's mostly grandstanding to politicize the issue, right?
00:50:17.000 Because you turn it into, on the right, hey, you're censoring conservatives.
00:50:21.000 And on the left, it's, hey, you're not taking down enough misinformation and dealing with the hate speech and all these kinds of things.
00:50:27.000 And they're not actually dealing with, how would we solve this problem?
00:50:30.000 They're just trying to make a political point to win over their base.
00:50:33.000 Now, Facebook recently banned the QAnon pages, which I thought was kind of fascinating, because I'm like, well, this is a weird sort of slippery slope, isn't it?
00:50:43.000 Like, if you decide that you...
00:50:45.000 I mean, it almost seemed to me like, well, we'll throw them a bone.
00:50:47.000 We'll get rid of QAnon, because it's so preposterous.
00:50:51.000 Let's just get rid of that.
00:50:52.000 But what else?
00:50:54.000 Like, if you keep going down that rabbit hole, where do you draw the line?
00:50:58.000 Like, where...
00:50:59.000 Are you allowed to have JFK conspiracy theories?
00:51:03.000 Are you allowed to have flat earth?
00:51:05.000 Are you allowed?
00:51:05.000 I mean, I guess flat earth is not dangerous.
00:51:08.000 Is that where they make the distinction?
00:51:10.000 So I think their policy is evolving in the direction of when things are causing offline harm, when online content is known to precede offline harm, that's when the platform, that's the standard by which platforms are acting.
00:51:23.000 What offline harm has been caused by the QAnon stuff, do you know?
00:51:27.000 There's several incidents.
00:51:28.000 We interviewed a guy on our podcast about it.
00:51:31.000 There's some armed at gunpoint type thing.
00:51:33.000 I can't remember.
00:51:36.000 And there's things that are priming people to be violent, you know.
00:51:42.000 I just want to say these are really tricky topics, right?
00:51:44.000 I think what I want to make sure we get to, though, is that there are many people manipulating the groupthink that can happen in these echo chambers.
00:51:51.000 Because once you're in one of these things, like I studied cults earlier in my career.
00:51:56.000 And the power of cults is like they're a vertically integrated persuasion stack because they control your social relationships.
00:52:01.000 They control who you're hearing from and who you're not hearing from.
00:52:04.000 They give you meaning, purpose, and belonging.
00:52:06.000 They have a custom language.
00:52:10.000 They have an internal way of referring to things.
00:52:12.000 And social media allows you to create this sort of decentralized cult factory where it's easier to grab people into an echo chamber where they only hear from other people's views.
00:52:23.000 And Facebook, I think even just recently, I think?
00:52:43.000 When does it cross a line?
00:52:44.000 I don't know.
00:52:45.000 I mean, the policy teams that work on this are coming up with their own standards, so I'm not familiar with it.
00:52:51.000 If you think about how hard it is to come up with a law at the federal level that all states will agree to, then you imagine Facebook trying to come up with a policy that will be universal to all the countries that are running Facebook, right?
00:53:05.000 Well, then you imagine how you take a company that never thought they were going to be in the position to do that.
00:53:10.000 Correct.
00:53:10.000 And then within a decade, they become the most prominent source of news and information on the planet Earth.
00:53:15.000 Correct.
00:53:16.000 And now they have to regulate it.
00:53:17.000 And, you know, I actually believe Zuckerberg when he says, I don't want to make these decisions.
00:53:22.000 I shouldn't be in this role where my beliefs decide the whole world's views.
00:53:26.000 Right.
00:53:27.000 He genuinely believes that.
00:53:28.000 Yeah.
00:53:29.000 And to be sure to all that.
00:53:30.000 But the problem is he created a situation where he is now in that position.
00:53:34.000 I mean, he got there very quickly.
00:53:35.000 And they did it aggressively when they went into countries like Myanmar, Ethiopia, all throughout the African continent, where they gave...
00:53:41.000 Do you know about Free Basics?
00:53:43.000 No.
00:53:43.000 So this is the program that I think has gotten something like 700 million accounts onto Facebook, where they do a deal with like a telecommunications provider, like their version of AT&T in Myanmar or something.
00:53:55.000 So when you get your smartphone, it comes...
00:53:57.000 Facebook's built in.
00:53:58.000 Facebook's built in.
00:53:58.000 Yes, I do know about that.
00:53:59.000 And there's an asymmetry of access where it's free to access Facebook, but it costs money to do the other things for the data plan.
00:54:07.000 So you get a free Facebook account.
00:54:09.000 Facebook is the internet, basically, because it's the free thing you can do on your phone.
00:54:14.000 And then we know that there's fake information that's being spread there.
00:54:18.000 So the data charges don't apply to Facebook use?
00:54:20.000 Yeah, I think the cost...
00:54:21.000 You know how we pay for data here?
00:54:23.000 I think you don't pay for Facebook, but you do pay for all the other things, which creates an asymmetry where, of course, you're going to use Facebook for most things.
00:54:31.000 Right.
00:54:31.000 So you have Facebook Messenger, video calls, WhatsApp.
00:54:36.000 I don't know exactly what video...
00:54:37.000 Facebook has video calls as well, right?
00:54:40.000 In general, they do.
00:54:41.000 I just don't know how that works in the developing world.
00:54:43.000 But there's a joke within Facebook.
00:54:45.000 I mean, this has caused genocides, right?
00:54:46.000 So in Myanmar, which is in the film, the Rohingya Muslim minority group, many Rohingya were persecuted and murdered because of fake information spread by the government on Facebook using their asymmetric knowledge with fake accounts.
00:55:01.000 I mean, even just a couple weeks ago, Facebook took down a network of, I think, several hundred thousand fake accounts in Myanmar.
00:55:08.000 And they didn't even have at the time more than something like four or five people in their extended Facebook network who even spoke the language of that country.
00:55:16.000 So when you realize that this is like the, I think of like the Iraq War Colin Powell Pottery Barn Rule, where like, you know, if you go in and you break it, then you are responsible for fixing it.
00:55:27.000 This is Facebook actively doing deals to go into Ethiopia, to go into Myanmar, to go into the Philippines or whatever, and providing these solutions.
00:55:36.000 And then it breaks the society, and they're now in a position where they have to fix it.
00:55:41.000 There's actually a joke within Facebook that if you want to know which countries will be quote-unquote at risk in two years from now, look at which ones have Facebook free basics.
00:55:51.000 Jesus!
00:55:53.000 It's terrifying that they do that, and they don't have very many people that even speak the language.
00:55:57.000 So there's no way they're going to be able to filter it.
00:55:59.000 That's right.
00:56:00.000 And so now, if you take it back, I know we were talking outside about the congressional hearing and Jack Dorsey and the questions from the senator about, are you taking down the content from the Ayatollahs or from the Chinese Xinjiang province about the Uyghurs, you know, when there's sort of speech that leads to offline violence in these other countries?
00:56:18.000 The issue is that these platforms are managing the information commons for countries they don't even speak the language of.
00:56:25.000 And if you think the conspiracy theory sort of dark corners, crazy town of the English internet are bad, and we've already taken out like hundreds of whack-a-mole sticks and they've hired hundreds of policy people and hundreds of engineers to deal with that problem.
00:56:39.000 Yeah, I think.
00:56:56.000 They don't have a voice on the platform.
00:56:57.000 This is really important that the people in Myanmar who got persecuted and murdered didn't have to be on Facebook for the fake information spread about them to impact them, for people to go after them, right?
00:57:12.000 So this is the whole, I can assert something about this minority group.
00:57:16.000 That minority group isn't on Facebook.
00:57:18.000 But if it manipulates the dominant culture to go, we have to go kill them...
00:57:22.000 Then they can go do it.
00:57:23.000 And the same thing has happened in India, where there's videos uploaded about, hey, those Muslims, I think they're called flesh killings, where they'll say that these Muslims killed this cow, and in Hinduism, the cows are sacred.
00:57:39.000 Did I get that right?
00:57:42.000 I believe you did.
00:57:44.000 Yeah.
00:57:46.000 They will post those.
00:57:48.000 They'll go viral on WhatsApp and say, we have to go lynch those Muslims because they killed the sacred cows.
00:57:54.000 And they went from something like five of those happening per year to now hundreds of those happening per year because of fake news being spread, again, on Facebook about them, on WhatsApp about them.
00:58:04.000 And again, they don't have to be on the platform for this to happen to them, right?
00:58:08.000 So this is critical.
00:58:12.000 Let's imagine all of your listeners.
00:58:14.000 I don't even know how many you have, like tens of millions, right?
00:58:16.000 And we all listen to this conversation.
00:58:18.000 We say, we don't want to even use Facebook and Twitter or YouTube.
00:58:21.000 We all still, if you live in the US, still live in a country that everyone else will vote based on everything that they're seeing on these platforms.
00:58:28.000 If you zoom out to the global context, all of us, we don't use Facebook in Brazil, but Brazil's last election was heavily skewed by Facebook and WhatsApp, where something like 87% of people saw at least one of the major fake news stories about Bolsonaro,
00:58:44.000 and he got elected, and you have people in Brazil chanting, Facebook, Facebook, when he wins...
00:58:50.000 He wins and then he sets a new policy to wipe out the Amazon.
00:58:54.000 All of us don't have to be on Facebook to be affected by a leader that wipes out the Amazon and accelerates climate change timelines because of those interconnected effects.
00:59:03.000 So, you know, we at the Center for Humane Technology are looking at this from a global perspective where it's not just the US election.
00:59:09.000 Facebook manages something like 80 elections per year.
00:59:12.000 And if you think that they're doing all the monitoring that they are for, you know, English-speaking, American election, most privileged society, now look at the hundreds of other countries that they're operating in.
00:59:21.000 Do you think that they're devoting the same resources to the other countries?
00:59:26.000 This is so crazy.
00:59:28.000 It's like, is that you, Jamie?
00:59:30.000 What's that weird noise?
00:59:33.000 You hear like a squeaky?
00:59:34.000 I heard it too.
00:59:34.000 Yeah.
00:59:35.000 Maybe it's me.
00:59:36.000 I don't think it is.
00:59:37.000 Just my feedback.
00:59:39.000 There it is.
00:59:40.000 It might be me.
00:59:41.000 Is it?
00:59:41.000 Breathing.
00:59:41.000 I don't know.
00:59:42.000 You have asthma?
00:59:43.000 I think I had an allergy coming out.
00:59:45.000 Oh.
00:59:46.000 I was like that.
00:59:47.000 Making some noises.
00:59:48.000 What's terrifying is that we're talking about from 2012 to 2020 YouTube implementing this program and then what is even the birth of Facebook?
00:59:59.000 What is that like 2002 or 3?
01:00:01.000 2004. 2004. This is such a short timeline and having these massive worldwide implications from the use of these things.
01:00:11.000 When you look at the future, do you look at this like a runaway train that's headed towards a cliff?
01:00:17.000 Yeah, I mean, I think right now, this thing is a Frankenstein that it's not like even if Facebook is aware of all these problems, they don't have the staff unless they hired like hundreds of, you know, hundreds of thousands of people, definitely, minimum, to try to address all these problems.
01:00:33.000 But the paradox we're in...
01:00:35.000 Is that the very premise of these services is to rely on automation.
01:00:40.000 It used to be we had editors and journalists, or at least editors or people who edited even what went on television, saying, what is credible?
01:00:49.000 What is true?
01:00:50.000 You sat here with Alex Jones even yesterday, and you're trying to check him on everything he's saying.
01:00:54.000 You're researching and trying to look that stuff up.
01:00:56.000 You're trying to be doing some more responsible communication.
01:01:00.000 The premise of these systems is that you don't do that.
01:01:04.000 The reason venture capitalists find social media so profitable and such a good investment is because we generate the content for free.
01:01:13.000 We are the useful idiots, right?
01:01:15.000 Instead of paying a journalist $70,000 a year to write something credible, we can each be convinced to share our political views and we'll do it knowingly for free.
01:01:23.000 Actually, we don't really know that we're the useful idiots.
01:01:25.000 That's kind of the point.
01:01:26.000 And then instead of paying an editor $100,000 a year to figure out which of those things is true that we want to promote and give exponential reach to, you have an algorithm that says, hey, what do people click on the most?
01:01:37.000 What do people like the most?
01:01:39.000 And then you realize the quality of the signals that are going into the information environment that we're all sharing is a totally different process.
01:01:47.000 We went from a high-quality, gated process that cost a lot of money...
01:01:51.000 To this really crappy process that costs no money, which makes the company so profitable.
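To make the ranking shift Tristan describes concrete, here is a minimal, hypothetical sketch in Python: one function orders posts the way an engagement-driven feed would (by predicted clicks and likes), the other the way an editorial process would (by a human credibility judgment). The field names, weights, and example posts are illustrative assumptions, not anything taken from a real platform's systems.

```python
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    predicted_clicks: float    # model's guess at click-through, 0..1 (illustrative)
    predicted_likes: float     # model's guess at like rate, 0..1 (illustrative)
    editor_credibility: float  # 0..1 score a human editor might assign (illustrative)

def engagement_rank(posts):
    """Order posts purely by how much attention they are predicted to capture."""
    return sorted(posts, key=lambda p: p.predicted_clicks + p.predicted_likes, reverse=True)

def editorial_rank(posts):
    """Order posts by a human judgment of credibility instead."""
    return sorted(posts, key=lambda p: p.editor_credibility, reverse=True)

if __name__ == "__main__":
    feed = [
        Post("Outrage-bait conspiracy claim", 0.9, 0.8, 0.1),
        Post("Careful, sourced explainer", 0.2, 0.3, 0.9),
    ]
    # The same two posts come out in opposite order under the two objectives.
    print([p.text for p in engagement_rank(feed)])
    print([p.text for p in editorial_rank(feed)])
```

The point of the toy example is only that the objective function, not the content itself, decides what gets exponential reach.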
01:01:57.000 And then we fight back for territory, for values, when we raise our hands and say, hey, there's a thinspiration video problem for teenagers and anorexia.
01:02:06.000 Hey, there's a mass conspiracy sort of echo chamber problem over here.
01:02:10.000 Hey, there's, you know, flat earth sort of issues.
01:02:13.000 And again, these get into tricky topics because we want to, you know, I know we both believe in free speech and we have this feeling that the solution to bad speech is better, you know, more speech that counters the things that are said.
01:02:26.000 But in a finite attention economy, We don't have the capacity for everyone who gets bad speech to just have a counter response.
01:02:35.000 In fact, what happens right now is that that bad speech rabbit holes into, I don't want to call it worse and worse speech, but more extreme versions of that view that confirms it.
01:02:43.000 Because once Facebook knows that that flat earth rabbit hole is good for you at getting your attention back, it wants to give you just more and more of that.
01:02:50.000 It doesn't want to say here's 20 people who disagree with that thing.
01:02:53.000 Right.
01:02:53.000 Right?
01:02:54.000 So I think if you were to imagine a different system, we would ask, who are the thinkers that are most open-minded and synthesis-oriented, where they can actually steelman the other side?
01:03:04.000 Actually, they can do, you know, for this speech, here is the opposite counterargument.
01:03:08.000 They can show that they understand that.
01:03:10.000 And imagine those people get lifted up.
01:03:12.000 But notice that none of those people that you and I know, I mean, we're both friends with Eric Weinstein, and, you know, I think he's one of these guys who's really good at sort of offering the steel manning, here's the other side of this, here's the other side of that.
01:03:23.000 But the people who generally do that aren't the ones who get the tens of millions of followers on these services.
01:03:29.000 It's the black and white, extreme, outrage-oriented thinkers and speakers that get rewarded in this attention economy.
01:03:35.000 And so if you look at how, if I zoom way out and say, how is the entire system behaving?
01:03:39.000 Just like if I zoom out and say, you know, the climate system, like, how is the entire overall system behaving?
01:03:45.000 It's not producing the kind of information environment on which democracy can survive.
01:03:51.000 Jesus.
01:03:53.000 The thing that troubles me the most is that I clearly see your thinking and I agree with you.
01:03:57.000 I don't see any holes in what you're saying.
01:03:59.000 I don't know how this plays out, but it doesn't look good.
01:04:02.000 And I don't see a solution.
01:04:05.000 It's like...
01:04:07.000 If there are a thousand bison running full steam towards a cliff and they don't realize the cliff is there, I don't see how you pull them back.
01:04:16.000 So I think of it like we're trapped in a body that's eating itself.
01:04:21.000 So it's kind of a cannibalism economy because our economic growth right now with these tech companies is based on eating our own organs.
01:04:28.000 So we're eating our own mental health organs.
01:04:30.000 We're eating the health of our children.
01:04:31.000 Sorry for being so gnarly about it, but it's a cannibalistic system.
01:04:36.000 In a system that's hurting itself or eating itself or punching itself, if one of the neurons wakes up in the body, it's not enough to change that.
01:04:44.000 It's going to keep punching itself.
01:04:44.000 But if enough of the neurons wake up and say, this is stupid, why would we build our system this way?
01:04:49.000 And the reason I'm so excited about the film is that if you have 40 to 50 million people who now recognize that we're living in this sort of cannibalist system in which...
01:05:11.000 Yeah.
01:05:21.000 And we have to all recognize that we're now 10 years into this hypnosis experiment of warping of the mind.
01:05:26.000 And, like, you know, I'm friends with some hypnotists, and it's like, how do we snap our fingers and get people to see that there's an inflated level of polarization and hatred right now? Especially going into this election, I think we all need to be much more cautious about what's running in our brains right now.
01:05:41.000 Yeah, I don't think most people are generally aware of what's causing this polarization.
01:05:45.000 I think they think it's the climate of society because of the president and because of Black Lives Matter and the George Floyd protests and all this jazz.
01:05:55.000 But I don't think they understand that that's exacerbated in a fantastic way by social media and the last 10 years of our addictions to social media and these echo chambers that we all exist in.
01:06:09.000 Yeah, so I want to make sure that we're both clear, and I know you agree with this, that these things were already in society to some degree, right?
01:06:19.000 So we want to make sure we're not saying social media is blamed for all of it.
01:06:22.000 Absolutely not.
01:06:22.000 No, no.
01:06:23.000 It's gasoline.
01:06:25.000 It's gasoline, right.
01:06:25.000 Exactly.
01:06:26.000 It's lighter fluid for sparks of polarization.
01:06:28.000 It's lighter fluid for sparks of, you know...
01:06:32.000 Which is ironically what everybody...
01:06:34.000 It was the opposite of what everybody hoped the internet was going to be.
01:06:38.000 Right.
01:06:38.000 Everybody hoped the internet was going to be this bottomless resource of information where everyone was going to be educated in a way they had never experienced before in the history of the human race, where you'd have access to all the answers to all your questions.
01:06:51.000 You know, Eric Weinstein describes it as the library of Alexandria in your pocket.
01:06:55.000 Yeah.
01:06:56.000 But no.
01:06:56.000 Yeah.
01:06:56.000 Well, and I want to be clear so that I'm not against technology or giving people access.
01:07:00.000 In fact, I think a world where everyone had a smartphone and a Google search box and Wikipedia and a search-oriented YouTube so you can look up health issues and how to do-it-yourself fix anything would be awesome.
01:07:11.000 That would be great.
01:07:12.000 I would love that.
01:07:13.000 I just want to be really clear because this is not an anti-technology conversation.
01:07:16.000 It's about, again, this business model that depends on recommending stuff to people, which, just to be clear on the polarization front, it...
01:07:24.000 Social media is more profitable when it gives you your own Truman Show that affirms your view of reality every time you flick your finger.
01:07:32.000 That's going to be more profitable than, every time you flick your finger,
01:07:34.000 actually showing you a more complex, nuanced picture that disagrees with that.
01:07:38.000 Here's a different way to see it.
01:07:39.000 That won't be nearly as successful.
01:07:41.000 And the best way for people to test this, we actually recommend, even after seeing the film to do this, is open up Facebook on two phones.
01:07:49.000 Especially like, you know, two partners or people who have the same friends.
01:07:52.000 So you have the same friends on Facebook.
01:07:54.000 You would think if you scroll your feeds, you'd see the same thing.
01:07:57.000 You have the same people you're following.
01:07:59.000 So why wouldn't you see the same thing?
01:08:00.000 But if you swap phones and you actually scroll through their feed for 10 minutes, and you scroll through mine for 10 minutes, you'll find that you'll see completely different information.
01:08:10.000 And you'll also notice that it won't feel very compelling.
01:08:13.000 Like if you asked yourself—my friend Emily just did this with her husband after seeing the film.
01:08:17.000 And she literally has the same friends as her husband.
01:08:20.000 And she scrolled through the feed and she's like, this isn't interesting.
01:08:22.000 I wouldn't come back to this.
01:08:25.000 Right?
01:08:25.000 And so we have to, again, realize how subtle this has been.
01:08:30.000 I wonder what would happen if I scrolled through my feed, because I literally don't use Facebook.
01:08:35.000 What do you use?
01:08:35.000 I don't use it at all.
01:08:36.000 I only use Instagram.
01:08:37.000 Use Instagram.
01:08:38.000 I stopped using Twitter because it's like a bunch of mental patients throwing shit at each other, and I very rarely use it, I should say.
01:08:46.000 Occasionally, I'll check some things to see what the climate is, cultural climate, but...
01:08:53.000 I use Instagram and Facebook.
01:08:55.000 I used to use Instagram to post to Facebook, but I kind of stopped even doing that.
01:09:01.000 It just seems gross.
01:09:03.000 It's these people in these verbose arguments about politics and the economy and world events.
01:09:11.000 We have to ask ourselves, Is that medium constructive to solving these problems?
01:09:19.000 No.
01:09:19.000 Just not at all.
01:09:20.000 And it's an attention casino, right?
01:09:21.000 The house always wins.
01:09:23.000 You might see Eric Weinstein in a thread battling it out or sort of duking it out with someone and maybe even reaching some convergence on something, but it just whizzes by your feet and then it's gone.
01:09:33.000 And all the effort that we're putting in to make these systems work, but then it's just all gone.
01:09:38.000 What do you do?
01:09:39.000 I mean, I try to very minimally use social media overall.
01:09:45.000 Luckily, the work is so busy that that's easier.
01:09:48.000 I want to say first that, you know, on the addiction fronts of these things, I, you know, myself am very sensitive and, you know, easily addicted by these things myself.
01:09:58.000 And that's why I think I notice.
01:10:00.000 You were saying in a social dilemma, it's email for you, huh?
01:10:03.000 Yeah, you know, for me, if I refresh my email and pull to refresh like a slot machine, sometimes I'll get invited to meet the president of such and such to advise on regulation, and sometimes I get a stupid newsletter from a politician I don't care about or something, right?
01:10:17.000 So email is very addictive.
01:10:20.000 It's funny, I talked to Daniel Kahneman, who wrote the—he's like the founder of behavioral economics.
01:10:24.000 He wrote the book Thinking Fast and Slow, if you know that one.
01:10:27.000 And he said as well that email was the most addictive for him.
01:10:31.000 And he, you know, the one thing you'll find is that the people who know most about these sort of persuasive manipulative tricks, they'll say we're not immune to them just because we know about them.
01:10:39.000 Dan Ariely, who's another famous persuasion behavioral economics guy, talks about flattery and how flattery still feels good even if I tell you I don't mean it.
01:10:48.000 Like, I love that sweatshirt.
01:10:49.000 That's an awesome sweatshirt.
01:10:50.000 Where'd you get it?
01:10:52.000 You're just going to bullshit me.
01:10:53.000 But that's the...
01:10:55.000 It feels good to get flattery, even if you know that it's not real.
01:10:59.000 Right.
01:11:00.000 And the point being that, like, again, we have so much evolutionary wiring to care about what other people think of us, that just because you know that they're manipulating you in the likes or whatever, it still feels good to get those hundred extra likes on that thing that you posted.
01:11:13.000 Yeah.
01:11:13.000 When do the likes come about?
01:11:16.000 Um, well, let's see.
01:11:19.000 Well, actually, you know, in the film, you know, Justin Rosenstein, who's the inventor of the like button, talks about I think the first version was something called Beacon and it arrived in 2006, I think.
01:11:29.000 But then the simple like one click like button was like a little bit later, like 2008-2009.
01:11:34.000 Are you worried that it's going to be more and more invasive?
01:11:37.000 I mean, you think about the problems we're dealing with now with Facebook and Twitter and Instagram, all these within the last decade or so.
01:11:44.000 What do we have to look forward to?
01:11:47.000 I mean, is there something on the horizon that's going to be even more invasive?
01:11:50.000 Well, we have to change this system because, as you said, technology is only going to get more immersed into our lives and infused into our lives, not less.
01:12:00.000 Is technology going to get more persuasive or less persuasive?
01:12:02.000 More, for sure.
01:12:04.000 Is AI going to get better at predicting our next move or less good at predicting our next move?
01:12:08.000 It's almost like we have to eliminate that.
01:12:12.000 I mean, it would be really hard to tell them you can't use algorithms anymore that depend on people's attention spans.
01:12:19.000 Right.
01:12:19.000 It would be really hard, but it seems like the only way for the internet to be pure.
01:12:24.000 Correct.
01:12:24.000 I think of this like the environmental movement.
01:12:26.000 I mean, some people have compared the film The Social Dilemma to Rachel Carson's Silent Spring.
01:12:32.000 Right, where that was the birth, that was the book that birthed the environmental movement.
01:12:36.000 And that was in a Republican administration, the Nixon administration, we actually passed, we created the EPA, the Environmental Protection Agency.
01:12:42.000 We went from a world where we said the environment is something we don't pay attention to, to we passed a bunch, I forgot the laws we passed between 1963 and 1972. Over a decade, we started caring about the environment.
01:12:53.000 We created things that protected the national parks.
01:12:55.000 And I think that's kind of what's going on here, that, you know, imagine, for example, it is illegal to show advertising on youth-oriented social media apps between 12am and 6am, because you're basically monetizing loneliness and lack of sleep.
01:13:12.000 Right?
01:13:13.000 Like, imagine that you cannot advertise during those hours, because we say that like a national park, our children's attention between...
01:13:19.000 This is a very minimal example, by the way.
01:13:21.000 This would be like, you know, taking the most obvious piece of low-hanging fruit and land and say, let's quarantine this off and say, this is sacred.
01:13:28.000 But isn't the problem, like, the Environmental Protection Agency, it resonates with most people.
01:13:34.000 The idea, oh, let's protect the world for our children.
01:13:36.000 Right.
01:13:37.000 There's not a lot of people profiting off of polluting the rivers.
01:13:40.000 Right.
01:13:41.000 But when you look...
01:13:42.000 Well, there's, I mean, overhunting, you know, certain lands or overfishing certain fisheries and collapsing them.
01:13:47.000 I mean, there are, if you have big enough corporations that are based on an infinite growth profit model, you know, operating with less and less...
01:13:57.000 For sure.
01:14:08.000 And also, this is not something that really resonates in a very clear, like, one plus one equals two way.
01:14:15.000 Like, an environmental protection agency, it makes sense.
01:14:20.000 Like, if you ask people, should you be able to throw garbage into the ocean?
01:14:24.000 Everyone's going to say, no, that's a terrible idea.
01:14:26.000 Right.
01:14:26.000 Should you be able to make an algorithm that shows people what they're interested in on YouTube?
01:14:34.000 Like, yeah, what's wrong with that?
01:14:35.000 Well, it's more like sugar, right?
01:14:36.000 Because sugar is always going to taste way better than something else, because our evolutionary heritage says, like, that's rare, and so we should pay more attention to it.
01:14:44.000 This is like sugar for the fame lottery, for attention, for social approval.
01:14:48.000 And so it's always going to feel good, and we need to have consciousness about it.
01:14:51.000 And we haven't banned sugar, but we have created a new conversation about what healthy...
01:15:02.000 I think that's true.
01:15:20.000 You might want to look that up, but...
01:15:22.000 So I think we could have something like that here where we have to...
01:15:26.000 I think of it this way, if you want to even get kind of weirdly, I don't know, spiritual or something about it, which is we are the only species that could even know that we're doing this to ourselves.
01:15:37.000 Right.
01:15:38.000 Like, we're the only species with the capacity for self-awareness to know that we have actually, like, roped ourselves into this matrix of, like, literally the matrix, of sort of undermining our own psychological weaknesses.
01:15:52.000 A lion that somehow manipulated its environment so that there's gazelles everywhere and is overeating on gazelles doesn't have the self-awareness to know, wait a second, if we keep doing this, this is going to cause all these other problems.
01:16:03.000 It can't do that because its brain doesn't have that capacity.
01:16:06.000 Our brain...
01:16:08.000 We do have the capacity for self-awareness.
01:16:10.000 We can name negativity bias, which is that if I have 100 comments and 99 are positive, my brain goes to the negative.
01:16:16.000 We can name that, and once we're aware of it, we get some agency back.
01:16:19.000 We can name that we have a draw towards social approval.
01:16:22.000 So when I see I've been tagged in a photo, I know that they're just manipulating my social approval.
01:16:26.000 We can name social reciprocity, which is when I get all those text messages and I feel, oh, I have to get back to all these people.
01:16:32.000 Well, that's just an inbuilt bias that we have to get back reciprocity.
01:16:36.000 We have to get back to people who give stuff to us.
01:16:39.000 The more we name our own biases, like confirmation bias, we can name that my brain is more likely to feel good getting information that I already agree with than information that disagrees with me.
01:16:52.000 I can get more agency back.
01:16:54.000 And we're the only species that we know of that has the capacity to realize that we're in a self-terminating sort of system, and we have to change that by understanding our own weaknesses and that we've created the system that is undermining ourselves.
01:17:06.000 And I think the film is doing that for a lot of people.
01:17:10.000 It certainly is, but I think it needs more.
01:17:13.000 It's like inspiration.
01:17:14.000 It needs a refresher on a regular basis.
01:17:17.000 Do you feel this massive obligation to be that guy that is out there sort of as the Paul Revere of the technology influence invasion?
01:17:30.000 I just see these problems and I want them to go away.
01:17:33.000 I didn't desire and wake up to run a social movement, but honestly, right now, that's what we're trying to do with the Center for Humane Technology.
01:17:42.000 We realized that before the success of the film, we were actually more focused on working with technologists inside the industry.
01:17:50.000 I come from Silicon Valley.
01:17:51.000 Many of my friends are executives at the companies and we have these inside relationships.
01:17:54.000 We focused at that level.
01:17:56.000 We also worked with policymakers, and we were trying to speak to policymakers.
01:18:00.000 We weren't trying to mobilize the whole world against this problem.
01:18:05.000 But with the film, suddenly we as an organization have had to do that.
01:18:08.000 And frankly, I wish we had—I'm speaking really honestly—I really wish we'd had those funnels so that people who saw the film could have landed into a carefully designed funnel where we actually started mobilizing people to deal with this issue.
01:18:20.000 Because there are ways we can do it.
01:18:21.000 We can pass certain laws.
01:18:22.000 We have to have a new cultural sort of set of norms about how do we want to show up and use this system.
01:18:28.000 You know, families and schools can have whole new protocols of how do we want to do group migrations?
01:18:32.000 Because one of the problems is that if a teenager says by themselves, whoa, I saw the film, I'm going to delete my Instagram account by myself or TikTok account by myself, that's not enough because all their friends are still using Instagram and TikTok and they're still going to talk about who's dating who or gossip about this or homework or whatever on those services.
01:18:52.000 And so the services, Instagram and TikTok, prey on social exclusion, that you will feel excluded if you don't participate.
01:18:59.000 And the way to solve that is to get whole schools or families together, like different parent groups or whatever together, and do a group migration from Instagram to Signal or iMessage or some kind of group thread that way.
01:19:12.000 Because notice that when you, as you said, Apple's a pretty good actor in this space.
01:19:16.000 If I make a FaceTime call to you, FaceTime isn't trying to monetize my attention.
01:19:22.000 It's just sitting there being like, yeah, how can I help you have a good, as close to face-to-face conversation as possible?
01:19:28.000 Jamie pulled up an article earlier that was saying that Apple was creating its own search engine.
01:19:34.000 I hope that is the case, and I hope that if it is the case, they apply the same sort of ethics that they have.
01:19:40.000 Towards sharing your information that they do with other things to their search engine.
01:19:45.000 But I wonder if there would be some sort of value in them creating a social media platform that doesn't rely on that sort of algorithm.
01:19:57.000 Well, I think in general, one of the exciting trends that has happened since the film is there's actually many more people trying to build alternatives, social media products, that are not based on these business models.
01:20:08.000 Yeah.
01:20:10.000 I could name a few, but I don't want to be endorsing anything.
01:20:13.000 There's people building Marco Polo, Clubhouse, Wikipedia is trying to build a sort of non-profit version.
01:20:19.000 I always forget the names of these things.
01:20:21.000 But the interesting thing is that for the first time people are trying to build something else because now there's enough people who feel disgusted by the present state of affairs.
01:20:31.000 And that wouldn't be possible unless we created a kind of a cultural movement based on something like the film that reaches a lot of people.
01:20:37.000 It's interesting that you made this comparison to the Environmental Protection Agency because there's kind of a parallel in the way other countries handle the environment versus the way we do and how it makes them competitive.
01:20:48.000 I mean, that's always been the Republican argument for not getting rid of certain fossil fuels and coal and all sorts of things that have a negative consequence, that we need to be competitive with China.
01:21:02.000 We need to be competitive with these other countries that don't have these regulations in effect.
01:21:07.000 The concern would be, well, first of all, the problem is these companies are global, right?
01:21:11.000 Like Facebook is global.
01:21:13.000 If they put these regulations on America but didn't put these regulations worldwide, then wouldn't they use the income and the algorithm in other countries unchecked and have this tremendous negative consequence and gather up all this money?
01:21:30.000 Which is why, just like sugar, it's like everyone around the world has to understand and be more antagonistic.
01:21:34.000 Not like sugar is evil, but just you have to have a common awareness about the problem.
01:21:38.000 But how could you educate people that, like, if you're talking about a country like Myanmar or these other countries that have had these, like, serious consequences because of Facebook, how could you possibly get our ideas across to them if we don't even know their language?
01:21:55.000 And it's just...
01:21:56.000 This system that's already set up in this very advantageous way for them where Facebook comes on their phone.
01:22:02.000 Like, how could you hit the brakes on that?
01:22:04.000 Well, I mean, first of all, I just want to say this is an incredibly hard and depressing problem.
01:22:09.000 We realize just the scale of it, right?
01:22:11.000 Right.
01:22:12.000 You need something like a global—I mean, language-independent, global self-awareness about this problem.
01:22:19.000 Now, again, I don't want to be tooting the horn about the film, but the thing I'm excited about is It launched on Netflix in 190 countries and in 30 languages.
01:22:27.000 You should toot the horn.
01:22:29.000 Toot it.
01:22:31.000 Well, I think the film was seen in 30 languages.
01:22:34.000 So the cool thing is I wish I could show the world my inbox.
01:22:37.000 I think people see the film and they feel like, oh my god, this is huge and I'm a huge problem and I'm all alone.
01:22:43.000 How are we ever going to fix this?
01:22:45.000 But I get emails every day from Indonesia, Chile, Argentina, Brazil, people saying, oh my god, this is exactly what's going on in my country.
01:22:54.000 I mean, I've never felt more optimistic, and I've felt really pessimistic for the last eight years working on this, because there really hasn't been enough movement.
01:23:02.000 But I think for the first time, there's a global awareness now that we could then start to mobilize.
01:23:07.000 I know the EU is mobilizing, Canada is mobilizing, Australia is mobilizing, California state is mobilizing with Prop 24. There's a whole bunch of movement now in the space, and they have a new rhetorical arsenal of, you know, why we have to make this bigger transition.
01:23:21.000 Now, you know, are we going to get all the countries that, you know, where there's the six different major dialects in Ethiopia, where they're going to know about this?
01:23:30.000 I don't think the film was translated into all those dialects.
01:23:33.000 I think we need to do more.
01:23:34.000 It's a really, really hard, messy problem.
01:23:37.000 But on the topic of if we don't do it, someone else will...
01:23:43.000 One interesting thing in the environmental movement was there's a great WNYC radio piece about the history of lead and when we regulated lead.
01:23:54.000 Do you know anything about this?
01:23:55.000 Yeah, I do.
01:23:57.000 I'm curious if this matches up with your experience.
01:24:00.000 My understanding is that obviously lead was this sort of miracle thing.
01:24:03.000 We put it in paint.
01:24:05.000 We put it in gas.
01:24:06.000 It was like, great.
01:24:07.000 And then the way we figured out that we should regulate lead out of our sort of infused product supply...
01:24:17.000 There was this guy who proved that it dropped kids' IQ by four points for every, I think, microgram per deciliter, I think.
01:24:29.000 So in other words, if you had a microgram of lead per deciliter of, I'm guessing, air, it would drop the IQ of kids by four points.
01:24:39.000 And they measured this by actually doing a sample on their teeth or something, because lead shows up in your bones, I think.
01:24:45.000 And they proved that if the IQ points dropped by four points, it would lower future wage-earning potential of those kids, which would then lower the GDP of the country, because it would be shifting the IQ of the entire country down by four points,
01:25:03.000 if not more, based on how much lead is in the environment.
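As a rough illustration of the shape of that argument (exposure, to IQ points, to wages, to GDP), here is a toy back-of-the-envelope in Python. Every number below is a made-up placeholder; the only figure echoed from the conversation is the roughly four points per unit of exposure that Tristan recalls, and even that is his recollection, not a checked citation.

```python
# Toy illustration only: all inputs are hypothetical placeholders.
IQ_POINTS_LOST_PER_UNIT = 4.0        # the "~4 points per microgram per deciliter" Tristan recalls
assumed_avg_exposure = 1.5           # hypothetical average exposure, same units
wage_change_per_iq_point = 0.01      # hypothetical: 1% of lifetime earnings per IQ point
avg_lifetime_earnings = 1_500_000    # hypothetical lifetime earnings, dollars
children_in_cohort = 1_000_000       # hypothetical number of affected children

iq_drop = IQ_POINTS_LOST_PER_UNIT * assumed_avg_exposure
earnings_lost_per_child = iq_drop * wage_change_per_iq_point * avg_lifetime_earnings
cohort_cost = earnings_lost_per_child * children_in_cohort

print(f"IQ drop per child: {iq_drop:.1f} points")
print(f"Lost lifetime earnings per child: ${earnings_lost_per_child:,.0f}")
print(f"Economy-wide cost for the cohort: ${cohort_cost:,.0f}")
```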
01:25:06.000 If you zoom out and say, is social media...
01:25:10.000 Now, let's replace the word IQ, which is also a fraught term because there's, like, a whole bunch of views about how it's designed in certain ways and not others to measure intelligence.
01:25:19.000 Let's replace IQ with problem-solving capacity.
01:25:22.000 What is your problem-solving capacity?
01:25:24.000 Which is actually how they talk about it in this radio episode.
01:25:28.000 And imagine that we have a societal IQ or a societal problem-solving capacity.
01:25:33.000 The US has a societal IQ. Russia has a societal IQ. Germany has a societal IQ. How good is a country at solving its problems?
01:25:42.000 Now imagine that what does social media do to our societal IQ? Well, it distorts our ideas.
01:25:50.000 It gives us a bunch of false narratives.
01:25:52.000 It fills us with misinformation and nonsense.
01:25:55.000 It makes it impossible to agree with each other.
01:25:57.000 And in a democracy, if you don't agree with each other and you can't even do compromise, you have to recognize that politics is invented to avoid warfare, right?
01:26:04.000 Right.
01:26:04.000 So we have compromise and understanding so that we aren't physically violent with each other.
01:26:10.000 We have compromise and conversation.
01:26:12.000 If social media makes compromise, conversation, and shared understanding and shared truth impossible, it doesn't drop our societal IQ by four points.
01:26:21.000 It drops it to zero.
01:26:22.000 Because you can't solve any problem, whether it's human trafficking, or poverty, or climate issues, or racial injustice, whatever it is that you care about.
01:26:33.000 It depends on us having some shared view about what we agree on.
01:26:36.000 I think?
01:26:59.000 And when two people who traditionally disagree actually agree on something, that's what gets boosted to the top of the way that we look at our information feeds.
01:27:08.000 Really?
01:27:09.000 Yeah.
01:27:09.000 So it's about finding consensus where they'd be unlikely and saying, hey, actually, you, Joe, and Tristan, typically you disagree on these six things, but you agree on these three things.
01:27:18.000 And of things that we're going to encourage you to talk about on a menu, we hand you a menu of the things you agree on.
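A rough, hypothetical sketch of that kind of "bridging" ranking, in the spirit of the Polis-style tools associated with Taiwan's digital democracy work: a statement only scores high if people in groups that normally disagree both endorse it, so divisive statements sink and cross-group consensus floats up. The scoring rule, group names, and votes below are illustrative assumptions, not the actual system Audrey Tang's team uses.

```python
def bridging_score(votes_by_group):
    """votes_by_group maps a group name to agree(1)/disagree(0) votes on one
    statement; using the minimum agreement rate across groups means only
    statements with cross-group consensus score high."""
    rates = [sum(votes) / len(votes) for votes in votes_by_group.values() if votes]
    return min(rates) if rates else 0.0

# Illustrative statements and votes (invented for the example).
statements = {
    "Ride-share drivers should carry commercial insurance": {
        "group_a": [1, 1, 1, 0], "group_b": [1, 1, 0, 1]},
    "Ban ride-sharing entirely": {
        "group_a": [1, 1, 1, 1], "group_b": [0, 0, 0, 0]},
}

# Rank statements so the consensus item is surfaced first.
for text, votes in sorted(statements.items(),
                          key=lambda item: bridging_score(item[1]),
                          reverse=True):
    print(f"{bridging_score(votes):.2f}  {text}")
```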
01:27:23.000 And how did they manipulate that?
01:27:26.000 Honestly, we did a great interview with her on our podcast that people can listen to.
01:27:31.000 I think you should have her on.
01:27:32.000 I would love to. But what's your podcast again? Tell people.
01:27:34.000 It's called Your Undivided Attention.
01:27:37.000 And the interview is with Audrey Tang is her name.
01:27:39.000 And I think this is one model of how do you have...
01:27:49.000 Taiwan is such a unique situation, too, right?
01:27:56.000 Because China doesn't recognize them, and there's a real threat that they're going to be invaded by China.
01:28:23.000 I'm sure.
01:28:29.000 Whereas the United States, we're not this tiny island with a looming threat elsewhere.
01:28:33.000 In fact, many people don't know or don't think that there's actually information warfare going on.
01:28:38.000 I actually think it's really important to point out to people that...
01:28:43.000 Social media is one of our biggest national security risks because while we're obsessed with protecting our physical borders and building walls and spending a trillion dollars redoing the nuclear fleet, we left the digital border wide open.
01:28:56.000 Like if Russia or China try to fly a plane into the United States, Our Pentagon and billions of dollars of defense infrastructure from Raytheon and Boeing or whatever will shoot that thing down and it doesn't get in.
01:29:06.000 If they try to come into the country, they'll get stopped by the passport control system, ideally.
01:29:11.000 If Russia or China try to fly an information bomb into the country, instead of being met by the Department of Defense, they're met by a Facebook algorithm with a white glove that says exactly which zip code you want to target.
01:29:23.000 Like, it's the opposite of protection.
01:29:25.000 So social media makes us more vulnerable.
01:29:27.000 I think of it like, if you imagine like a bank that spent billions of dollars, you know, surrounding the bank with physical bodyguards, right?
01:29:35.000 Like, just the buffest guys in every single quarter, you just totally secured the bank.
01:29:39.000 But then you install on the bank a computer system that everyone interacts with.
01:29:44.000 And no one changes the default password from like lowercase password.
01:29:48.000 Anyone can hack in.
01:29:49.000 That's what we do when we install Facebook in our society or you install Facebook in Ethiopia.
01:29:55.000 Because if you think Russia or China, you know, or Iran or South Korea or excuse me, North Korea...
01:30:00.000 Influencing our election is bad.
01:30:02.000 Just keep in mind the dozens of countries throughout Africa where we actually know recently there was a huge campaign that the Stanford Cyber Policy Center did a report on of Russia targeting I think something like seven or eight major countries and disinformation campaigns running in those countries.
01:30:18.000 Or the Facebook whistleblower who came out about a month ago, Sophie Zhang, I think is her name, saying that she personally had to step in to deal with disinformation campaigns in Honduras, Azerbaijan, I think Greece or some other countries like that.
01:30:32.000 So the scale of what these technology companies are managing, they're managing the information environments for all these countries, but they don't have the resources to do it.
01:30:42.000 So they...
01:30:42.000 Not only that, they're not trained to do it.
01:30:44.000 They're not qualified to do it.
01:30:45.000 They're making up as they go along.
01:30:47.000 They're 20 to 30 to 40. And they're way behind the curve.
01:30:49.000 When I had Renee DiResta on, and she detailed all the issues with the Internet Research Agency in Russia and what they did during the 2016 campaign for both sides.
01:31:00.000 I mean, the idea is they just promoted Trump, but they were basically sowing the seeds of...
01:31:07.000 Just the decline of the democracy.
01:31:09.000 They were trying to figure out how to create turmoil.
01:31:12.000 And they were doing it in this very bizarre, calculated way that it didn't seem...
01:31:20.000 It was hard to see, like, what's the endgame here?
01:31:22.000 Well, the endgame is to have everybody fight.
01:31:24.000 I mean, that's really what the endgame was.
01:31:27.000 Yeah.
01:31:27.000 And if I'm one of our major adversaries, after World War II, there was no ability to use kinetic nukes or something on the bigger countries, right?
01:31:37.000 That's all done.
01:31:39.000 So what's the best way to take down the biggest country on the planet, on the block?
01:31:46.000 You use its own internal tensions against itself.
01:31:48.000 This is what Sun Tzu would tell you to do.
01:31:50.000 Yeah.
01:31:51.000 And that's never been easier because of Facebook and because of these platforms being open to do this manipulation.
01:31:58.000 And if I'm looking now, we're four days away from the U.S. elections or something like that when this goes out.
01:32:03.000 Jesus Christ.
01:32:05.000 We have never been more destabilized as a country than we are now.
01:32:10.000 I mean, this is the most destabilized we probably have ever been, I would say.
01:32:14.000 And polarized.
01:32:16.000 Maybe people would argue the Civil War was worse, but in recent history, there is maximum incentive for foreign actors to drive up, again, not one side or the other, but to drive us into conflict.
01:32:30.000 So I think what we all need to do is recognize how much incentive there is to plant stories, to actually sow physical violence on the streets.
01:32:40.000 I think there was just a story, weren't we talking about it this morning, that... Yeah.
01:33:06.000 Very much so.
01:33:08.000 And the Renee DiResta podcast that I did, where she went into depth about all the different ways that they did it, and the most curious one being funny memes.
01:33:18.000 Yeah.
01:33:18.000 There's so many of the memes that you read that you laugh at.
01:33:22.000 Yeah.
01:33:22.000 Well, it was just so weird.
01:33:25.000 They were humorous.
01:33:26.000 And she said she looked at probably 100,000 memes.
01:33:28.000 And the funny thing is you actually can agree with them, right?
01:33:30.000 Yeah, they're funny.
01:33:31.000 They're humorous.
01:33:31.000 You would laugh at them and say, like, oh, you know.
01:33:33.000 And they're being constructed by foreign agents that are doing this to try to mock certain aspects of our society and pit people against each other and create a mockery.
01:33:46.000 And, you know, back in 2016, there was very little collaboration between our defense industry and CIA and DOD and people like that and the tech platforms.
01:33:56.000 And the tech platforms said it's the government's job to deal with it if foreign actors are doing these things.
01:34:00.000 How do you stop something like the IRA? Like, say, if they're creating memes in particular, and they're funny memes.
01:34:07.000 One of the issues that Renee brings up, and I'm just a huge fan of her and her work, is that if I'm China, I don't need to invent some fake news story.
01:34:19.000 I just find someone in your society who's already saying what I want you to be talking about, and I just amplify them up.
01:34:26.000 I take that dial, and I just turn it up to 10. So I find your Texas secessionists, and like, oh, Texas secessionists, that would be a good thing if I'm trying to rip the country apart.
01:34:34.000 So I'm going to take those Texas secessionists and the California secessionists, And I'm just going to dial them up to 10. So those are the ones we hear from.
01:34:41.000 Now, if you're trying to stop me in your Facebook and you're the integrity team or something, on what grounds are you trying to stop me?
01:34:48.000 Because it's your own people, your own free speech.
01:34:50.000 I'm just the one amplifying the one I want to be out there.
01:34:54.000 Right?
01:34:54.000 And so that's what gets tricky about this is I think our moral concepts that we hold so dear of free speech are inadequate in an attention economy that is hackable.
01:35:03.000 And it's really more about what's getting the attention rather than what are individuals saying or can't say.
01:35:10.000 Again, they've created this Frankenstein where they're making mostly automated decisions about who's showing what pattern of behavior, or coordinated inauthentic behavior, here or there, and they're shutting them down.
01:35:20.000 I don't know if people know this.
01:35:21.000 Facebook shut down 2 billion fake accounts.
01:35:24.000 I think this is a stat from a year ago.
01:35:27.000 They shut down 2 billion fake accounts.
01:35:29.000 They have 3 billion active real users.
01:35:31.000 Do you think that those 2 billion were all perfectly identified fake accounts, and that they didn't miss any, or didn't overreach and take some real accounts down with them?
01:35:40.000 Our friend Bret Weinstein, he just got taken down by Facebook.
01:35:43.000 I think you saw that.
01:35:45.000 That seemed calculated, though.
01:35:46.000 Facebook has shut down 5.4 billion fake accounts this year.
01:35:51.000 And that was in November 2019. Oh, my God.
01:35:54.000 Oh, my God.
01:35:55.000 That is insane.
01:35:56.000 That's so many.
01:35:58.000 And so, again, it's the scale that these things are operating at.
01:36:01.000 And that's why, you know, when Bret got his thing taken down, I didn't like that.
01:36:05.000 But it's not like there's this vendetta against Bret, right?
01:36:07.000 Right.
01:36:08.000 Oh, I don't know about that.
01:36:09.000 That seemed to me to be a calculated thing because, you know, Eric actually tweeted about it saying that, you know, you could probably find the tweet because I retweeted it.
01:36:18.000 Like, basically, it was reviewed by a person, so you're lying.
01:36:22.000 He's like, this is not something that was taken down by an algorithm.
01:36:26.000 He believes that it was because it was Unity 2020 platform where they were trying to bring together conservatives and liberals.
01:36:33.000 And try to find some common ground and create, like, a third-party candidate that combines the best of both worlds.
01:36:39.000 I don't understand what policy his Unity 2020 thing was going up against.
01:36:44.000 Like, I have no idea what they would say.
01:36:45.000 It's going against the two-party system.
01:36:46.000 The idea is that it's taking away votes from Biden and that it may help Trump win.
01:36:50.000 They banned him off Twitter as well.
01:36:51.000 You know that, too.
01:36:52.000 They blocked the account or something from him.
01:36:54.000 They banned the account.
01:36:56.000 They banned the entire account.
01:36:56.000 They banned the Unity 2020 account.
01:36:58.000 Yeah.
01:36:59.000 Unity.
01:36:59.000 I mean, literally, unity.
01:37:01.000 They're like, nope, no unity.
01:37:02.000 Fuck you.
01:37:03.000 We want Biden.
01:37:04.000 The political bias on social media is undeniable, and that's maybe the least of our concerns in the long run, but it's a tremendous issue, and it for sure sows the seeds of discontent, creates more animosity, and creates more conflict.
01:37:20.000 The interesting thing is that if I'm one of our adversaries, I see that there is this view that people don't like the social media platforms and I want them to be more...
01:37:29.000 Like, let's say I'm Russia or China.
01:37:30.000 And I'm currently using Facebook and Twitter successfully to run information campaigns.
01:37:35.000 And then I want them...
01:37:36.000 I can actually plant a story so that they end up shutting it down and shutting down conservatives or shutting down one side, which then forces the platforms to open up more so that I then, Russia or China, can keep manipulating even more.
01:37:47.000 I understand.
01:37:48.000 I see what you're saying.
01:37:49.000 So right now, they want it to be a free-for-all where there's no moderation at all because that allows them to get in.
01:37:57.000 And they can weaponize the conversation against itself, right?
01:38:00.000 I don't see a way out of this, Tristan.
01:38:02.000 We have to all be aware of it.
01:38:04.000 But even if we are all aware of it, it seems so pervasive.
01:38:08.000 Yeah.
01:38:09.000 Well, it's not just pervasive.
01:38:11.000 It's like I said, we're 10 years into this hypnosis experiment.
01:38:16.000 This is the largest psychological experiment we've ever run on humanity.
01:38:20.000 It's insane.
01:38:20.000 It is insane.
01:38:21.000 And it's also with tools that never existed before, evolutionarily.
01:38:27.000 So we really are not designed for this.
01:38:29.000 Just the way these brightly lit metal devices and glass devices interact with your brain, they're so enthralling.
01:38:38.000 Right.
01:39:01.000 And it could suck up your time staring at butts.
01:39:04.000 And the things that are necessary for life, like text messaging or looking something up, are infused right next to all of the sort of corrupt stuff.
01:39:15.000 And if you're using it to order food, and if you're using it to get an Uber.
01:39:20.000 But imagine if we all wiped our phones of all the extractive business model stuff and we only had the tools.
01:39:26.000 Well, have you thought about using a light phone?
01:39:29.000 Yeah, it's funny.
01:39:30.000 Those guys used to be brought up in my awareness more often.
01:39:34.000 For those who don't know, it's like a minimalist phone.
01:39:36.000 One of the guys on the documentary is one of the creators of it, right?
01:39:40.000 No, I think you're thinking of Tim Kendall.
01:39:43.000 He's the guy who brought in Facebook's business model of advertising.
01:39:46.000 And he runs a company now called Moment that shows you the number of hours you spend on different apps and helps you use it less.
01:39:53.000 I thought someone involved in...
01:39:55.000 In the documentary was also a part of the light phone team.
01:39:59.000 No, not officially.
01:40:01.000 No, I don't think so.
01:40:02.000 But the light phone is basically a thin black and white phone thing.
01:40:06.000 There's text.
01:40:06.000 And I think it plays music now, which I was like, oh, that's a mistake.
01:40:11.000 That's a slippery slope.
01:40:13.000 That's the thing.
01:40:13.000 And we have to all be comfortable with losing access to things that we might love.
01:40:18.000 Like, oh, maybe you do want to take notes this time, but you don't have your full keyboard to do that.
01:40:21.000 And are you willing to do that?
01:40:22.000 I think the thing is, one thing people can do is just take like a digital Sabbath one day a week off completely.
01:40:27.000 Because imagine if you got several hundred million people to do that.
01:40:31.000 That drops the revenue of these companies by like 15%.
01:40:34.000 Because that's one out of seven days that you're not on the system, so long as you don't rebalance and...
01:40:39.000 I'm inclined to think that Apple's solution is really the way out of this.
01:40:44.000 To opt out of all sharing of your information.
01:40:49.000 And if they could come up with some sort of a social media platform that kept that as an ethic, I mean, it might allow us to communicate with each other, but stop all this algorithm nonsense.
01:41:01.000 Look, if anybody has the power to do it, they have so much goddamn money.
01:41:05.000 Totally.
01:41:05.000 Well, and also, people talk about the government regulating these platforms, but Apple is kind of the government that can regulate the attention economy.
01:41:14.000 Yes.
01:41:14.000 Because when they do this thing we talked about earlier of saying, do you want to be tracked?
01:41:19.000 Right?
01:41:19.000 And they give you this option.
01:41:20.000 When like 99% of people are going to say, no, I don't want to be tracked.
01:41:22.000 Right.
01:41:23.000 When they do that, they just put a 30% tax on all the advertising-based businesses because now you don't get as personalized an ad, which means they make less money, which means that business model is less attractive to venture capitalists to fund the next thing.
01:41:35.000 So they're actually enacting a kind of a carbon tax, but it's like on the polluting stuff.
01:41:42.000 They're enacting a kind of tax on the social media polluting stuff, taxing it by 30%, but they could do more than that.
01:41:47.000 Imagine: they have this 30-70 split, where app developers get 70% of the revenue when you buy stuff and Apple keeps 30%.
01:41:55.000 They could modify that percentage based on how much social value that those things are delivering to society.
01:42:04.000 This gets a little bit weird, and people may not like this, but if you think about who's the real customer that we want to be, how do we want things oriented?
01:42:11.000 If I'm an app developer, I want to make money the more I'm helping society and helping individuals, not how much I'm extracting and stealing their time and attention.
01:42:19.000 And imagine that governments in the future actually paid some kind of budget into, let's say, the app store.
01:42:25.000 There's antitrust issues with this, but you pay money into the app store.
01:42:28.000 And then as apps started helping people with more social outcomes, like let's say learning programs or schools or things like Khan Academy, things like this, that more money flows in the direction of where people got that value.
01:42:40.000 And that revenue split between Apple and the app developers ends up going more to things that end up helping people, as opposed to things that were just good at capturing attention and monetizing zombie behavior.
01:42:51.000 One of my favorite lines in the film is Justin Rosenstein, from the Like button, saying that, you know, so long as a whale is worth more dead than alive, and a tree is worth more as lumber and two-by-fours than as a living tree,
01:43:07.000 now we're the whale, we're the tree, we're worth more when we have predictable zombie-like behaviors, when we're more addicted, distracted, outraged, polarized, and disinformed than if we're a living, thriving citizen or a growing child that's like playing with their friends.
01:43:24.000 And I think with that kind of distinction, just like we protect national parks or, you know, certain fisheries where we don't kill the whales, we have to call out and protect what's sacred to us now.
01:43:37.000 Yeah, it's an excellent message.
01:43:41.000 My problem that I see is that I just don't know how well that message is going to be absorbed by the people that are already in the trance.
01:43:52.000 I think it's so difficult for people to put things down.
01:43:55.000 Like I was telling you how difficult it is for me to tell my friends don't read the comments.
01:43:59.000 Right.
01:44:00.000 Right.
01:44:00.000 It's hard to have that kind of discipline, and it's hard to have that kind of...
01:44:04.000 Because people do get bored, and when they get bored, like, if you're waiting in line for somewhere, you pull out your phone.
01:44:10.000 You're at the doctor's office, you pull out your phone.
01:44:14.000 Totally.
01:44:15.000 I mean, and that's why...
01:44:16.000 And I do that, right?
01:44:18.000 Yeah, I do too.
01:44:18.000 This is incredibly hard.
01:44:21.000 Back in the day...
01:44:22.000 When I was at Google, I tried to change Google from the inside for two years before leaving.
01:44:27.000 What was it like there?
01:44:28.000 Please share your experiences because when you said you tried to change it from the inside, what kind of resistance were you met with and what was their reaction to these thoughts that you had about the unbelievable negative consequences?
01:44:42.000 Well, this is in 2013, so we didn't know about all the negative consequences.
01:44:46.000 But you saw the writing on the wall, at least some of it.
01:44:48.000 Some of it, yeah.
01:44:49.000 I mean, the notion that things were competing for attention, which would mean that they would need to compete to get more and more persuasive and hack more and more of our vulnerabilities, and that that would grow, that was the core insight.
01:44:59.000 I didn't know that it would lead to polarization or conspiracy theory, like, recommendations.
01:45:03.000 But I did know, you know, more addiction, kids having less, you know, weaker relationships...
01:45:09.000 When did it occur to you?
01:45:11.000 What were your initial feelings?
01:45:14.000 I was on a hiking trip in the Santa Cruz Mountains with our co-founder now, Aza Raskin.
01:45:20.000 It's funny enough, our co-founder Aza, his dad was Jef Raskin, who started the Macintosh project at Apple.
01:45:26.000 I don't know if you know the history there.
01:45:28.000 He started the Macintosh project and actually came up with the word humane to describe the humane interface.
01:45:34.000 And that's where our name and our work comes from, is from his father's work.
01:45:37.000 He and I were in the mountains in Santa Cruz and just experiencing nature and just...
01:45:43.000 I came back and realized all of this stuff that we've built is just distracting us from the stuff that's really important.
01:45:50.000 And that's when, coming back from that trip, I made the first Google deck that then spread virally throughout the company, saying, never before in history have 50 designers,
01:46:00.000 you know, white 20 to 35 year old engineers who look like me, held the collective psyche of humanity.
01:46:07.000 And then that presentation was released and about, you know, 10,000 people at Google saw it.
01:46:12.000 It was actually the number one... We need to do something about this.
01:46:37.000 It was just very hard to get momentum on it.
01:46:40.000 And really the key interfaces to change within Google are Chrome and Android because those are the neutral portals into which you're then using apps and notifications and websites and all of that.
01:46:51.000 Like those are the kind of governments of the attention economy that Google runs.
01:46:55.000 And when you worked there, did you have to use Android?
01:47:01.000 Was it part of the requirement to work there?
01:47:04.000 No, I mean a lot of people had Android phones.
01:47:05.000 I still used an iPhone.
01:47:07.000 Was it an issue?
01:47:09.000 No.
01:47:09.000 No, I mean, people, because they realized that they needed products to work on all the phones.
01:47:14.000 I mean, if you work directly on Android, then you would have to use an Android phone.
01:47:17.000 But we tried to get, you know, some of those things like the screen time features that are now launched, you know, so everyone now has on their phone, like, it shows you the number of hours or whatever.
01:47:26.000 Is that on Android as well?
01:47:27.000 It is, yeah.
01:47:28.000 And actually that came, I think, as a result of this advocacy.
01:47:30.000 And that's shipping on a billion phones, which shows you you can change this stuff.
01:47:34.000 That goes against their financial interest.
01:47:37.000 People spending less time on their phones, getting less notifications.
01:47:40.000 It sort of does, but it doesn't work.
01:47:42.000 Well, correct.
01:47:43.000 So it doesn't actually work is the thing.
01:47:44.000 Yeah.
01:47:44.000 And let's separate the intention and the fact that they did it from the behavioral...
01:47:48.000 It's like labels on cigarettes that tell you it's going to give you cancer.
01:47:50.000 Like, by the time you're buying them, you're already hooked.
01:47:52.000 Correct.
01:47:53.000 I mean, it's even worse.
01:47:55.000 Imagine, like, every cigarette box had, like, a little pencil inside so you can mark.
01:48:01.000 There's, like, little streaks that said the number of days in a row you haven't smoked, and you could, like, mark each day.
01:48:04.000 It's, like, it's too late, right?
01:48:06.000 Yeah.
01:48:07.000 It's just the wrong paradigm.
01:48:09.000 Mm-hmm.
01:48:10.000 The fundamental thing we have to change is the incentives and how money flows.
01:48:13.000 Because we want money flowing in the direction of the more these things help us.
01:48:17.000 Like, let me give you a concrete example.
01:48:19.000 Like, let's say you want to learn a musical instrument.
01:48:22.000 And you go to YouTube to pick up ukulele or whatever.
01:48:26.000 And you're seeing how to play the ukulele.
01:48:28.000 Like, from that point, in a system that was designed in a humane and sort of time-well-spent kind of way...
01:48:34.000 It would really ask you, instead of saying, here's 20 more videos that are going to just like suck you down a rabbit hole, it would sort of be more oriented towards what do you really need help with?
01:48:42.000 Like, do you need to buy ukulele?
01:48:44.000 Here's a link to Amazon to get the ukulele.
01:48:45.000 Are you looking for a ukulele teacher?
01:48:47.000 Let me do a quick scan on your Facebook or Twitter search to find out which of those people are ukulele teachers.
01:48:52.000 Do you need instant like tutoring?
01:48:54.000 Because there's actually this service you never heard of called Skillshare or something like that, where you can get instant ukulele tutoring.
01:48:59.000 And if we're really designing these things to be about what would most help you next... You know, we're only as good as the choices on life's menu, and right now the menu is, here's something else to addict you and keep you hooked, instead of, here's a next step that would actually be on the trajectory of helping people live their lives better.
01:49:17.000 But you'd have to incentivize the companies because like there's so much incentive on getting you addicted because there's so much financial reward.
01:49:23.000 What would be the financial reward that they could have to get you something that would be helpful for you like lessons or this?
01:49:31.000 I mean, so one way that that could work is like, let's say people pay a monthly subscription of like, I don't know, 20 bucks a month or something.
01:49:37.000 That's never going to work.
01:49:38.000 I get you.
01:49:39.000 But like, let's say you pay some, you put money into a pot where the pot...
01:49:43.000 But then we have the problem.
01:49:45.000 The problem is like...
01:49:46.000 It costs some money versus free.
01:49:48.000 Like there was a company that still exists for now that was trying to do the Netflix of podcasting.
01:49:54.000 Oh, uh-huh.
01:49:54.000 And they approached us and they're like, we're just going to get all these people together and people are going to pay to use your podcast.
01:50:01.000 I'm like, why would they do that when podcasts are free?
01:50:03.000 Yeah.
01:50:04.000 That's one of the reasons why podcasts work is because they're free.
01:50:06.000 Right.
01:50:07.000 When things are free, they're attractive.
01:50:10.000 It's easy.
01:50:11.000 When things cost money, you have to have something that's extraordinary, like Netflix.
01:50:15.000 When you say the Netflix of podcasting, well, Netflix makes their own shows.
01:50:20.000 They spend millions of dollars on special effects and all these different things, and they're really enormous projects.
01:50:27.000 You're just talking about people talking shit, and you want money.
01:50:31.000 Right.
01:50:31.000 Well, that's the thing.
01:50:32.000 We have to actually deliver something that is totally qualitatively better.
01:50:36.000 And it would also have to be someone like you, or someone who's really aware of the issues that we're dealing with with addiction to social media, saying this is the best possible alternative.
01:50:48.000 Like in this environment, yes, you are paying a certain amount of money per month, but maybe that could get factored into your cell phone bill.
01:50:58.000 And maybe with this sort of an ecosystem, you're no longer being drawn in by your addictions and, you know, it's not playing for your attention span.
01:51:09.000 It's rewarding you in a very productive way.
01:51:14.000 15% more of your time was just way better spent.
01:51:18.000 You were actually doing the things you cared about.
01:51:21.000 And it actually helped improve your life.
01:51:23.000 Yeah.
01:51:23.000 Imagine when you use email.
01:51:24.000 If it was truly designed.
01:51:25.000 I mean, forget email.
01:51:26.000 People don't relate to that because email isn't that popular.
01:51:28.000 But whatever it is that's a huge time sink for you.
01:51:31.000 For me, email is a huge one.
01:51:32.000 For me, web browsing or whatever is a big one.
01:51:35.000 Imagine that those things were so much better designed that I actually wrote back to the right emails and mostly didn't think about the rest, so that when I was spending time on whatever I was spending time on, more and more of my life was a life well lived and time well spent.
01:51:50.000 That's like the retrospective view.
01:51:52.000 I keep going to Apple because I think they're the only technology company that does have these ethics to sort of protect privacy.
01:51:59.000 Have you thought about coming to them?
01:52:01.000 Yeah.
01:52:02.000 Have you?
01:52:03.000 Well, I mean, I think that they've made great first steps.
01:52:09.000 And they were the first, along with Google, to do the screen time management stuff.
01:52:13.000 But that was just the...
01:52:15.000 Barely scratching the surface, like baby, baby, baby steps.
01:52:18.000 What we really need them to do is radically reimagine how those incentives and how the phone fundamentally works.
01:52:25.000 So it's not just all these colorful icons.
01:52:27.000 And one of the problems, they do have a disincentive, which is a lot of their revenue comes from gaming.
01:52:30.000 And as they move more into Apple TV, competing with HBO and Hulu and Netflix and that whole thing, they need subscriptions, because Apple's revenue on devices and hardware is sort of maxing out.
01:52:41.000 And where they're going to get their next bout of revenue to keep their stock price up is on these subscriptions.
01:52:46.000 I'm less concerned with those addictions.
01:52:48.000 I'm less concerned with gaming addictions than information addictions because at least it's not fundamentally altering your view of the world.
01:52:55.000 Right.
01:52:55.000 It's eroding democracy and making it impossible to agree.
01:52:57.000 And this is coming from a person that's had legitimate video game addictions in the past.
01:53:03.000 But my wife is addicted to Subway Surfer.
01:53:06.000 I don't know what it is.
01:53:07.000 It's a crazy game.
01:53:08.000 It's like you're riding on the top of subways and you're jumping around.
01:53:12.000 It's really ridiculous.
01:53:13.000 But it's fun.
01:53:13.000 You watch it like, whoa.
01:53:15.000 But I don't fuck with video games.
01:53:16.000 But I watch it and those games at least are enjoyable.
01:53:24.000 There's something silly about it.
01:53:25.000 Like, ah, fuck!
01:53:27.000 And then you start doing it again.
01:53:28.000 When I see people getting angry about things on social media, I don't see the upside.
01:53:33.000 Right.
01:53:34.000 I don't mind them making a profit off games.
01:53:37.000 There is an issue, though, with games that addict children, and then these children can spend money on Roblox, and on all these different things you can spend money on.
01:53:49.000 You wind up having these enormous bills.
01:53:52.000 You leave your kid with an iPad, and you come back, you have a $500 bill.
01:53:56.000 What did you do?
01:53:57.000 Yeah.
01:53:57.000 This is an issue, for sure.
01:54:00.000 But at least it's not an issue in that it's changing their view of the world.
01:54:06.000 And I feel like there's a way for—I keep going back to Apple—but a company like Apple to rethink the way—you know, they already have a walled garden, right, with iMessage and FaceTime and all this different— They can totally build those things out.
01:54:21.000 I mean, iMessage and iCloud could be the basis for some new neutral social media that's not based on instant social approval and rewards, right?
01:54:30.000 They can make it easier to share information with small groups of friends and have that all synced.
01:54:33.000 And even, you know, in the pre-COVID days, I was thinking about Apple a lot.
01:54:37.000 I think you're right, by the way, to really poke on them.
01:54:39.000 I think they're the one company that's in a position to lead on this.
01:54:42.000 And they also have a history of thinking along those lines.
01:54:46.000 You know, they had this feature that's kind of hidden now, but the Find My Friends, right?
01:54:50.000 They just call it Find My now.
01:54:51.000 It's all buried together so you can find your devices and find your friends.
01:54:54.000 But in a pre-COVID world, imagine they really built out the, you know, where are my friends right now and making it easier to know when you're nearby someone so you can more easily get together in person.
01:55:05.000 Because right now, to the extent Facebook wants to bring people closer together...
01:55:08.000 They don't want to, and again, this is pre-COVID, but they don't want to incentivize lots and lots of Facebook events.
01:55:14.000 They really care about groups that keep people posting online and looking at ads, because for the category of bringing people closer together, they want to do the online, screen-time-based version of that as opposed to the offline one.
01:55:24.000 Right.
01:55:24.000 Apple, by contrast, if you had little iMessage groups of friends, you could say, hey, does everyone in this little group want to opt into being able to see where we all are on, say, weekdays between 5 and 8 p.m.
01:55:37.000 or something like that?
01:55:38.000 So you could time-bound it and make it easier for serendipitous connection and availability to happen.
01:55:43.000 That's hard to do.
01:55:44.000 It's hard to design that.
01:55:45.000 But there's things like that that Apple's in a position to do if it really took on that mantle.
01:55:50.000 And I think as people get more and more skeptical of these other products, they're in a better and better position to do that.
01:55:56.000 One of the antitrust issues is do we want a world where our entire well-being as a society depends on what one massive corporation worth over a trillion dollars does or doesn't do?
01:56:05.000 We need more openness to try different things.
01:56:09.000 And we're really at the behest of whether one or two companies, Apple or Google...
01:56:39.000 You know, we have a hugely emitting, society-ruining kind of business model in this attention-extractive paradigm, and we could, long term, put sort of a progressive tax on that to transition to some other thing.
01:56:53.000 The government could do that, right?
01:56:55.000 That's not like, who do we censor?
01:56:56.000 It's how do we disincentivize these businesses and make them pay for the sort of life support systems of society that they've ruined.
01:57:02.000 A good example of this, I think, is in Australia.
01:57:06.000 I think it's Australia that's regulated that Google and Facebook have to pay the publishers who they're basically hollowing out.
01:57:12.000 Because one of the effects we've not talked about is the way that Google and Facebook have hollowed out the fourth estate in journalism.
01:57:18.000 I mean, journalism has been hollowed out, and local news websites can't make any money except by basically producing clickbait.
01:57:24.000 So even to the extent that local newspapers exist, they only exist by basically clickbaitification of lower and lower paid workers who are just generating content farms.
01:57:35.000 Anyway, so that's an example of if you force those companies to pay to revitalize the fourth estate and to make sure we have a very sustainably funded fourth estate that doesn't have to produce this clickbait stuff, that's another direction.
01:57:50.000 Yeah, that's interesting that they have to pay.
01:57:55.000 I mean, these are the wealthiest companies in, like, the history of humanity, right?
01:57:59.000 So that's the thing.
01:57:59.000 So we shouldn't be cautious about how much they should have to pay.
01:58:03.000 Except we also don't want it to happen on the other end, right?
01:58:05.000 You don't want to have a world where, you know, we have Roundup making a crazy amount of money from giving everybody cancer and lymphoma from, you know, the chemicals.
01:58:15.000 Right, glyphosates.
01:58:16.000 And then they pay everybody on the other end after a lawsuit of a billion dollars, but now everyone's got cancer.
01:58:21.000 Let's actually do it in a way.
01:58:22.000 So we don't want a world where Facebook and Google profit off of the erosion of our social fabric, and then they pay us back for it.
01:58:29.000 How do you quantify how much money they have to pay to journalism?
01:58:34.000 Yeah.
01:58:35.000 It seems like it's almost a form of socialism.
01:58:38.000 Yeah.
01:58:39.000 Yeah, I mean, this is where the lead and IQ example is interesting, because they were able to disincentivize and tax the lead producers, because they were able to produce research on how much lead exposure lowered the wage-earning potential of the entire population.
01:58:54.000 I mean, how much does this cost our society?
01:58:56.000 We used to say free is the most expensive business model we've ever created because we get the free downgrading of our attention spans, our mental health, our kids, our ability to agree with each other, our capacity to do anything as a democracy.
01:59:09.000 Like, we got all that for free.
01:59:11.000 Wonderful.
01:59:11.000 Obviously, we get lots of benefits, and I want to acknowledge that.
01:59:14.000 But that's just not sustainable.
01:59:16.000 It's a real question.
01:59:17.000 I mean, right now, we're...
01:59:19.000 We have huge existential problems.
01:59:21.000 We have a global competition, power competition going on.
01:59:23.000 I think China just passed the GDP of the US, I believe.
01:59:29.000 If we care about the US having a future in which it can lead the world in some meaningful and enlightened way, we have to deal with this problem.
01:59:39.000 And we have to have a world where digital democracy outcompetes digital authoritarianism.
01:59:44.000 Which is the China model.
01:59:45.000 And right now that builds more coherence and is more efficient and doesn't evolve the way that our current system, you know, does.
01:59:52.000 I think Taiwan, Estonia, and countries like that where they are doing digital democracies are good examples that we can learn from.
01:59:58.000 But we're behind right now.
02:00:00.000 Well, China also has a really fascinating situation with Huawei.
02:00:05.000 Where Huawei has been cut off from Google.
02:00:08.000 So you can't have Google applications on Huawei phones.
02:00:12.000 So now Huawei is creating their own operating system.
02:00:15.000 And they have their own ecosystem now that they're building up.
02:00:21.000 You know, it's weird that there's only a few different operating systems now.
02:00:25.000 I mean, there's a very small amount of people that are using Linux phones.
02:00:30.000 Then you have a large amount of people using Android and iPhones.
02:00:33.000 And if China becomes...
02:00:55.000 Tons.
02:00:56.000 Yeah.
02:00:58.000 It's weird.
02:00:59.000 When you see this, it feels so futile for me on the outside looking in, but you're working on this.
02:01:11.000 How long do you anticipate this is going to be a part of your life?
02:01:14.000 I mean, what does it feel like to you?
02:01:21.000 I mean, it's not easy, right?
02:01:27.000 The film ends with this question, do you think we're going to get there?
02:01:31.000 I just say we have to.
02:01:34.000 If you care about this going well, I wake up every day and I ask, what will it take for this whole thing to go well?
02:01:41.000 And how do we just orient each of our choices as much as possible towards this going well?
02:01:45.000 And we have a whole bunch of problems.
02:01:47.000 I do look a lot at the environmental issues, the permafrost, methane bombs.
02:01:51.000 The timelines that we have to deal with certain problems are crunching, and we also have certain dangerous exponential technologies that are emerging, decentralization of, you know, CRISPR, and, like, there's a lot of existential threats.
02:02:02.000 I hang out a lot with the sort of existential threats community.
02:02:06.000 It's going to take...
02:02:06.000 It must be a lot of fun.
02:02:08.000 There's a lot of psychological problems in that community, actually.
02:02:11.000 A lot of depression.
02:02:12.000 I can only imagine.
02:02:13.000 There's some suicide as well.
02:02:15.000 It's, you know, it's...
02:02:18.000 It's hard, but I think we each have a responsibility when you see this stuff to say, what will it take for this to go well?
02:02:26.000 And I will say that, really, seeing the film impact people the way that it has... I used to feel like, oh my god, how are we ever going to do this?
02:02:35.000 No one cares.
02:02:35.000 Like, none of the people know.
02:02:37.000 At the very least, we now have about 40 to 50 million people who are at least introduced to the problem.
02:02:44.000 The question is, how do we harness them into a collective movement?
02:02:47.000 And that's what we're trying to do next.
02:02:51.000 I'll say also, these issues get more and more weird over time.
02:02:55.000 Our co-founder Aza Raskin will say that it's making reality more and more virtual over time.
02:02:59.000 Because we haven't talked about how, as technology gets better and better at hacking our weaknesses, we start to prefer it over the real thing.
02:03:08.000 We start, for example, there's a recent company, VC-funded, that I think is worth over $125 million.
02:03:15.000 And what they make are virtual influencers.
02:03:17.000 So these are, like, virtual people, virtual video, that is more entertaining, more interesting, and that fans like more than real people.
02:03:29.000 Oh boy.
02:03:30.000 And it's kind of related to the kind of deepfake world, right?
02:03:33.000 Where like people prefer this to the real thing.
02:03:34.000 And Sherry Turkle, you know, who's been working at MIT, wrote the book Reclaiming Conversation and Alone Together.
02:03:39.000 She's been talking about this forever, that over time, humans will prefer connection to robots and bots.
02:03:46.000 And the computer-generated thing more than the real thing.
02:03:48.000 Think about AI-generated music being more...
02:03:52.000 It'll start to sweeten our taste buds and give us exactly that thing we're looking for better than we will know ourselves.
02:03:57.000 Just like YouTube can give us the perfect next video that actually every bone in our body will say, actually, I kind of do want to watch that, even though it's a machine pointed at my brain calculating the next thing.
02:04:06.000 There's an example from Microsoft, which built this chatbot called Xiaoice, I don't know how to pronounce it, and after nine weeks people preferred that chatbot to their real friends.
02:04:15.000 And 10 to 25% of their users actually said, I love you, to the chatbot.
02:04:20.000 Oh boy.
02:04:20.000 And there are several who actually said it convinced them not to commit suicide, to have this relationship with this chatbot.
02:04:26.000 So it's her.
02:04:27.000 Yeah.
02:04:27.000 It's her.
02:04:28.000 It's the movie.
02:04:29.000 Exactly.
02:04:29.000 So all these things are the same, right?
02:04:31.000 We're veering into a direction where technology, if it's so good at meeting these underlying paleolithic emotions that we have, the way out of it is we have to see that this is what's going on.
02:04:44.000 We have to see and reckon with ourselves, saying, this is how I work.
02:04:47.000 I have this negativity bias.
02:04:48.000 If I get those 99 positive comments and one negative comment,
02:04:52.000 My mind is going to go to the negative.
02:04:53.000 I don't see that.
02:04:54.000 I see you in the future wearing an overcoat.
02:04:58.000 You are literally Laurence Fishburne in The Matrix trying to tell people to wake up.
02:05:04.000 Well, there's a line in The Social Dilemma where I say, how do you wake up from The Matrix if you don't know you're in The Matrix?
02:05:11.000 That is the issue, right?
02:05:13.000 Even in the Matrix, we at least had a shared Matrix.
02:05:15.000 The problem now is that in the Matrix, each of us have our own Matrix.
02:05:19.000 That's the real kicker.
02:05:20.000 I struggle with the idea that this is all inevitable because this is a natural course of progression with technology and that it's sort of figuring out the best way to...
02:05:48.000 That this is just how life is, and this is how life always should be.
02:05:51.000 But this is just all we've ever known.
02:05:53.000 It's all we've ever known.
02:05:54.000 Einstein didn't write into the laws of physics that social media has to exist for humanity, right?
02:05:58.000 We've gotten rid—again, the environmental movement is a really interesting example, because we passed all sorts of laws.
02:06:03.000 We got rid of lead.
02:06:04.000 We've changed some of our pesticides. We were slow on some of these things because of corporate interests and the asymmetric power of large corporations. And I want to say markets and capitalism are great, but when you have asymmetric power for predatory systems that cause harm,
02:06:20.000 they're not going to terminate themselves.
02:06:22.000 They have to be bound in by the public, by culture, by the state.
02:06:27.000 And we just have to point to the examples where we've done that.
02:06:30.000 And in this case, I think the problem is that how much of our stock market is built on the back of like five companies generating a huge amount of wealth.
02:06:39.000 So this is similar.
02:06:40.000 I don't mean to make this example, but there's a great book by...
02:06:45.000 Adam Hochschild called Bury the Chains, which is about the British abolition of slavery. He talks about how, for the British Empire, when people collectively woke up and said, this is an abhorrent practice that has to end,
02:07:01.000 at that time, in the 1700s and 1800s in Britain, slavery was what powered the entire economy.
02:07:08.000 It was free labor for a huge percentage of the economy.
02:07:11.000 So if you say, we can't do this anymore, we have to stop this, how do you decouple when your entire economy is based on slavery?
02:07:19.000 And the book is actually inspiring because it tracks a collective movement that networked all these different groups, the Quakers in the U.S., the people testifying before Parliament, the former slaves who gave firsthand accounts, the graphics and art. People had never seen what it looked like on a slave ship.
02:07:37.000 And so by making the invisible visceral and showing just how abhorrent this stuff was, over a period of about 60 to 70 years, the British Empire had to drop their GDP by 2% every year for 60 years, and was willing to do that, to get off of slavery.
02:07:52.000 Now, I'm not making a moral equivalence.
02:07:54.000 I want to be really clear for everybody taking things out of context.
02:07:57.000 Sure.
02:07:58.000 But just that it's possible for us to do something that isn't just in the interest of economic growth.
02:08:04.000 And I think that's the real challenge.
02:08:06.000 That's actually something that should be on the agenda, which is one of the major tensions is economic growth being in conflict with dealing with many of our problems, whether it's some of the environmental issues or some of the technology issues we're talking about right now.
02:08:20.000 Artificial intelligence is something that people are terrified of as an existential threat.
02:08:25.000 They think of it as one day you're going to turn something on and it's going to be sentient and it's going to be able to create other forms of artificial intelligence that are exponentially more powerful than the one that we created and that will have unleashed this beast that we cannot control.
02:08:41.000 What my concern is with all of this...
02:08:43.000 We've already gotten there.
02:08:43.000 Yeah, that's my concern.
02:08:45.000 My concern is that this is a slow acceptance of drowning.
02:08:51.000 It's like a slow, oh, we're okay, I'm only up to my knees, oh, it's fine, it's just my waist high, I can still walk around.
02:08:58.000 It's very much like the frog in boiling water, right?
02:09:00.000 Exactly, exactly.
02:09:01.000 It seems like...
02:09:03.000 This is like humans have to fight back to reclaim our autonomy and free will from the machines.
02:09:10.000 I mean, one clear...
02:09:11.000 Okay, Neo.
02:09:13.000 It's very much the Matrix.
02:09:14.000 It's Neo.
02:09:15.000 One of my favorite lines is actually when the Oracle says to Neo, and don't worry about the vase.
02:09:19.000 And he says, what vase?
02:09:20.000 And he knocks it over.
02:09:21.000 It says, that vase.
02:09:22.000 And so it's like, she's the AI who sees so many moves ahead in the chessboard.
02:09:26.000 She can say something which will cause him to do the thing that verifies the thing that she predicted would happen.
02:09:31.000 Yeah.
02:09:31.000 That's what AI is doing now, except it's pointed at our nervous system and figuring out the perfect thing to dangle in front of our dopamine system and get the thing to happen, which instead of knocking off the vase is to be outraged at the other political side and be fully certain that you're right, even though it's just a machine that's calculating shit that's going to make you,
02:09:48.000 you know, do the thing.
02:09:50.000 When you're concerned about this, how much time do you spend thinking about simulation theory?
02:09:54.000 The simulation?
02:09:55.000 Yeah.
02:09:55.000 Yeah, the idea that if not currently, one day there will be a simulation that's indiscernible from regular reality.
02:10:03.000 And it seems we're on that path.
02:10:04.000 I don't know if you mess around with VR at all, but...
02:10:07.000 Well, this is the point about the virtual chatbots out-competing our real friends and the technology.
02:10:12.000 I mean, that's what's happening is that reality is getting more and more virtual because we interact with a virtual news system that's all this sort of clickbait economy, outrage machine that's already a virtual political environment that then translates into real-world action and then becomes real.
02:10:28.000 And that's the weird feedback loop.
02:10:29.000 Go back to 1990, whatever it was, when the internet became mainstream, or at least started becoming mainstream, and the small amount of time that it took, the 20 plus years, to get to where we are now, and then think, what about the virtual world?
02:10:45.000 And once this becomes something that's...
02:11:22.000 We seem to be a thing that creates newer, better, and more effective objects.
02:11:22.000 But we have to realize AI is not conscious and won't be conscious the way we are.
02:11:26.000 And so many people think that...
02:11:29.000 But is consciousness essential?
02:11:30.000 I think so.
02:11:32.000 To us?
02:11:33.000 I don't know.
02:11:34.000 Essential in the sense that we're the only ones who have it?
02:11:35.000 No, I don't know that.
02:11:36.000 Well, no.
02:11:37.000 There might be more things that have consciousness, but...
02:11:40.000 Is it essential?
02:11:42.000 I mean, to the extent that choice exists, it would exist through some kind of consciousness.
02:11:47.000 Is choice essential?
02:11:54.000 It's essential to us, to life as we know it.
02:11:54.000 But my worry is that we're inessential.
02:11:58.000 We're thinking now like single-celled organisms, being like, hey, I don't want to gang up with a bunch of other cells and become an object that can walk.
02:12:06.000 I like being a single-celled organism.
02:12:08.000 This is a lot of fun.
02:12:09.000 I mean, I hear you saying, you know, are we a bootloader for the AI that then runs the world?
02:12:14.000 That's Elon's perspective.
02:12:16.000 I mean, I think this is a really dangerous way to think.
02:12:18.000 I mean, we have to...
02:12:19.000 Yeah, so are we then the species...
02:12:20.000 Dangerous for us.
02:12:21.000 Yeah.
02:12:22.000 I mean, are...
02:12:22.000 But what if the next version of what life is is better?
02:12:26.000 But the next version being run by machines that have no values, that don't care, that don't have choice, and are just maximizing for things that were programmed in by our little miniature brains anyway.
02:12:34.000 But they don't cry...
02:12:35.000 They don't commit suicide.
02:12:37.000 But then consciousness and life dies.
02:12:39.000 That could be the future.
02:12:41.000 I think this is the last chance to try to snap out of that.
02:12:44.000 Is it important in the eyes of the universe that we do that?
02:12:46.000 I don't know.
02:12:47.000 It feels important.
02:12:48.000 How does it feel to you?
02:12:49.000 It feels important, but I'm a monkey.
02:12:52.000 You know, the monkey's like, God, I'm staying in this tree, man.
02:12:55.000 You guys are out of your fucking mind.
02:12:56.000 I mean, this is the weird paradox of being human is that, again, we have these lower level emotions.
02:13:01.000 We care about social approval.
02:13:02.000 We can't not care.
02:13:03.000 At the same time, like I said, there's this weird proposition here.
02:13:06.000 We're the only species that, if this were to happen to us, would have the self-awareness to even know that it was happening.
02:13:13.000 This two-hour interview, we can conceptualize that this thing has happened to us.
02:13:18.000 That we have built this matrix, this external object, which has AI and supercomputers and voodoo doll versions of each of us.
02:13:25.000 And it has perfectly figured out how to predictably move each of us in this matrix.
02:13:30.000 Let me propose this to you.
02:13:31.000 We are what we are now, human beings, homo sapiens in 2020. We are this thing that, if you believe in evolution, I'm pretty sure you do, we've evolved over the course of millions of years to become who we are right now.
02:13:44.000 Should we stop right here?
02:13:45.000 Are we done?
02:13:47.000 No, right?
02:13:48.000 We should keep evolving.
02:13:50.000 What does that look like?
02:13:55.000 What does it look like if we go ahead?
02:13:57.000 Just forget about social media.
02:13:59.000 What would you like us to be in a thousand years or a hundred thousand years or five hundred thousand years?
02:14:07.000 You certainly wouldn't want us to be what we are right now, right?
02:14:10.000 No one would.
02:14:11.000 No, I mean, I think this is what visions of Star Trek and things like that were trying to ask, right?
02:14:15.000 Like, hey, let's imagine humans do make it, and we become the most enlightened we can be, and we actually somehow make peace with these other, you know, alien tribes, and we figure out, you know, space travel and all of that.
02:14:27.000 I mean...
02:14:28.000 Actually, a good heuristic that I think people can ask is, on an enlightened planet where we did figure this out, what would that have looked like?
02:14:35.000 Isn't it always weird that in those movies, people are just people, but they're in some weird future?
02:14:41.000 But they haven't really changed that much.
02:14:44.000 Right.
02:14:45.000 Which is to say that the fundamental way that we work is just unchanging.
02:14:50.000 But there are such things as more wise societies, more sustainable societies, more peaceful or harmonious societies.
02:14:56.000 Ultimately, biologically, we have to evolve as well.
02:15:00.000 But the best version of that is probably the gray aliens.
02:15:05.000 Right?
02:15:05.000 Maybe so.
02:15:06.000 That's the ultimate future.
02:15:07.000 I mean, we're going to get into gene editing and becoming more perfect, perfect in some sense of that word, but we are going to start optimizing for the outcomes that we value.
02:15:18.000 I think the question is, how do we actually come up with brand new values that are wiser than we've ever thought of before, that actually are able to transcend the win-lose games that lead to omni-lose-lose, that everyone loses if we keep playing the win-lose game at greater and greater scales?
02:15:33.000 I, like you, have a vested interest in the biological existence of human beings.
02:15:38.000 I think people are pretty cool.
02:15:39.000 I love being around them.
02:15:40.000 I enjoyed talking to you today.
02:15:41.000 Me too.
02:15:42.000 My fear is that we're a Model T. Right.
02:15:49.000 You know, and there's no sense in making those fucking things anymore.
02:15:52.000 The brakes are terrible.
02:15:54.000 They smell like shit when you drive them.
02:15:56.000 They don't go very fast.
02:15:58.000 We need a better version.
02:16:16.000 I don't mean to be, you know, playing the woo-woo new age card.
02:16:19.000 I just genuinely mean how much of our lives is just running away from, you know, anxiety and discomfort and aversion.
02:16:27.000 It is, but, you know, in that sense, some of the most satisfied and happy people are people that live a subsistence existence in the middle of nowhere, just chopping trees and catching fish.
02:16:40.000 Right, and more connection, probably, that's authentic, than something else.
02:16:43.000 And I think that's what this stuff is really bad at.
02:16:44.000 It probably resonates biologically, too, because of the history of human beings living like that.
02:16:49.000 It's just so much longer and greater.
02:16:51.000 Totally, and I think that those are more sustainable societies.
02:16:54.000 We can never obtain peace in the outer world until we make peace with ourselves.
02:16:58.000 Dalai Lama.
02:16:59.000 Yeah, but I don't buy that guy.
02:17:01.000 You know, that guy, he's an interesting case.
02:17:04.000 I was thinking it was a slightly different quote.
02:17:06.000 But actually, there's one quote that I would love to, if it's possible to...
02:17:10.000 One of the reasons why I don't buy him is he's just chosen.
02:17:13.000 They just chose that guy.
02:17:14.000 Yeah.
02:17:15.000 Also, he doesn't have sex.
02:17:18.000 How much can you be enjoying life if that's not a part of it?
02:17:22.000 Come on, bro.
02:17:23.000 You wear the same outfit every day?
02:17:24.000 Get the fuck out of here with your orange robes.
02:17:27.000 There's a really important quote that I think would really be good to share.
02:17:33.000 It's from the book, have you read Amusing Ourselves to Death by Neil Postman?
02:17:37.000 No.
02:17:37.000 From 1982?
02:17:39.000 No.
02:17:40.000 So especially when we get into big tech and we talk about censorship a lot and we talk about Orwell, he has this really wonderful opening to this book.
02:17:48.000 It was written in 1982. It literally predicts everything that's going on now.
02:17:51.000 I frankly think that I'm adding nothing and it's really just Neil Postman called it all in 1982. Wow.
02:17:57.000 He had this great opening.
02:17:59.000 It says...
02:18:01.000 Let's see.
02:18:04.000 We were all looking out for, you know, 1984. When the year came and the prophecy didn't, thoughtful Americans sang softly in praise of themselves.
02:18:11.000 The roots of liberal democracy had held.
02:18:14.000 This is like we made it through the 1984 gap.
02:18:16.000 Wherever else the terror had happened, we at least had not been visited by Orwellian nightmares.
02:18:22.000 But we had forgotten that alongside Orwell's dark vision, there was another slightly older, slightly less well-known, equally chilling vision of Aldous Huxley's Brave New World.
02:18:32.000 Contrary to common belief, even among the educated, Huxley and Orwell did not prophesy the same thing.
02:18:39.000 Orwell warns that we will be overcome by an externally imposed oppression.
02:18:44.000 But in Huxley's vision, no Big Brother is required to deprive people of their autonomy, maturity, or history.
02:18:50.000 As he saw it, people will come to love their oppression, to adore the technologies that undo their capacities to think.
02:18:57.000 What Orwell feared were those who would ban books.
02:19:00.000 What Huxley feared was that there would be no reason to ban a book, for there would be no one who wanted to read one.
02:19:07.000 Orwell feared those who would deprive us of information.
02:19:10.000 Huxley feared those who would give us so much that we would be reduced to passivity and egoism.
02:19:16.000 Orwell feared the truth would be concealed from us.
02:19:19.000 Huxley feared the truth would be drowned in a sea of irrelevance.
02:19:24.000 Orwell feared we would become a captive culture.
02:19:27.000 But Huxley feared we would become a trivial culture, preoccupied with some equivalent of the feelies and the orgy-porgy and the centrifugal bumble puppy.
02:19:35.000 Don't know what that means.
02:19:36.000 As Huxley remarked in Brave New World Revisited, the civil libertarians and rationalists who are ever on the alert to oppose tyranny failed to take into account man's almost infinite appetite for distractions.
02:19:50.000 Lastly, in 1984, Huxley added, people are controlled by inflicting pain.
02:19:55.000 In Brave New World, they are controlled by inflicting pleasure.
02:19:59.000 In short, Orwell feared that what we fear will ruin us.
02:20:02.000 Huxley feared that what we desire will ruin us.
02:20:05.000 Holy shit.
02:20:07.000 Isn't that good?
02:20:08.000 That's the best way to end this.
02:20:11.000 Goddamn.
02:20:12.000 But again, if we can become aware that this is what's happened, we're the only species with the capacity to see that our own psychology, our own emotions, our own paleolithic evolutionary system has been hijacked.
02:20:26.000 I like that you're optimistic.
02:20:27.000 We have to be.
02:20:29.000 If we want to remain people, we have to be.
02:20:32.000 Optimism is probably the only way to live in a meat suit body and keep going.
02:20:37.000 Otherwise...
02:20:37.000 It certainly helps.
02:20:38.000 It certainly helps.
02:20:40.000 Thank you very much for being here, man.
02:20:42.000 I really enjoy this.
02:20:43.000 Even though I'm really depressed now.
02:20:44.000 I really don't want you to be depressed.
02:20:46.000 I really hope people...
02:20:47.000 I'm kidding.
02:20:47.000 I'm not.
02:20:48.000 We really want to build a movement.
02:20:50.000 I wish I could give people more resources.
02:20:53.000 We do have a podcast called Your Undivided Attention, and we're trying to build a movement at humanetech.com.
02:20:58.000 But...
02:20:58.000 Well, listen, any new revelations or new developments that you have, I'd be more than happy to have you on again.
02:21:04.000 We'll talk about them and send them to me, and I'll put them on social media and whatever you need.
02:21:08.000 Awesome.
02:21:08.000 I'm here to help.
02:21:09.000 Awesome, man.
02:21:09.000 Great to be here.
02:21:10.000 Resist.
02:21:11.000 Resist.
02:21:11.000 We're in this together.
02:21:12.000 Humanity.
02:21:13.000 Resist.
02:21:13.000 Humanity.
02:21:14.000 We're in this together.
02:21:14.000 Thank you, Tristan.
02:21:15.000 I really, really appreciate it.
02:21:16.000 Goodbye, everybody.