The Joe Rogan Experience - November 18, 2021


Joe Rogan Experience #1736 - Tristan Harris & Daniel Schmachtenberger


Episode Stats

Length

3 hours and 1 minute

Words per Minute

184.8

Word Count

33,441

Sentence Count

2,149

Misogynist Sentences

18

Hate Speech Sentences

24


Summary

Tristan Harris, a former design ethicist at Google and co-founder of the Center for Humane Technology, returns about a year after The Social Dilemma, this time joined by Daniel Schmachtenberger, who studies catastrophic risks and the societal "generator functions" that drive them. They discuss what it means to ethically influence people's attention, thoughts, and behaviors; the asymmetries of power between what technology knows about us and what we don't know about ourselves; the race to the bottom of the brainstem for engagement; the Frances Haugen whistleblower story and the conspiracy theories around it; troll farms reaching roughly 140 million Americans a month through Facebook pages and groups; GPT-3, Codex, and text-based deepfakes; CRISPR and the spread of decentralized catastrophic capability; and how China regulates apps like Douyin while democracies struggle to keep up.


Transcript

00:00:12.000 Gentlemen, thank you for being here.
00:00:14.000 I keep doing these podcasts where I just talk to people, so please introduce yourself and tell people what you do.
00:00:21.000 I am Tristan Harris and came on the show about a year ago after The Social Dilemma came out.
00:00:27.000 That's probably where most people know me.
00:00:29.000 And used to be a design ethicist at Google, studying how do you ethically influence people's attention and thoughts and behaviors.
00:00:37.000 And really enjoyed the conversation last year.
00:00:40.000 The reason that today I'm here with Daniel Schmachtenberger, who's really a person I've learned so much from the last few years, and why I thought it'd be a good through line. What a daunting task.
00:01:08.000 How to ethically influence people.
00:01:11.000 And what a weird thing that this industry that didn't exist 20 years ago has such a... I mean, think about life on earth, and then 20 years ago all of a sudden this social media thing sort of evolves, and now you have to wonder how much of an effect it has on our just day-to-day lives, and how to ethically influence people. Yeah.
00:01:36.000 What the fuck does that even mean?
00:01:38.000 Well, first of all, I should say...
00:01:40.000 How do those thoughts even get, you know, how does that get worked out?
00:01:44.000 Actually, I should first say that there wasn't at Google a department that said, how do we ethically influence people?
00:01:49.000 I actually sort of, as was shown in the film The Social Dilemma, wrote this presentation worried about how technology was influencing people's thoughts, concerns, behaviors, etc.
00:01:59.000 And I studied persuasive technology at Stanford, which is a whole discipline and field, the idea that technology can influence people.
00:02:06.000 And it was out of my own personal concern that when that presentation went viral at Google, I kind of worked my way into this position that never existed before, which was how could we create a framework for what it means to ethically influence other people?
00:02:21.000 And a lot of that has to do with asymmetries of power.
00:02:23.000 I mean, when I was a kid, I was a magician.
00:02:25.000 We talked about this before.
00:02:26.000 Magic is about an asymmetric relationship.
00:02:29.000 The magician knows something about your mind that you don't know about your own mind.
00:02:32.000 That's what makes the trick work.
00:02:34.000 And actually across some of these things we're going to talk about today are ways that there is an asymmetric relationship between what technology knows about us and what we don't know about ourselves.
00:02:44.000 When you were studying at Stanford, what year was this?
00:02:48.000 This was 2002 to 2006. I was an undergrad and then 2006 I got involved.
00:02:54.000 With Professor BJ Fogg, who, again, actually studied ways that persuasive technology could be used for positive purpose.
00:03:01.000 Like, how do you help people be healthier?
00:03:03.000 How do you help people floss?
00:03:04.000 How do you help people work out more often?
00:03:06.000 Things like that.
00:03:07.000 It could be used in a positive way.
00:03:08.000 But I got concerned because it was all of...
00:03:12.000 This increasing arms race to use persuasive tools to harvest and capture people's attention, now known as the race to the bottom of the brainstem, to go down the brainstem into more social validation, more social narcissism, all of that.
00:03:25.000 And that's one of the arms races we see everywhere, which is like in every single thing.
00:03:29.000 If one oil company doesn't drill for that oil well, the other one will.
00:03:33.000 If one attention company doesn't add the beautification filter, the other one will.
00:03:37.000 If one company doesn't do narcissism, social validation hacking, and likes and variable rewards, the other one will.
00:03:44.000 And it's true across so many of the other issues that we're facing, whether it's like, if I don't build the drone for everyone, then someone else is going to build the drone for everyone.
00:03:52.000 If I don't...
00:03:52.000 So that's how I think these things are connected.
00:03:54.000 Did you realize it back then?
00:03:55.000 I mean, in 2002, 2006, you're talking about a completely different world in terms of social media influence.
00:04:00.000 It's before the iPhone, actually.
00:04:01.000 Yeah.
00:04:01.000 Yeah.
00:04:02.000 2007, right?
00:04:03.000 Yeah.
00:04:03.000 iPhone came out in 2007. We were studying persuasive technology and I was, as I've said in the past, partners with the co-founder of Instagram in the persuasive technology class.
00:04:13.000 So we were actually studying how would you apply persuasive technology to people before the iPhone even existed.
00:04:20.000 And, you know, what bothered me is that I think when people think about how do you ethically persuade people, you just get into a whole bunch of ethical cop-outs.
00:04:29.000 Like, well, we're just giving people what they want.
00:04:32.000 You know, or if they don't want this, they'll use something else.
00:04:35.000 There's these very simple ways that the minds of people in technology, the tech industry, I think defend what they're doing.
00:04:41.000 And what concerned me was that the ethical framework wasn't really there.
00:04:44.000 Not that I had one at the time, by the way.
00:04:45.000 I studied at Google for three years to try to...
00:04:47.000 Like, what does it mean to ethically influence three billion people who are jacked into the system?
00:04:52.000 And this is before Cambridge Analytica, before the Facebook files and Frances Haugen talking about that, you know, we now have the receipts for all these things.
00:04:59.000 So we talked about all these things in The Social Dilemma.
00:05:01.000 But now there's the evidence with Frances Haugen's whistleblowing that, you know, Instagram makes body image issues worse for one in three teenage girls.
00:05:09.000 I know I'm going fast, but that's the broad strokes.
00:05:12.000 Do you know the conspiracy theory about her?
00:05:15.000 Tell me.
00:05:15.000 The conspiracy theory amongst the tinfoil hat folk is, first of all, she started a Twitter account like right before she went there and was immediately verified.
00:05:29.000 Right.
00:05:29.000 And then instantaneously was on all these major media outlets, major network television shows and being interviewed.
00:05:37.000 And she was saying something that a lot of people felt like was a call to authoritarian intervention into social media.
00:05:47.000 That government censorship was the solution and regulation was the solution to dealing with this problem, and that it seemed like she was a sanctioned whistleblower.
00:05:58.000 She was saying all the things that they wanted to hear, and that's why they put her in the position to make a big loud noise.
00:06:06.000 What did you think about that when it came up?
00:06:09.000 I always have to do this, you know, when something like that happens, like, hmm, maybe, maybe, because you know the government would do that.
00:06:17.000 Most certainly, they would love to have control over social media.
00:06:20.000 They would love to be able to censor things like the Hunter Biden laptop story.
00:06:24.000 They would love to be able to hide Joe Biden's medical records or Kamala Harris's time as a prosecuting attorney.
00:06:34.000 There's a lot of stuff they would like to do.
00:06:36.000 Or district attorney, rather.
00:06:38.000 There's a lot of stuff they would like to do with access to information.
00:06:42.000 I mean, you're seeing it right now in terms of one of the things that's been fascinating about COVID. During this pandemic and during this terrible time of paranoia and dealing with this disease and fear and anxiety, you're seeing this narrative from social media networks that absolutely walk step-in-step with the government,
00:07:05.000 where if the government wants certain information censored, it's being censored across major social media platforms that has to be coordinated.
00:07:16.000 There's no way it's not, and there's no way they're incentivized to not have people discuss certain things, because we've said before, you know, it's one of the major points of the social dilemma, is that things that are controversial,
00:07:32.000 whether they're true or not, are the things that are the most clicked on, the most shared, and that's where the money is.
00:07:41.000 So there's got to be some sort of incentive for them to not do what they do with every other subject, whether it's immigration or gun control or abortion or anything.
00:07:53.000 Do they censor on immigration?
00:07:55.000 Or you're saying that as an example something goes viral?
00:07:57.000 As an example something goes viral.
00:07:58.000 Yeah, yeah, yeah.
00:07:59.000 Not the censorship.
00:07:59.000 Right.
00:08:00.000 They don't censor on immigration.
00:08:01.000 No.
00:08:01.000 I mean, the border crisis is a great example of that.
00:08:06.000 The government would probably like us to not see all those Haitian immigrants storming across the border, but my God, those were shared like crazy.
00:08:15.000 Totally.
00:08:16.000 So why was COVID information shared?
00:08:19.000 Well, because there was a narrative that they could say, well, this is dangerous misinformation and we can protect people, even though some of it turned out to actually be accurate, like the lab leak hypothesis.
00:08:31.000 At least that it's a hypothesis.
00:08:33.000 It's a hypothesis that at least is being considered by virologists.
00:08:37.000 But the point is that Who the fuck are they to decide what can and can't be discussed?
00:08:44.000 And when they're doing something step-in-step with the government, I get concerned.
00:08:49.000 So when someone comes along and this person who's a whistleblower says, something needs to be done, you know, we're endangering young girls' lives, we're doing this, we're doing that, we need some sort of government intervention.
00:09:01.000 I mean, this is essentially calling for censorship and calling for government control of social media, which freaks people.
00:09:08.000 So she's pretty clear that she's not calling for censorship.
00:09:11.000 But the reason I asked you was curious how it came across your radar, because I happened to know and hear a little bit about this from her.
00:09:18.000 We interviewed her on our podcast.
00:09:21.000 And the story that goes viral about her saying that she's a psyop or that she's a plant, that's an incendiary, inflammatory, controversial story.
00:09:31.000 So when that gets suggested, is it going to go, is it just going to fizzle out or is it going to go viral?
00:09:36.000 Yeah.
00:09:36.000 How ironic.
00:09:37.000 It's going to go viral.
00:09:38.000 Exactly.
00:09:39.000 And in fact, when you kind of realize everything...
00:09:43.000 I mean, there's some things that are real conspiracy theories, and there's some things that are real psyops, and that's a real thing.
00:09:48.000 But notice how many things we think of as psyops, conspiracies, etc.
00:09:53.000 now, and it's because anything that has that incendiary quality goes viral.
00:09:58.000 And I happen to know, for example, I think one of the claims in there is that she's funded by this billionaire, Pierre Omidyar. But I happened to know from talking to her that that happened at the very, very end of what she was doing.
00:10:09.000 And it was a tiny grant of like $150,000 for us in the nonprofit world.
00:10:13.000 That's like a tiny amount of money basically just to support her flight costs.
00:10:16.000 And I happened to also sort of hear from her how much of the media was constructed at the very last minute.
00:10:23.000 Like she was working this one newspaper, the Wall Street Journal, to do this sort of procedural rollout of specific stuff that she thought was concerning.
00:10:31.000 I guess what I'll just say is like, What if she's just a good-faith person who saw that virality was driving people crazy and that it was harmful to teenage girls?
00:10:41.000 And it's true that the government would see some of that and say, hey, we could use that for something else.
00:10:48.000 We could use that.
00:10:49.000 She could be a tool for us to do something else.
00:10:51.000 But I guess in the aim of complexity and nuance and not jumping to conclusions and this sort of thing, My perception from talking to her now extensively, she's a very good-faith actor who was concerned that this was going to drive the world apart.
00:11:05.000 I should be really clear that this is not my position.
00:11:07.000 This is just the conspiracy theory.
00:11:10.000 I literally don't have an opinion on her.
00:11:12.000 I do have an opinion on algorithms, and I do have an opinion on what it does do to young girls' self-esteem.
00:11:18.000 You have teenage daughters, right?
00:11:20.000 Yes.
00:11:20.000 I just think...
00:11:21.000 Young girls are a point of focus for...
00:11:25.000 Why they're a point of focus more than young boys, I'm not entirely sure.
00:11:29.000 I guess it has to do with their emotional makeup, and there's higher risk of self-harm due to social media, and Jonathan Haidt talked about that in his book, The Coddling of the American Mind.
00:11:39.000 It's very clear.
00:11:41.000 It's very damaging.
00:11:42.000 And my kids, you know, my 13-year-old does have interactions with her friends, and I do see how they bully each other and talk shit about each other, and they get so angry and mad at each other.
00:11:56.000 It is a factor, but it's an algorithm issue, right?
00:12:00.000 There's multiple things here.
00:12:01.000 So the first thing is, just to kind of set the stage a little bit...
00:12:06.000 I always use E.O. Wilson, the sociobiologist who sort of defined what the problem statement for humanity is.
00:12:14.000 He said, the fundamental problem of humanity is we have paleolithic emotions and brains, like easy brains that are hackable for magicians.
00:12:21.000 We have medieval institutions, you know, government that's not really good at seeing the latest tech, whether it was railroads or now social media or AI or deepfakes or whatever's coming next.
00:12:31.000 And then we have godlike technology.
00:12:33.000 So we have paleolithic emotions, medieval institutions, godlike technology.
00:12:38.000 You combine that fact, that's the fundamental problem statement.
00:12:41.000 How do we wield the power of gods without the love, prudence, and wisdom of gods, which is actually something that Daniel taught me.
00:12:48.000 Then you add to that the race to the bottom of the brainstem for attention.
00:12:51.000 What is their business model?
00:12:52.000 Just to review the basics.
00:12:53.000 Everybody knows this now, but it's engagement.
00:12:55.000 It's like, how do I get that attention at all costs?
00:12:57.000 So algorithms is one piece of that.
00:13:00.000 Meaning, when you're on a news feed, like, I don't want to just show you any news.
00:13:04.000 I want to show you the most viral, engaging, like, longest argumentative comment threads.
00:13:10.000 So that's like pointing a trillion-dollar market cap AI at your brain, saying, I'm going to show you the next perfect boogeyman for your nervous system, the thing that's going to make you upset, angry, whether it's masks, vaccines, Frances Haugen, whatever the thing is, it will just drive that over and over again and then repeat that thing.
00:13:27.000 And that's one of the tools in the arsenal to get attention, is that the algorithms.
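To make "pointing a trillion-dollar market cap AI at your brain" a bit more concrete, here is a toy sketch, in Python, of what engagement-optimized ranking means. None of this is Facebook's actual code; the fields, weights, and scoring function are invented for illustration. The point is only the shape of the objective: candidate posts are scored by predicted engagement, and nothing about accuracy or well-being enters the score.

    # Toy illustration of engagement-optimized feed ranking (not any real system).
    from dataclasses import dataclass

    @dataclass
    class Post:
        text: str
        predicted_reactions: float   # model's guess at likes/angry faces, etc.
        predicted_comments: float    # long argumentative threads score high here
        predicted_reshares: float

    def engagement_score(post: Post) -> float:
        # Hypothetical weights: comments and reshares "count more" than reactions,
        # which is how outrage-bait can outrank calmer content.
        return (1.0 * post.predicted_reactions
                + 5.0 * post.predicted_comments
                + 10.0 * post.predicted_reshares)

    def rank_feed(candidates: list[Post]) -> list[Post]:
        # Sort purely by predicted engagement, descending; the objective says
        # nothing about whether the content is true or good for the reader.
        return sorted(candidates, key=engagement_score, reverse=True)

    if __name__ == "__main__":
        feed = rank_feed([
            Post("Calm local news update", 40, 2, 1),
            Post("Inflammatory take on the outrage of the day", 30, 60, 25),
        ])
        for p in feed:
            print(round(engagement_score(p)), p.text)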
00:13:31.000 Another one is technology making design decisions, like how do we inflate people's sense of beautification filters?
00:13:38.000 In fact, just recently, since we talked last time, I think it's an MIT Tech Review article showing that they're all competing, first of all, to inflate your sense of beauty.
00:13:47.000 So they're doing the filters, right?
00:13:50.000 People know this stuff.
00:13:50.000 It's very obvious.
00:13:51.000 But they're competing for who can give you a nicer filter, right?
00:13:54.000 And then now, instead of waiting for you to actually add one, TikTok was actually found to actually do like a 2%, like just bare beautification filter on the no filter mode.
00:14:04.000 Because the thing is, once they do that, the other guys have to do it too.
00:14:07.000 So I just want to name that all of this is taking place in this race to capture human attention, because if I don't do it, the other guy will.
00:14:14.000 And then it's happening with design decisions, like the beautification filters and the follow you, and if you follow me, I'll follow you back, and the like button, and check, pull, refresh, the dopamine stuff.
00:14:22.000 That's all design.
00:14:23.000 Then there's the algorithms, which is I'm pointing a thing at your brain to figure out how can I show you an infinite feed that just maximally enrages you?
00:14:31.000 And we should talk about that because that thing drives polarization, which breaks democracy.
00:14:35.000 But we can get into that.
00:14:37.000 Daniel, let's bring you in here.
00:14:38.000 So how did you guys meet and how did this sort of dynamic duo come about?
00:14:45.000 Yeah, I was working on studying kind of catastrophic risks writ large.
00:14:49.000 You've had people on the show talking about risks associated with AI and with CRISPR and genetic engineering and with climate change and environmental issues.
00:14:56.000 Pull up to the microphone there.
00:14:58.000 There you go.
00:14:59.000 And escalation pathways to war and all these kinds of things.
00:15:02.000 Basically, how can shit hit the fan?
00:15:04.000 Right.
00:15:05.000 And I think it's a pretty common question of like, how long do we have on which of these?
00:15:09.000 And are we doing a good job of tending to them so that we get to solve the rest of them?
00:15:14.000 And then for me, it was there were so many of them.
00:15:16.000 What was in common driving them?
00:15:18.000 Are there any kind of like societal generator functions of all the catastrophic risks that we can address to make a more resilient civilization writ large?
00:15:27.000 Tristan was working on the social media issues. And when you had Eric on, he talked about the twin nuclei problem of atomic energy and kind of genetic engineering, basically saying these are extremely powerful technologies that we don't have the wisdom to steward that power well.
00:15:44.000 Well, in addition to that, it's all things computation does, right?
00:15:47.000 There's a few other major categories.
00:15:48.000 I think?
00:16:14.000 You're impacting a billion people in deeper ways much faster, which means that if you're blind to something, if you don't know what you might be doing, the consequences show up faster than you can actually remediate them.
00:16:23.000 When we say exponential tech, we mean a number of things.
00:16:26.000 We mean tech that makes more powerful versions of itself, so it can use computer chips to model how to make better computer chips, and then those better computer chips can recursively do that.
00:16:33.000 We also mean exponential speed of impact, exponential scale of impact, exponentially more capital returns, exponentially smaller numbers of people capable of achieving a scale of impact.
00:16:45.000 And so when he's mentioning godlike powers and kind of medieval institutions, the speed at which our tech is having influences in the world and not just first order influences, the obvious stuff, but the second and third order ones.
00:16:57.000 Facebook isn't trying to polarize the population.
00:16:59.000 It's an externality.
00:17:00.000 It's a side effect.
00:17:01.000 of the thing they're trying to do which is to optimize ad revenue.
00:17:05.000 But the speed at which new technologies are having effects on the world and the total amount of consequence is way faster than regulation can keep up with.
00:17:12.000 And just by that alone, we should be skeptical of any government's ability to regulate something that's moving faster than it.
00:17:19.000 Faster than it can appraise what the hell is even happening in the first place.
00:17:22.000 Not only that, you need someone who really understands the technology and you're not going to get that from elected officials.
00:17:29.000 You're going to need someone who's working on it and has a comprehensive understanding Of how this stuff works, how it's engineered, where it goes.
00:17:38.000 I mean, I'm skeptical of the government being able to regulate almost everything.
00:17:42.000 Right.
00:17:42.000 Well, and so there's maybe a few things to say about that.
00:17:45.000 So one is the complexity of all issues.
00:17:47.000 Like climate change is really complex.
00:17:48.000 Yes.
00:17:49.000 Like where the nuclear pathways of escalation or the way a satellite or GPS could get knocked out triggers a nuke somewhere, that's also really complex.
00:17:56.000 Social media is really complex.
00:17:58.000 CRISPR, you know, bio stuff is complex.
00:18:00.000 So in general, like one of the ways to summarize the kind of problem from our friend Zach Stein's kind of work is that the complexity of humanity's problems is going up like this, but the capacity to meet them is like not really meeting it.
00:18:12.000 And then you add in social media and you polarize people and divide them into like they don't even know what's true because everyone's got their own personalized version of reality.
00:18:28.000 Right.
00:18:36.000 That thing goes viral.
00:18:38.000 And when that goes viral, everybody saw that.
00:18:40.000 And they didn't see that, you know, the five senators who I talked to, who actually do really get these things pretty decently.
00:18:45.000 Now, I'm not going to say, like, let's just, like, regulate it, but just to notice, right?
00:18:48.000 So the cynical take about every time an institution makes a mistake, that thing goes viral, which means we lose trust in so many things, because no matter what the issue is.
00:18:58.000 You noticed that you were bringing up the conspiracy theory of, might the government have an incentive to make a plant like Francis?
00:19:05.000 And so it's plausible, but plausible doesn't automatically mean it is.
00:19:09.000 One of the challenges is when someone has a confirmation bias, they hear something that's plausible and they just assume that it is without doing the due diligence of saying, what would I need to know?
00:19:17.000 And you do a good job of checking that.
00:19:18.000 We could also say, would Facebook have an incentive to say that she's a plant and try to hire a bunch of PR to do that?
00:19:24.000 And they were helping to spread that story, by the way.
00:19:27.000 I'm not saying they're responsible for it.
00:19:29.000 I actually think that what happens organically, again, the cynical take goes viral.
00:19:34.000 And then if you're Russia or China or you're Facebook in this case, you can be like, hmm, that's a really helpful cynical take from my perspective.
00:19:40.000 In fact, one of the things that Facebook does try to do is turn the social media debate into a censorship or free speech debate because they know that divides the political class because they know that the right doesn't want censorship, obviously.
00:19:54.000 And so they say the more they can spin whatever Frances is doing as she's claiming censorship, the more they can divide any possibility for actual action.
00:20:03.000 In fact, I'll tell you just a quick story, really quick, is during the three-hour testimony that Frances gave, if you watch the full three hours, she had both people on the left and the right.
00:20:13.000 And I've been working on this for eight years.
00:20:14.000 I have never seen someone create a bipartisan consensus the way that she did.
00:20:18.000 She actually did if you watch the video.
00:20:20.000 I think?
00:20:32.000 Because the story went viral saying that she was a democratic operative and he said, my base will hate me if I meet with you.
00:20:38.000 So the very thing we're talking about, which is the ability to regulate anything, is being broken and shattered because the incendiary controversial take on everything goes viral.
00:20:49.000 Now, again, I'm not saying that we're this easy world we should therefore regulate.
00:20:52.000 But noticing the mind warp.
00:20:54.000 Part of what I wanted to do today is how do we reverse engineer this bad trip we've been on for the last 10 years?
00:20:59.000 It's like a psychedelic trip where we've all...
00:21:02.000 I think?
00:21:18.000 It's so funny you say the right doesn't want censorship.
00:21:22.000 Isn't that a crazy statement?
00:21:24.000 Like, are we, like, shifted the polar, you know, the polar...
00:21:29.000 What do you mean?
00:21:30.000 It used to be the left didn't want censorship.
00:21:32.000 The ACLU used to defend Nazis.
00:21:35.000 I mean, what the fuck has happened?
00:21:37.000 Like, our polls have shifted.
00:21:39.000 Like, north is south and south is north.
00:21:41.000 It's...
00:21:42.000 It just shows you that so much of what ideology is, is tribal.
00:21:47.000 It's like you find a group that agrees to a certain pattern of behavior and thought, and you subscribe to that.
00:21:57.000 Now, "I am a right-wing conservative," "I am a left-wing progressive," and then you just follow the playbook.
00:22:04.000 And it makes it so much easier than having your own individual nuanced thoughts on complex and difficult issues like this.
00:22:11.000 But the fact that he couldn't talk to her because his base would somehow or another think that she actually is a democratic operative and she does work for the government and is some sort of an attempt at censorship.
00:22:22.000 And I'm sure not only is Facebook amplifying that, but All of the different Russian troll pages on Facebook are amplifying that, which confuses the water.
00:22:33.000 Totally.
00:22:34.000 Well, also, if I'm Russia or China, Facebook is like the best weapon I've ever had against the United States.
00:22:38.000 Yes.
00:22:39.000 Oh, my God.
00:22:39.000 You've got an F-35.
00:22:40.000 I don't need F-35.
00:22:41.000 I've got Facebook.
00:22:42.000 I can destroy your entire coherence as a society.
00:22:45.000 And they have.
00:22:46.000 And you won't get anything done.
00:22:47.000 And all of your energy will be spent on waste infighting and heat.
00:22:51.000 We talked about this recently, but I'm sure you saw the story.
00:22:53.000 There was 20, top 20 Christian sites on Facebook.
00:22:58.000 19 of them were run by a Russian troll farm.
00:23:01.000 I'm glad you actually mentioned that.
00:23:02.000 Excuse me, it was an Eastern European troll farm.
00:23:05.000 Macedonia, I think it was.
00:23:05.000 Totally.
00:23:06.000 This is an important stat, actually.
00:23:08.000 I'm glad you brought it up.
00:23:09.000 This is as recent as October 2019. 140 million Americans per month were reached by essentially troll farms actively.
00:23:20.000 There's three categories of pages in which...
00:23:22.000 So for Christian pages, the top 15 out of 15 Christian pages were all run by troll farms.
00:23:28.000 So all of the Christians in the country were receiving content.
00:23:32.000 And 85%...
00:23:36.000 85% of the Christians who saw that stuff in their feed, they didn't actually accept an invitation from the group or the page to say, yes, I want to subscribe to you.
00:23:47.000 Facebook, because they're optimizing for growth, they changed the way the system works so that if a page invites you, that's enough for it to start putting the content in your feed.
00:23:56.000 So there's an example in Francis' work where there was a QAnon person who invited 300,000 people in one day.
00:24:04.000 300,000 people.
00:24:05.000 And because Facebook's optimizing for growth and engagement, those people didn't have to say, yes, I want to join that group.
00:24:12.000 Just by being invited, it started testing.
00:24:14.000 Like, we want to optimize for growth.
00:24:15.000 So it puts it in your feed.
00:24:16.000 And if you click on it, it auto-ads you to the group.
00:24:19.000 Oh, my God.
00:24:20.000 Out of the top 15 pages for African Americans, two-thirds of those top 15 pages were run by troll farms.
00:24:27.000 Of the top 15 pages for Native Americans, one-third of those pages were run by troll farms.
00:24:32.000 So we're not living in an authentic reality.
00:24:35.000 Reality, quote-unquote, is getting more virtual.
00:24:37.000 If you read Chinese military doctrine, specifically look at the 36 stratagems: don't ever attack a superior opponent directly.
00:24:46.000 Turn the enemy against themselves based on their existing fault lines.
00:24:49.000 Population centric, unconventional warfare, right?
00:24:51.000 Like that's kind of ancient doctrine.
00:24:53.000 It's just Facebook makes that amazingly easy, because it automatically already puts people into tribal groups, so that whatever the content is in that group is going to keep getting upregulated, and it optimizes for inflammation and tribal identity and those types of things.
00:25:08.000 And so you don't have to kinetically attack a country to make the country so turned against itself that the polarized population supports a polarized
00:25:18.000 representative class, which means you get gridlock on everything, which means you can't do effective governance, which means another country that does autocratic governance just wins geopolitically.
00:25:26.000 It seems absolutely insane that they could, through one page, inviting people, instantaneously start to distribute all of their information on those people that they invited.
00:25:37.000 So why would Facebook even allow that?
00:25:40.000 So if I'm designing Facebook, you would probably say, Wait, wait.
00:25:43.000 You just said the government should regulate social media.
00:25:46.000 It should be illegal is what I said.
00:25:48.000 It should be illegal.
00:25:48.000 Yeah.
00:25:49.000 Well, this is – I don't think the government should regulate.
00:25:52.000 But I do think there should be rules in terms of like – if you're a regular person that, say, has a specific group of interests – Say you only like motor cars.
00:26:06.000 You like vehicles.
00:26:07.000 You like hot rods or whatever, and that's what you're interested in.
00:26:11.000 You use Facebook when you're off duty at work and you just want to check some stuff out, and all of a sudden you get QAnon shit because they invited you into this QAnon group, and you start getting all this information.
00:26:22.000 You start getting radicalized.
00:26:24.000 It seems like And again, I don't know what we should do in terms of regulation.
00:26:32.000 But I don't think that social media groups should be able to just distribute information to people based on this concept of universal growth.
00:26:42.000 Yeah.
00:26:42.000 Well, I mean, think about it.
00:26:43.000 If we were just designing- Or unlimited growth.
00:26:44.000 Yeah, exactly.
00:26:46.000 I mean, if we were designing Facebook with a feature called groups, and groups had a feature called invitations, and you could invite people.
00:26:52.000 Wouldn't you design it so that people have to accept the invitation for the group before it shows up in your feed?
00:26:57.000 Why would Facebook not do it that way?
00:26:59.000 Because what happened is, starting in 2018, people stopped posting as much on Facebook.
00:27:04.000 So you and I, and maybe we used to post a lot more in 2016, 2017. If we stopped posting as much, oh shit, we can't harvest all that attention from people.
00:27:12.000 You were doing all this labor.
00:27:13.000 What do you mean?
00:27:14.000 What caused it to slow down?
00:27:15.000 Oh, just like people being more skeptical maybe of Facebook or just realizing they don't want to share as much or just usage burning out, more people moving to Instagram or TikTok.
00:27:22.000 People are getting older as well, right?
00:27:24.000 It's like older user base.
00:27:26.000 Totally.
00:27:26.000 And so now if I'm Facebook, I want to find new sources of free unpaid content creators.
00:27:33.000 Where can I tap that pool of content?
00:27:35.000 Oh, I've got this thing called Facebook groups where people are posting all the time.
00:27:39.000 So I'm going to start putting that stuff in people's feeds to just – so now I'm fracking for attention.
00:27:44.000 I'm going lower into all these other places to backfill this attention harvesting, we are the product machine.
00:27:50.000 And how do you know, since there isn't rigorous identity, if a user that says they're a user is really who they are or if they're a troll farm or if pretty soon they're an AI GPT-3 algorithm?
00:28:00.000 You should explain what AI GPT-3 is.
00:28:04.000 The ability to generate text-based deepfakes.
00:28:07.000 So people know what a deepfake is.
00:28:09.000 Well, there's a whole Reddit thread with people arguing with each other that are all fake.
00:28:13.000 Do you know about that?
00:28:14.000 No, I don't actually.
00:28:15.000 Here, let me send it to Jamie.
00:28:16.000 Duncan just sent this to me the other day, and I was like, what in the fuck?
00:28:20.000 I could only look at it for a couple moments before I started freaking out.
00:28:23.000 But the idea that it's not far off, this ability that...
00:28:35.000 Yes, exactly.
00:28:36.000 That's specifically what GPT-3 is.
00:28:47.000 Including stuff in your voice or in my voice.
00:28:49.000 And then you could say, GPT-3, write me an argument about why social media is great, written by Tristan Harris.
00:28:56.000 Using his words and phrases, and it'll do that.
00:28:58.000 It'll actually be able to take my style of speech, and it'll generate text there.
00:29:03.000 You could also say...
00:29:06.000 The ability to say – make arguments for vaccines or against vaccines and say only use real data and then be able to show the financial vested interest of anyone arguing on the other side and just have it be able to create – More data than people can parse in any reasonable amount of time.
00:29:25.000 Like an academic-looking paper that's 10 pages long saying why the vaccine is not safe, with citing real charts, real graphs, real statistics, and the real vested interests of people who are, say, positively pointing out that the vaccine is safe, who maybe they have some connection to Pfizer or something like that.
00:29:39.000 And it'll generate that full 10-page or 20-page document, and it'll take a team of statisticians a while to decode that thing.
00:29:46.000 And you can flood the Internet with that kind of text.
00:29:51.000 Through OpenAI and the GPT-3 algorithm, the ability to pass the Turing test in many areas.
00:29:55.000 You should explain what the Turing test is.
00:29:57.000 Meaning that if you're reading the text, you can't tell that it wasn't produced by a human.
00:30:00.000 Right.
00:30:01.000 Turing test is the idea that if you – that's how you find out if someone – it's a very good robot.
00:30:05.000 So you've already got an AI. Ex Machina, right?
00:30:08.000 So this is the Reddit thread.
00:30:11.000 So this is – these are all – why do human babies cry?
00:30:14.000 These are all robots.
00:30:16.000 This is all bots arguing with each other.
00:30:19.000 This is what happens when you give birth to a human baby.
00:30:22.000 Oh, my bad.
00:30:23.000 I thought you were just trying to answer the question.
00:30:25.000 No worries.
00:30:26.000 No, I'm trying to answer the question of how babies cry.
00:30:29.000 YTA, I don't know what that means, and you are discussing, I can't even fathom the level of toxicity in this post.
00:30:35.000 These are all bots.
00:30:36.000 I am disgusted that you are making fun of others.
00:30:39.000 Don't you know that people in this sub are supposed to be empathetic of others' feelings?
00:30:43.000 I'm sorry, but you're being a cunt.
00:30:45.000 These are all robots.
00:30:47.000 This is wild.
00:30:48.000 Because if you just read this and you didn't know, I don't really care if you disagree with my opinion as long as you don't call me a pedophile.
00:30:55.000 If you were a real man, you would be with a young girl and take care of her and you would be a sex offender.
00:31:02.000 Like, this is wild shit.
00:31:04.000 Yep.
00:31:05.000 One of the things people don't know, it actually was just developed over the summer.
00:31:08.000 They announced that OpenAI, just to track since we came and talked about some of these things last time, in August 2020, OpenAI released a video of using the same technology of machines generating stuff.
00:31:20.000 To actually write programming code.
00:31:22.000 So you tell the GPT-3, I want an asteroid video game.
00:31:26.000 And it's like...
00:31:26.000 And it writes all the code.
00:31:29.000 And then it puts a little graphic of a starship thing in the middle.
00:31:31.000 And then there's rocks that are flying.
00:31:33.000 And you say, I want the rocks to move faster.
00:31:35.000 And then the rocks move faster through the asteroid game.
00:31:37.000 Only requiring natural language input.
00:31:38.000 No programming.
00:31:39.000 Yes.
00:31:39.000 So you're just saying, you're typing in natural text.
00:31:41.000 I want an asteroid video game that when I move left, it moves left.
00:31:44.000 I want this.
00:31:44.000 I want the asteroids to move faster.
00:31:46.000 Actually make the Starship bigger.
00:31:48.000 And then it just changes, and it does it all for you.
00:31:51.000 Now, it's not perfect, but this is AGI. You're just typing it in text.
00:31:55.000 That's right.
00:31:55.000 But also voice-to-text.
00:31:57.000 So you could just say it.
00:31:58.000 You combine these things together.
00:31:59.000 Alexa, make me a Pong game.
00:32:02.000 Yeah.
00:32:03.000 Exactly.
00:32:04.000 Alexa, code me the Unreal Engine.
00:32:06.000 I mean, that one's going to be harder.
00:32:08.000 Not yet.
00:32:08.000 Right.
00:32:09.000 But the point is, that's where we're headed, right?
00:32:10.000 Right.
00:32:11.000 And part of this is, again, we have the power of God.
00:32:13.000 This is actually it right here.
00:32:14.000 Here it is.
00:32:14.000 This is the one.
00:32:16.000 Make the person 100 pixels, and it's doing it all itself.
00:32:19.000 Yep.
00:32:20.000 Wow.
00:32:21.000 And it writes the code in the right-hand side.
00:32:22.000 So this video that Jamie pulled up on YouTube is OpenAI Codex Live Demo, and you can see this all happening while this person types in the data, and they're actually explaining it now, how this is going to work.
00:32:35.000 Once you see it later, set its position to 500 pixels down and 400 pixels from the left, and then it just does that.
00:32:43.000 Oh my god, look how quick it codes it.
00:32:47.000 Wow.
00:32:47.000 Now make it controllable with the left and right keys, the right arrows, right?
00:32:51.000 Boom, and then now you can move it.
00:32:53.000 So it does it progressively, right?
00:32:54.000 It's adding the code in.
00:32:55.000 Yeah.
00:32:56.000 Wow.
00:32:57.000 Yeah.
00:32:58.000 And it's going to be accessible to more and more people, too.
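For readers who want to see the mechanics behind the Codex demo being described, here is a minimal sketch of sending a natural-language prompt to a hosted code-generation model and getting source code back. This is not the code from the video; the model name, prompt text, and environment variable are illustrative assumptions, and the request shape follows OpenAI's public text-completions API.

    # Minimal sketch of natural-language-to-code generation via a hosted model.
    # The model name below is an era-specific placeholder, not a recommendation.
    import os
    import requests

    API_KEY = os.environ["OPENAI_API_KEY"]  # assumes an API key is set in the environment

    prompt = (
        "# Write a small Pygame program for an Asteroids-style game.\n"
        "# The ship is 100 pixels wide, starts 500 pixels down and 400 pixels\n"
        "# from the left, and moves with the left and right arrow keys.\n"
    )

    resp = requests.post(
        "https://api.openai.com/v1/completions",
        headers={"Authorization": f"Bearer {API_KEY}"},
        json={
            "model": "code-davinci-002",  # hypothetical/placeholder model name
            "prompt": prompt,
            "max_tokens": 512,
            "temperature": 0,
        },
        timeout=60,
    )
    resp.raise_for_status()
    generated_code = resp.json()["choices"][0]["text"]
    print(generated_code)  # the model returns source code as plain text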
00:33:01.000 Go ahead.
00:33:02.000 This is an example of a kind of deep point to think about for the state of the world as a whole, which is that one of the things exponential tech means is exponentially more powerful.
00:33:11.000 I hate to tell you this, but get this thing right up in your face.
00:33:14.000 Exponentially more powerful tech that's also exponentially cheaper, which also means more distributed.
00:33:19.000 Right.
00:33:19.000 And so pretty soon this level of tech will not only be getting better but available to everybody.
00:33:25.000 So what happens when you have an internet where not only do you have an AI that is curating the Facebook feed for the most sticky stuff, which usually means the most toxic stuff, and that's an AI that is curating human-made content.
00:33:37.000 But now you have AIs that are creating content that also get to maximize for stickiness.
00:33:41.000 And then you have the relationship between the curation and the creation AIs.
00:33:45.000 Like how does anyone ever know what is true about anything again?
00:33:48.000 So AI can create fake stories and the fake stories can be boosted up by these troll farms.
00:33:56.000 Which themselves could be run by fake accounts and fake logic.
00:34:00.000 Oh my god.
00:34:01.000 But wait, it goes one step further.
00:34:03.000 So that's just distributed AI, right?
00:34:07.000 But we also have drones making...
00:34:10.000 Continuously better drones with continuously better ability to swarm and weaponize them that also becomes easily accessible.
00:34:17.000 We also have CRISPR making biotech capability, something that you don't have to be a state actor to have.
00:34:23.000 Small actors can have.
00:34:24.000 So there's this question of how do we make it through having decentralized technology?
00:34:29.000 Exponential tech, which means decentralized catastrophic capability.
00:34:33.000 Godlike powers, decentralized godlike powers.
00:34:35.000 Decentralized godlike powers in terms of biology as well as in terms of technology.
00:34:39.000 That's right.
00:34:39.000 Social media is an instance.
00:34:40.000 Just gloss over the CRISPR thing.
00:34:43.000 For people who don't understand what CRISPR is, CRISPR is a gene editing tool.
00:34:46.000 I think it's on the second iteration now or is it on the third?
00:34:50.000 Something like that.
00:34:51.000 They're getting better and better at it.
00:34:53.000 The idea is eventually it's going to get to the point where it's like a home computer.
00:34:58.000 Like where you are going to be able to edit genes.
00:35:02.000 So how do you stop that?
00:35:05.000 Or what do you do about that?
00:35:07.000 And if you wanted to have any kind of regulation about something like that...
00:35:12.000 What is the regulation?
00:35:14.000 Is the regulation that you have to have some specific level of clearance before you have access to it?
00:35:20.000 But if that's the case, then you put it in control of the government and then also bad actors and other governments are going to just distribute it wildly.
00:35:30.000 And how do you control that someone...
00:35:38.000 Yeah.
00:35:50.000 And we don't want that future.
00:35:52.000 So in general, because this might sound like just disaster porn, which I want to be really clear.
00:35:58.000 There is a way through this.
00:36:00.000 Our goal in coming on was to be able to talk about framing the problem so we know what we're trying to solve.
00:36:05.000 We're not trying to say, hey, we've just got this social media problem.
00:36:07.000 Let's frame it really clearly.
00:36:09.000 Okay, you've got your coding problem, and you have this biology problem with CRISPR. How does a civilization navigate this without killing itself?
00:36:19.000 Well, Daniel's going to be able to speak to a lot more of this.
00:36:22.000 I just wanted to connect it first to social media so people see the through line.
00:36:25.000 So I actually think that social media is its other kind of...
00:36:28.000 It doesn't seem as dangerous, right?
00:36:30.000 It just feels like this thing where people are sharing cat videos and their opinions and their political ideas and sharing links.
00:36:35.000 But it's actually just like this.
00:36:37.000 And in the same way that that dangerous capacity, we're now seeing what...
00:36:41.000 That dangerous godlike power was doing of steering three billion people's thoughts, personalized to them the thing that would most outrage, you know, boogeyman their lizard brain and their nervous system.
00:36:51.000 That's a godlike power.
00:36:52.000 When you have a godlike power, there's sort of two choices.
00:36:55.000 There's two attractors with that power.
00:36:58.000 One is, think of it like a bowling alley.
00:36:59.000 You've got one gutter on the left and one gutter on the right.
00:37:01.000 On the left, you've got a dystopia, a centralized control saying, like, here's how we're going to control that godlike power.
00:37:07.000 That's like China controlling its internet.
00:37:09.000 That's like Mark Zuckerberg having a total monopoly on what people can and can't say.
00:37:13.000 Like, those are both dystopias.
00:37:15.000 That's centralized power.
00:37:17.000 The other gutter in the bowling alley is, like, take your hand off the steering wheel and let this thing go for everyone.
00:37:23.000 Like, anyone can make anything go viral.
00:37:25.000 Let's add the devious licks, which is, by the way, a TikTok challenge for anybody to basically trash their high school bathroom.
00:37:31.000 And it teaches you how to do it, and these videos go viral, and it's just like everyone's crashing.
00:37:35.000 What is a devious lick?
00:37:36.000 I probably shouldn't have gone there.
00:37:37.000 It's a...
00:37:37.000 Too late.
00:37:38.000 It's a...
00:37:40.000 A high school teacher told me this.
00:37:42.000 There's all these horrible things that are going viral at the point.
00:37:44.000 Virality is a godlike power.
00:37:46.000 And devious licks is a challenge that basically you're challenging your fellow high school aged friends around the world.
00:37:56.000 To trash their high school bathroom.
00:37:57.000 So you like flush a Big Mac with like shit and all this horrible stuff down the toilet at the same time.
00:38:02.000 They put like pee.
00:38:04.000 This is awful.
00:38:04.000 They put like pee in the soap dispenser.
00:38:06.000 They do all this awful stuff.
00:38:07.000 And you're just spreading a disaster meme.
00:38:10.000 You're just teaching people how to create a decentralized catastrophe instead of a drone hitting something.
00:38:14.000 And they do this just for TikTok likes?
00:38:16.000 Well, they do it because it's getting attention and engagement.
00:38:19.000 There's another one that's a self-harm challenge for teenage girls.
00:38:21.000 They're saying, basically, you know, this is teaching...
00:38:25.000 Who can do a cutting?
00:38:26.000 It's like a cutting challenge, I think is what it's called.
00:38:28.000 So the point is that these decentralized...
00:38:30.000 Do we know where this comes from?
00:38:31.000 Are these things from troll farms?
00:38:34.000 I don't know.
00:38:35.000 But they could be.
00:38:36.000 Some of them probably are, right?
00:38:37.000 Yeah.
00:38:38.000 There's a concept called stochastic terrorism.
00:38:40.000 There's a good article on Edge, which basically is the idea...
00:39:00.000 I think that's an example of...
00:39:07.000 I don't...
00:39:08.000 I mean, and I'm not going to claim that everyone is just...
00:39:10.000 That's an example, I think I would say, of I can basically go into a group of the Boogaloo Boys or, you know, Stop the Steal groups or something like that, and I can just see stuff that's like, hey, let's get our guns out.
00:39:20.000 Let's do this.
00:39:21.000 And I just...
00:39:22.000 Just hinting at that idea.
00:39:23.000 I'm not telling one person to go do something.
00:39:25.000 I'm not controlling anyone.
00:39:26.000 I'm just hinting, and there's a wide enough group there...
00:39:29.000 That people can take action.
00:39:30.000 So that's one of the other decentralized power tools.
00:39:33.000 But I just wanted to close the thought on the bowling alley.
00:39:36.000 We've got the bowling alley.
00:39:38.000 One gutter is like, let's lock it down with surveillance.
00:39:41.000 Let's lock it down with Mark Zuckerberg controls everything.
00:39:43.000 Let's lock it down with the government, tells us what we can and can't do on computers.
00:39:47.000 And the other gutter, which is the decentralized power for everyone, which without people having the wisdom to wield that godlike power, or at least not evidence in people's own usage of it right now.
00:39:58.000 Also, we've incentivized people to do destructive things just for likes.
00:40:03.000 Right.
00:40:03.000 So in certain places, there is an incentive for those things to happen.
00:40:06.000 It's not just by accident.
00:40:07.000 It's like by design and incentivized.
00:40:09.000 You just said it's super important.
00:40:11.000 It's a population that is getting continuously more radicalized on all sides that simultaneously has continuously more powerful tools available to them in a world that's increasingly fragile.
00:40:26.000 And so if you have an increasingly fragile world, meaning more interconnected global supply chains where a collapse somewhere leads to collapse everywhere, more sensitive infrastructure, things like that.
00:40:37.000 If you have an increasingly fragile world, you have more and more radicalized people and you have those radicalized people having access to more and more powerful tech.
00:40:45.000 That's just fragility across lots of different dynamics.
00:40:48.000 And this is why the social media thing is so central is it's a major part of the radicalization process.
00:40:53.000 It's both a major part of the radicalization process and it's itself an example of the centralized control censorship, which we don't want, and the decentralized viral means for everyone which radicalize and enrage people and polarize democracies into not working.
00:41:06.000 The thing is, in those two gutters, the gutters are getting bigger every day, like on each side.
00:41:11.000 You've got more potential for centralized control.
00:41:14.000 You've got China basically doing full control over its internet, you know, doing a bunch of stuff to top-down control.
00:41:20.000 And the other side, you have more and more decentralized power in more hands, and that gutter is growing.
00:41:26.000 The question is, how do you basically—we have to bowl a strike down the center of that alley, but it's getting thinner and thinner every day.
00:41:33.000 And the goal is, how do we actually sort of—it's almost like a test, right?
00:41:37.000 We are given these godlike powers, but we have to have the wisdom, love, and prudence of gods to match that set of capacities.
00:41:43.000 You were just mentioning what China's doing to kind of regulate its internet.
00:41:47.000 That's worth speaking about.
00:41:48.000 Yeah, have you been following this?
00:41:50.000 Yeah, that's what terrifies me, is that we have to become like China in order to deal with what they're doing.
00:41:57.000 I feel like one step moving in that general direction is a social credit score system, and I'm terrified of that.
00:42:05.000 And I think that that is where vaccine passports lead to.
00:42:08.000 I really do.
00:42:10.000 This idea that they're slowly working their way into our everyday lives in this sort of inexorable way where you have to have some sort of paperwork or some sort of a Q code or something on your phone or QR code.
00:42:23.000 That scares the shit out of me because you're never going to get that back.
00:42:28.000 Once the government has that kind of power and control, they're going to be able to exercise it whenever they want with all sorts of reasons to institute it.
00:42:36.000 I'm worried about that too.
00:42:37.000 But I will say also, just to also notice that everywhere there's a way in which a small move in a direction can be shown to lead to another big boogeyman and that boogeyman makes us angry, social media is upregulating the meaning of everything to be its worst possible conclusion.
00:42:53.000 So like a small move by the government to do X might be seen as this is the first step in this total thing.
00:42:57.000 I'm not saying that they're not going to go do that.
00:42:59.000 I'm worried about that too.
00:43:00.000 But to also just notice the way that social media amplifies the degree to which we all get kind of reactive and triggered by that.
00:43:07.000 The thing I think is worth mentioning is what China is doing regarding its internet because it's seeing real problems.
00:43:15.000 And we might not like their solution.
00:43:16.000 We might want to implement a solution that has more civil liberties than theirs does.
00:43:19.000 Let's explain what they're doing.
00:43:21.000 Yeah.
00:43:21.000 So I'll do it quickly.
00:43:22.000 So it's quite literally as if Xi Jinping saw The Social Dilemma, because they've, in the last two months, rolled out a bunch of sweeping reforms that include things like if you're under the age of 14 and you use Douyin,
00:43:38.000 which is their version of TikTok, when you swipe the videos, instead of getting like the influencer dancing videos and soft pornography, you get science experiments you can do at home, museum exhibits, and patriotism videos.
00:43:51.000 So you're scrolling and you're getting stuff that's educating because they want their kids to grow up and want to be astronauts and scientists.
00:43:57.000 They don't want them to grow up and be influencers.
00:43:59.000 And when I say this, by the way, I'm not, just to be clear, I'm not praising that model, just noticing all the things that they're doing.
00:44:04.000 Well, I'll praise it.
00:44:06.000 If you're going to influence people, that's a great way to do it.
00:44:08.000 They also limit it to three hours, sorry, 40 minutes a day on TikTok.
00:44:13.000 For gaming, let me actually do the TikTok example.
00:44:16.000 So they do 40 minutes a day for TikTok.
00:44:18.000 They also, when you scroll a few times, they actually do a mandatory five-second delay saying, hey, do you want to get up and do something else?
00:44:26.000 Because when people sit there, infinitely scroll.
00:44:29.000 Even Tim Cook recently said, mindless scrolling, which is actually invented by my co-founder of the Center for Humane Technology, Aza Raskin.
00:44:35.000 He was in The Social Dilemma.
00:44:36.000 He's the one who invented that infinite scroll thing.
00:44:38.000 China said, hey, we don't want people mindlessly scrolling.
00:44:41.000 So after you scroll a few videos, it does a mandatory five-second interlude.
00:44:45.000 They also have opening hours and closing hours.
00:44:48.000 So from 10 p.m.
00:44:50.000 until 6 in the morning, if you're under 14, it's like it's closed.
00:44:54.000 Meaning one of the problems of social media for teenagers is if I'm not on at one in the morning but all my friends are on and they're still commenting on my stuff, I feel the social pressure.
00:45:03.000 I'm going to be ostracized if I don't participate.
00:45:05.000 And if your notifications are on, your phone keeps buzzing.
00:45:07.000 Totally.
00:45:07.000 And even if they're not on, it's like, oh, but I want to see if they said something about my thing.
00:45:11.000 And so it's what we call a multipolar trap.
00:45:14.000 If I don't participate but the other guys are, I'm going to lose out.
00:45:17.000 And Facebook and these companies, they know that, by the way.
00:45:19.000 Even Netflix said their biggest competitor is sleep.
00:45:22.000 So one of the...
00:45:23.000 Because they're all competing for attention.
00:45:25.000 So when you do this mandatory thing where you say we're going to close from 10 p.m.
00:45:29.000 to 6 in the morning, suddenly everyone, if you're in the same time zone (which is another important side effect), can't use it at the same time.
00:45:36.000 So these are some examples.
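[Editor's note: none of Douyin's actual enforcement code is public, so purely as an illustration of the rules just described (a daily cap for minors, a forced pause every few videos, and a 10 p.m. to 6 a.m. curfew), here is a minimal sketch. All names and numbers are assumptions for the example, not the real implementation.]

```python
from datetime import datetime, time

# Illustrative policy values based on the restrictions described above.
DAILY_LIMIT_MINUTES = 40                               # cap for minors
CURFEW_START, CURFEW_END = time(22, 0), time(6, 0)     # 10 p.m. to 6 a.m.
SWIPES_BEFORE_PAUSE = 5                                # forced interlude after a few videos
PAUSE_SECONDS = 5

def allowed_to_watch(age: int, minutes_used_today: int, now: datetime) -> bool:
    """Return False if a minor has hit the daily cap or it's curfew hours."""
    if age >= 14:
        return True
    if minutes_used_today >= DAILY_LIMIT_MINUTES:
        return False
    t = now.time()
    in_curfew = t >= CURFEW_START or t < CURFEW_END    # window wraps past midnight
    return not in_curfew

def interlude_needed(swipes_since_last_pause: int) -> int:
    """Return how many seconds of mandatory pause to show, if any."""
    return PAUSE_SECONDS if swipes_since_last_pause >= SWIPES_BEFORE_PAUSE else 0
```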
00:45:37.000 For their military, by the way, if you're a member of the Chinese PLA army, you get a locked-down smartphone.
00:45:44.000 It's like a light phone.
00:45:45.000 It's like hyper-locked down.
00:45:46.000 You can't do anything.
00:45:47.000 By contrast, we know that Russia and China go into our...
00:45:53.000 Right.
00:46:00.000 Right.
00:46:08.000 Take the people who have real tactical capability and radicalize them.
00:46:13.000 And so target those groups in particular.
00:46:15.000 And it makes sense why their military wants to lock down the ability for external influence.
00:46:21.000 Of course.
00:46:22.000 So while we're spending all this money building physical borders, building walls, or spending $50 billion a year on the passport controls and the Department of Homeland Security and the physical...
00:46:32.000 You know, if Russia tries to fly a plane into the United States, we've got Patriot missiles to shoot it down.
00:46:37.000 But when they try to fly in a precision-guided information bomb, instead of responding with Patriot missiles, we respond with a white-glove Facebook algorithm that says, which zip code or Facebook group would you like to target?
00:46:50.000 So it changes the asymmetries.
00:46:52.000 Typically, what made the U.S. powerful was the geographic...
00:46:55.000 We had these huge oceans on both sides.
00:46:58.000 It gives us a unique place in the world.
00:46:59.000 When you move to the digital world, it erases that geographic asymmetry of power.
00:47:03.000 So this is an imminent national security threat.
00:47:06.000 This is not just like, hey, social media is adding some subtle pollution in the form of mental health, or hey, it's adding a little bit of polarization, but we can still get things done.
00:47:13.000 It's an imminent national security threat to the continuity of our model of governance, which we want to keep.
00:47:19.000 Have you spoken to people in power?
00:47:20.000 Have you spoken to congresspeople about this?
00:47:23.000 Yes, but I'm hoping many more of them watch this, because I think people need to see the full scope.
00:47:27.000 And I really do want to make sure we're not sounding like just full disaster porn, because we want to get to the point— Don't worry about that.
00:47:32.000 Go full disaster porn.
00:47:34.000 Better that than not.
00:47:36.000 It's not meant to scare people.
00:47:37.000 Just to get an appraisal of what is the situation that we're in.
00:47:41.000 It's going to scare—the reality is going to scare people.
00:47:42.000 Reality is scary.
00:47:43.000 It should scare people, because we're so far behind the eight ball.
00:47:47.000 There's a really important point Tristan was just getting at that we actually need to double-click on, which is that democracies are more affected by what's happening with social media than authoritarian nations are, and for a number of reasons, but do you want to… And we sort of hinted at it earlier,
00:48:03.000 but when social media's business model is showing each tribe their boogeyman, their extreme reality, it forces a more polarized political base, which means to get elected, you have to say something that's going to appeal to a base that's more divided.
00:48:18.000 And in the Facebook files that Frances Haugen put out, they showed that when Facebook changed the way its ranking system worked in 2018 to something called meaningful social interactions, I won't go into the details, they talked to political parties in Europe.
00:48:32.000 So here we are.
00:48:32.000 It's 2018. They do an interview with political parties in Poland and Hungary and Taiwan and India.
00:48:38.000 And these political parties say, Facebook, we know you changed your ranking system.
00:48:42.000 And Facebook like smugly responds, yeah, everyone has a conspiracy theory about how we change our ranking system because those stories go viral.
00:48:50.000 And they're like, no, no, no, we know that you changed how your ranking system works.
00:48:54.000 Because we used to be able to publish, here's a white paper on our agriculture policy to deal with, like, soil degradation.
00:48:59.000 And now, when we publish the white paper, we get crickets.
00:49:02.000 We don't get any response.
00:49:03.000 And we tested it, and the only thing that we get traffic and attention on is when we say negative things about the other political parties.
00:49:10.000 And they say, we know that's bad.
00:49:12.000 We don't want to do that.
00:49:13.000 We don't want to run our campaign that's about saying negative things about the other party.
00:49:16.000 But when you change the algorithm, that's the only thing we can do to get attention.
00:49:20.000 It shows how central the algorithm is to everything else.
00:49:23.000 If I'm Tucker Carlson or Rachel Maddow or anybody who's a political personality...
00:49:27.000 Are they really saying things just for their TV audience?
00:49:30.000 Are they also appealing to the algorithm?
00:49:32.000 Because more and more of their attention is going to happen downstream in these little clips that get filtered around.
00:49:37.000 So they also need to appeal to how the algorithm is rewarding saying negative things about the other party.
00:49:42.000 So what that does is it means you elect a more polarized representative class that's based on disagreeing with the other side and being divided about the other side, which means that it throws a wrench into the gears of democracy and means that democracy stops delivering results.
00:49:56.000 In a time where we have more crisis, we have more supply chain stuff and inflation and all these other things to respond to, and instead of responding effectively, it's just division all the way down.
00:50:05.000 But it's been division from the jump even long before there was social media.
00:50:09.000 So all social media is doing...
00:50:10.000 It's putting gasoline on it.
00:50:11.000 Yeah.
00:50:11.000 They're taking advantage of a trend that already existed.
00:50:14.000 It's not like...
00:50:15.000 But my opponent is reasonable.
00:50:17.000 Right.
00:50:17.000 But I feel like I'm just a better choice.
00:50:19.000 And you could disagree because he's a great guy.
00:50:21.000 But this is how I feel.
00:50:22.000 No one's doing that.
00:50:23.000 Totally.
00:50:24.000 Totally.
00:50:24.000 But notice though in this 2018 example how specific the change was.
00:50:28.000 Those political parties before 2018, they could get elected in those countries because they hadn't gone as partisan maybe as we were yet.
00:50:34.000 Yeah.
00:50:34.000 They could have gotten elected and getting attention by saying, here's a white paper about agriculture policy.
00:50:39.000 But after 2018, the algorithm has the master say.
00:50:42.000 Everyone has to appeal to the algorithm.
00:50:44.000 If I'm a small business, I have to appeal to the algorithm.
00:50:46.000 If I'm a newspaper, do I just like write the articles I want to write or the investigative stories, the fourth estate that we need for democracy to work?
00:50:52.000 No, I have to write the clickbait title that's going to get attention.
00:50:55.000 So I have to exaggerate and say Joe Rogan just takes horse dewormer because that's going to get more attention than saying he took ivermectin.
00:51:01.000 Particularly in this world where no one's buying paper anymore.
00:51:05.000 Correct.
00:51:05.000 Everyone's buying everything, clicking online, so you really – and very few people are even subscribing, so you have to give them these articles and then have these ads in the articles.
00:51:14.000 And those publishers – and that's also driven by the business models of these central tech companies, especially Facebook, Twitter, and Google.
00:51:21.000 There's two feedback loops that he just mentioned.
00:51:24.000 Politically, if you have Facebook and other platforms like this polarizing the population, then the population supports a more polarized representative class.
00:51:33.000 But the representatives to be elected are doing political ads and so the political ads then further polarize the population.
00:51:41.000 So now you have this feedback loop and then the same is also true with media.
00:51:44.000 The media has to – meaning newspapers, television.
00:51:49.000 Still has to do well on the Facebook algorithm because more and more there's a monopoly of attention happening there and it's someone seeing a clip there that has them decide to subscribe to that paper or keep subscribing to it or whatever it is.
00:52:00.000 So you end up having the algorithm radicalizing what people want to pay attention to where then the sources of broadcast have to appeal to that, which then in turn further radicalizes the population.
00:52:12.000 So these are runaway feedback loops.
00:52:17.000 And what's the solution?
00:52:39.000 There's obviously many steps to this, right?
00:52:41.000 So once you've kind of let this cancer sort of spread, if you take out the thing that was causing the cancer, we've now already pre-polarized everyone's beliefs.
00:52:51.000 Like when you say, what's the solution to all this?
00:52:52.000 Like all of our minds are running malware.
00:52:55.000 Like, we're all running bad code.
00:52:56.000 We're all running confirmation bias.
00:52:58.000 Except no one thinks that they are.
00:53:00.000 Right.
00:53:00.000 Everyone thinks the other ones are, but not me.
00:53:02.000 But the point is that all of us, like, it doesn't matter, like, people on all sides of the political aisles and all tribes, we've all been shown our version of the boogeyman, our version of the inflated thing that got our attention, and then made us focus on that and then make us double down and go into those habits of those topics being the most important.
00:53:20.000 And so we have to realize that.
00:53:22.000 I almost think we need a shared moment for that.
00:53:24.000 I wish The Social Dilemma was a little bit more of a...
00:53:26.000 It was a shared moment, but I think there's almost like a truth and reconciliation moment that we need to unwind our minds from the cult factory.
00:53:35.000 Because it's a cult factory that found each of the little tribes and then just sucked them together and made them in a self-reinforcing chamber.
00:53:42.000 Let's say we take any issue that some people care about and think is central, whether we take social justice or climate change or US-China relations.
00:53:51.000 If half of the population thinks that whatever – half the population has a solution they want to implement, carbon taxes or whatever.
00:53:59.000 The other half of the population is polarized to think that that is going to mess everything up.
00:54:05.000 So that other half are still political actors and they're going to escalate how they counter that.
00:54:11.000 How do you get enough cooperation to get anything done especially where there are real issues and not just have all the energy become waste heat?
00:54:18.000 In an autocracy, let's take China as an example, where you don't have so much internal dissent.
00:54:25.000 You don't have that issue.
00:54:26.000 So you can actually do long-term planning.
00:54:29.000 So one of the things that we see is we have decreasing ability to make shared sense of the world.
00:54:35.000 In any kind of democratic society, if you can't make shared sense of the world, you can't act effectively on issues.
00:54:40.000 But the tech – the types of tech that are decreasing our ability to make shared sense of the world are also increasing the speed at which tech is changing the world and the total consequentiality of it.
00:54:53.000 That's one way to start to think about like this bowling alley example is We're having faster and faster, more and more profound consequential effects and less and less ability to make sense of it or do anything about it.
00:55:04.000 So underneath the AI issue, the CRISPR issue, the US-China issue, the how do we regulate markets issue, the how do we fix the financial crisis issue is can we make sense of anything collectively, adequately to be able to make choices effectively in the environment we're in and that's underlying it.
00:55:24.000 Tristan was laying out that you got these two gutters, right?
00:55:27.000 You've got decentralized catastrophe weapons for everyone if we don't try to regulate the tech in some ways and that world breaks or to say if we don't want decentralized catastrophe weapons for everyone, maybe we do something like the China model but where you have ubiquitous surveillance and that's a dystopia of some kind.
00:55:44.000 So either you centralize the power and you get dystopias or it's decentralized and you get catastrophes and right now – The future looks like one of those two attractor states, most likely.
00:55:53.000 Catastrophes or dystopias.
00:55:54.000 We want a third attractor.
00:55:56.000 How do you have a world that has exponential tech that doesn't go catastrophic, where the control mechanisms to keep it from going catastrophic aren't dystopic?
00:56:04.000 And by the way, we're not here saying like, go buy our thing or we've got a new platform.
00:56:08.000 This is just about describing what is that center of that bowling alley that's not the gutters that we can skate down.
00:56:16.000 The closest manifesting example of this so far, though there's one more construction to do, I think, is Taiwan.
00:56:24.000 Because Taiwan, actually, I think I talked about it last time we were here, is a...
00:56:30.000 They've got this digital minister, Audrey Tang, who has been saying, how do you take a democracy and then use technology to make a stronger democracy?
00:56:40.000 So you can look right now at the landscape.
00:56:42.000 We can notice that China, countries like China, autocratic countries, We're good to go.
00:57:06.000 By contrast, open societies, democracies, Western democracies, are not consciously saying, hey, how do we take all of this tech and make a stronger democracy?
00:57:17.000 How do we have tech plus democracy equals stronger democracy?
00:57:20.000 One of the other reasons I wanted to talk to you.
00:57:22.000 So far, I think the tech reform conversation is like, how do we make social media like 20% less toxic and then call it a day?
00:57:29.000 Or like take a mallet and break it up and then call it a day?
00:57:31.000 That's not enough when you understand the full situation assessment that we're kind of laying out here of the skating down the middle of the bowling alley.
00:57:38.000 The thing is, we need something that competes with that thing, because otherwise that thing is going to outperform.
00:57:43.000 The China autocratic model is going to outcompete a, you know, democracy plus social media that's just 20% less toxic; that isn't going to outcompete it.
00:57:53.000 Well, ultimately in the long run it's going to.
00:57:55.000 But what's fascinating is they're willing to forego any sort of profits that they would have from these children from 10 p.m.
00:58:02.000 to 6 a.m.
00:58:03.000 in order to make a more potent society of more influential, more educated, more positive people that are going to contribute to society.
00:58:14.000 This is something that I think you can only do if you have this inexorable connection between the government and business.
00:58:21.000 And that's something that they have with corporations and with the CCP over there.
00:58:24.000 They have this ability because they're completely connected.
00:58:29.000 Like the government is- What did the senator tell us about China's risk?
00:58:33.000 Oh, yeah.
00:58:33.000 This is a great point.
00:58:34.000 We were talking with a sitting senator who was saying – or at some national security conference – talking to a foreign minister of a major EU country and said – Who do you think the CCP, the Chinese Communist Party, considers to be the greatest rival to its power?
00:58:50.000 You would say the United States, right?
00:58:53.000 Right.
00:58:54.000 It's not the United States.
00:58:55.000 They consider their own technology companies to be the greatest threat to their power.
00:59:01.000 Oh, so that's why when someone like Jack Ma steps out of the line, they lock him up in the brig for a few months and shut his mouth.
00:59:06.000 Notice that cryptocurrency, oh, that's a threat to our financial system.
00:59:11.000 Oh, Bitcoin specifically.
00:59:13.000 Oh, TikTok.
00:59:14.000 That's a threat to the mental health of our kids.
00:59:16.000 Oh, Facebook.
00:59:17.000 We don't want that in our country.
00:59:18.000 That would open up our military to foreign hacking.
00:59:20.000 So they see correctly that technology is the new source of power of basically what guides societies.
00:59:27.000 It is the pen that is writing human history.
00:59:29.000 And it doesn't have, if you let just for-profit motives, again, coupled with like, how do I get as much attention out of people as possible in the race to the bottom of the brainstem to suck it out of people, that thing doesn't work with societies.
00:59:41.000 That breaks it.
00:59:41.000 So they see that appropriately and then say, let's do something about it.
00:59:44.000 Now, the cynical view is obviously they're a communist country that's just, you know, just doing their thing.
00:59:48.000 That's a cynical perspective.
00:59:49.000 But a post-cynical perspective is they're also appropriately recognizing that there's a certain threat that comes with allowing unregulated technology.
00:59:58.000 So...
01:00:00.000 One way to think about this, Tristan was just saying that they recognize the power of new technologies and the need to be able to employ them if they want to be effective.
01:00:10.000 We can see how much the world responded, how much the US responded to the possibility of a nuclear bomb with the Manhattan Project, just even the possibility that the Germans would get it and how that would change everything asymmetrically.
01:00:22.000 And so we make basically an indefinite black budget, find all the smartest scientists in the world because that much asymmetry of tech will determine who runs the world.
01:00:31.000 It's important to also say there are some people who will have just like a knee-jerk reaction that says, oh, you guys are just being catastrophic.
01:00:39.000 Yeah, you guys are just trying to scare us, disaster porn.
01:00:41.000 There have always been these risks.
01:00:42.000 We always come through them.
01:00:43.000 Really, until World War II and the bomb, there was no way for us to actually mess up the habitability of the world writ large.
01:00:50.000 We could mess up little local things.
01:00:51.000 In fact, that happened.
01:00:52.000 Most previous civilizations did go...
01:00:54.000 Like, did go extinct for different reasons.
01:00:57.000 But World War II was the first time we had truly globally catastrophic tech, and we had to build an entire world system, mutually assured destruction, the Bretton Woods world, the IGO world, to basically not use that tech.
01:01:10.000 Well, now, that was basically the first catastrophe weapon, and then we had only two superpowers that had it, so you could do mutually assured destruction, and it's really hard to...
01:01:19.000 To enrich uranium and make nukes.
01:01:20.000 It's not hard to do these types of tech, right?
01:01:23.000 That's the whole point and we have now dozens of catastrophe weapons, many dozens of actors including non-state actors who have them.
01:01:30.000 And so we're like, oh, we're in a truly new phase.
01:01:32.000 This isn't the same as it's always been.
01:01:34.000 We're in a novel time of risk.
01:01:36.000 And the exponential technologies with kind of computation at the center, AI and these other ones we're talking about, are so much more powerful than all forms of legacy power that only the groups that are developing and deploying exponential tech will influence the future.
01:01:51.000 That's like the big story.
01:01:53.000 And then we would say, well, which groups are developing and deploying exponential tech?
01:01:56.000 Well, China is.
01:01:58.000 Autocratic nations are.
01:02:00.000 Facebook is, Google is, like major corporations that are also top-down, non-democratic systems are and they're becoming – like Facebook has 3 billion people.
01:02:09.000 The US has 300 million people, right?
01:02:11.000 We're talking about something that has a global scale of influence but is really a top-down system.
01:02:16.000 A corporation, though.
01:02:17.000 So you either have corporations that are wielding the power of all this technology for mass behavior modification, surveillance of everyone, perfect sort of understanding of their psychological traits, and then moving them at that scale.
01:02:28.000 But in the big tech corporation model, they're doing it for a for-profit motive, whereas in the CCP model, they're doing it for...
01:02:36.000 But neither of them are democratic.
01:02:38.000 Neither of them has some kind of participatory governance, jurisprudence of, for, and by the people, and the open societies are not innovating on how do we develop and deploy exponential tech in an open-society way?
01:02:50.000 That's fundamentally what we're saying has to be like the central imperative of the world right now.
01:03:07.000 Well, the simple way is you don't, right?
01:03:14.000 The simple way is you lock things down and become an autocratic – yeah.
01:03:18.000 So you either beat China by becoming China or you figure out a third way.
01:03:22.000 We'd like to see there be a third way.
01:03:24.000 I'd like to see a third way too, but I don't see it.
01:03:27.000 That's what's terrifying to me.
01:03:29.000 A little more about Taiwan is actually worthwhile.
01:03:31.000 We're moving in the direction of China more than we're moving in the direction of some new utopia.
01:03:38.000 Currently, yes.
01:03:39.000 Yes.
01:03:39.000 Right.
01:03:40.000 So what about Taiwan?
01:03:41.000 You can't even mention that.
01:03:43.000 See what happened with John Cena?
01:03:44.000 No, what happened?
01:03:45.000 You didn't see that?
01:03:46.000 Yeah.
01:03:46.000 John Cena, there was an opening weekend for Fast and the Furious 9, I believe, and John Cena accidentally or inadvertently said that Taiwan is going to be the first country that sees the movie.
01:04:02.000 Well, China doesn't recognize Taiwan as a country, and if you want to do business with China, you can't say that.
01:04:08.000 That was on full display, and it made people very skeptical of the World Health Organization when one of their spokespeople was having a conversation with a journalist.
01:04:17.000 When she brought up Taiwan's response, and other countries have done it like this, but Taiwan's response, and he disconnected his line.
01:04:24.000 Oh, I did see that.
01:04:25.000 Did you see that?
01:04:25.000 Yeah.
01:04:25.000 And then came back on and glossed over it very quickly.
01:04:27.000 He said, China's done a wonderful job.
01:04:29.000 Let's move on.
01:04:30.000 And she was like, but Taiwan, and he's like, China's amazing, and China this and China that.
01:04:34.000 Well, John Cena, by saying that Taiwan was the first country that was going to see Fast and the Furious, pissed off China.
01:04:42.000 And then John Cena made a video where he spoke Mandarin.
01:04:47.000 And in it is like this weird video.
01:04:49.000 You should watch it because you haven't seen it.
01:04:50.000 Let's show it to him.
01:04:52.000 Show it to him.
01:04:52.000 Because he's apologizing to China in the weirdest way, saying, I really, really respect China and I'm so sorry and I made a mistake.
01:05:01.000 I was very, very tired.
01:05:02.000 So this is the perfect example of a kind of dystopia that we don't want to go to a future where people are all accommodating or can't feel or think their actual thoughts because they have to appeal to some source of power.
01:05:14.000 Exactly, and the source of power is financial, because $160 million was the opening weekend for Fast and the Furious 9, and $134 million of it came from China.
01:05:31.000 Saying I had many, many interviews, and one of them...
01:05:36.000 I made a mistake.
01:05:40.000 Everyone asked me if I can use Chinese.
01:05:46.000 People at Fast and the Furious 9 gave me lots of interview information.
01:05:55.000 I made a mistake.
01:05:59.000 I have to say right now, it's so, so, so, so, so, so important.
01:06:04.000 I love and respect China and Chinese people.
01:06:09.000 I'm so, so sorry for my mistake.
01:06:14.000 I'm sorry, I'm sorry, I'm very sorry.
01:06:18.000 You have to understand, I love and respect China and Chinese people.
01:06:26.000 I'm sorry.
01:06:28.000 That's it.
01:06:29.000 He doesn't even say what he's sorry about.
01:06:31.000 Right.
01:06:31.000 I mean, this is wild shit.
01:06:33.000 Yeah.
01:06:33.000 When you see this guy who is, you know, one of our big major action movie stars.
01:06:39.000 Right.
01:06:40.000 Just on his knees apologizing to China.
01:06:43.000 Right.
01:06:44.000 Who hadn't said anything bad about China.
01:06:46.000 Not at all.
01:06:47.000 Right.
01:06:47.000 All he did was say Taiwan is a country, which you can't say.
01:06:51.000 Right.
01:06:52.000 And if someone talks about something that's not the mainstream narrative in the tech companies currently...
01:06:56.000 I can't believe you never saw that.
01:06:57.000 I think I'd seen it in like a John Oliver video or something briefly, so I had a taste of it, but yeah.
01:07:02.000 That's hilarious.
01:07:03.000 That's where you get your news?
01:07:05.000 I don't know, but I was working out sometime.
01:07:08.000 So this is a good example of why we don't want to live in dystopias where our thought and our ideas and our free expression, and our ability to figure out what's true in an open-ended way because we don't know what's true, get shut down.
01:07:20.000 Right.
01:07:21.000 We need to protect that.
01:07:22.000 Remember last time I ended our conversation talking about Orwellian dystopias and Huxleyian dystopias, that quote about amusing ourselves to death?
01:07:29.000 Orwell feared a world where we would ban books and censor information.
01:07:33.000 Huxley feared a world where we'd be drowned in irrelevance and distraction.
01:07:37.000 Right?
01:07:38.000 So that's kind of another version of the two gutters.
01:07:41.000 Right now, we're kind of getting a little bit of both, right?
01:07:43.000 We're getting a little bit of, hey, we don't like the way that the companies are doing this sort of censorship or platforming, deplatforming of people.
01:07:51.000 We also don't want the unregulated, like, virality machines where the craziest stuff and the most controversial stuff that confirms our biases goes viral because both those things break society.
01:08:01.000 Right.
01:08:02.000 So let's get back to that again.
01:08:03.000 What's the solution?
01:08:04.000 Well, let me just give one narrow solution for that one, because it's funny: Frances, in her own testimony, says Facebook wants you to believe in false choices between free speech and censorship.
01:08:13.000 There is a solution that Facebook themselves knows about for this particular problem.
01:08:18.000 Which is actually just to remove the reshare button, basically the retweet button, the reshare button.
01:08:25.000 What they found in their own research... Facebook spent something like a billion dollars, multibillion dollars, on integrity, content moderation, all that stuff.
01:08:34.000 And they said in their own research it would be more effective than the billions of dollars they spent on content moderation to just remove the reshare button after people click it twice.
01:08:46.000 In other words, you can hit reshare on a thing and it goes to all your friends.
01:08:49.000 And then all those friends, they still see a reshare button and they can click reshare and it goes to all their friends.
01:08:56.000 After that, there's no reshare button.
01:08:58.000 If you just remove the instant frictionless, like, make my nervous system twitch, and then boom, I'm resharing it to everybody.
01:09:05.000 If you just remove that one thing, you keep freedom of speech, but you kill an irresponsible reach, just like instant reach for everyone.
01:09:13.000 But you also kill the ability to retweet something or reshare something that's interesting.
01:09:18.000 You could still copy and paste a thing and share it.
01:09:20.000 You can still do that.
01:09:21.000 Yeah.
01:09:21.000 Okay.
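[Editor's note: to make the mechanism concrete, the proposal as described isn't to remove sharing, it's to stop showing the one-click reshare button once a post is already two hops from its author. A minimal sketch of that rule follows; the field names are illustrative assumptions, not Facebook's actual data model.]

```python
MAX_RESHARE_DEPTH = 2   # original post = depth 0; two one-click hops allowed

def show_reshare_button(post_depth: int) -> bool:
    """Hide the frictionless reshare button beyond two hops.

    People can still copy a link and post it themselves (speech is untouched);
    what goes away is instant, unlimited one-click reach."""
    return post_depth < MAX_RESHARE_DEPTH

def reshare(post: dict, user_id: str) -> dict:
    """Create a reshared copy one hop deeper than the post it came from."""
    depth = post.get("depth", 0)
    if not show_reshare_button(depth):
        raise PermissionError("Copy the link to share this further.")
    return {"author": user_id, "source": post["id"], "depth": depth + 1}
```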
01:09:22.000 But I think that we have to ask, there's a story about Steve Jobs that I've referenced several times.
01:09:26.000 Someone, you'd appreciate this because it's about podcasts.
01:09:29.000 Someone showed him the latest version of the podcast app and someone wanted to make on the iPhone.
01:09:34.000 This is on the iPhone early days.
01:09:36.000 And they're like, what if we made it so in the podcast app we had a reshare button and you could see a feed of all the stuff that your friends were looking at?
01:09:44.000 Yeah.
01:09:44.000 I mean, it sounds like kind of, it's just like social media.
01:09:46.000 And it would be really engaging and people would get sucked into podcasts.
01:09:49.000 But Steve Jobs' response was, no.
01:09:52.000 If something is truly that important and truly that meaningful, someone will copy and paste it as a link and be like, you got to check out this interview with Joe Rogan, which I hope people do with this episode, because it's crossing a threshold of significance.
01:10:07.000 Of what is truly worth our undivided attention, which is also the name of our podcast.
01:10:10.000 We call it that.
01:10:11.000 As opposed to just publicizing something and spending a bunch of money or doing a bunch of PR work.
01:10:16.000 Or creating influencer culture or just rewarding, again, the controversy and the conspiracy theory.
01:10:21.000 And again, I shouldn't use that phrase because it sounds like you're always one-sided on it.
01:10:24.000 But just the most kind of aggressive take on anything, the most cynical take on everything being rewarded.
01:10:30.000 Nowhere is it written that a virality-based information ecosystem, where you have the...
01:10:35.000 People are familiar now with the metaphor of like a lab that's doing gain-of-function research and the idea that something could get leaked out of the lab.
01:10:42.000 Just like that as a metaphor.
01:10:43.000 Well, today we have the TikTok Institute of Virology.
01:10:46.000 We have the Zuckerberg Institute of Virology.
01:10:48.000 And they're testing what makes the memes go as viral as possible, right?
01:10:52.000 They're like, hey, if we make it have this photo or if we present it this way and if we have these reshare buttons.
01:10:57.000 Except their goal is to create these memetic pandemics.
01:10:59.000 Their goal is to have every idea...
01:11:02.000 Especially the ones that most excite your nervous system and your lizard brain go as viral as possible.
01:11:06.000 And then you can even say that the Zuckerberg Institute of Virology released this memetic virus and it shut down the global democratic world, because now we don't have shared sense-making on anything.
01:11:17.000 But we do if everyone's intelligent and objective and they just, you know, they don't use the reshare button for nonsense.
01:11:25.000 The problem is that people are, you know, we're impulsive and we also don't spend a lot of time researching a lot of things that we read.
01:11:34.000 We prefer to be smug and be right.
01:11:36.000 And they know that and they prey on that.
01:11:39.000 They prey on, hey, you're right.
01:11:41.000 You should totally re-share that thing.
01:11:43.000 You should feel great about it.
01:11:44.000 Well, it's also, you know, things are, if true, spectacular.
01:11:48.000 And it takes oftentimes hours to find out something is true or not true.
01:11:54.000 Or we don't even know.
01:11:56.000 Right.
01:11:57.000 It's very difficult to even trust.
01:11:59.000 Actually, talk about how social media kills science.
01:12:01.000 You want to do that construction?
01:12:03.000 Well, scientific bias, right?
01:12:04.000 We talked about that.
01:12:07.000 If someone has kind of an emotional bias towards what they already generally think is true, even if they're following the scientific method well, what experiment they decide to do, what they go looking for will be influenced because there's a lot of things to look at, right?
01:12:22.000 Am I trying to do science on natural supplements versus on drugs versus on vaccines versus like I'll have an intuition that is the basis of a hypothesis or a conjecture that then I'll do the scientific method on.
01:12:33.000 Intuition can be biased.
01:12:38.000 So a point that Tristan was saying earlier that I think is really important is that this model of Facebook and it is not – and it's not just Facebook.
01:12:46.000 It's TikTok.
01:12:46.000 It's all of the things that do this kind of attention harvesting, which ends up – it doesn't intend to polarize.
01:12:54.000 It's a byproduct.
01:12:55.000 It's a second-order effect, an unintended consequence.
01:12:58.000 But in the way that like the unintended consequence of – Cigarettes had an externality which was lung cancer and oil companies had oil spills and so then government had to regulate it.
01:13:09.000 These companies are fundamentally different because their externality is a polarized population which in a democracy decreases the capacity of government directly.
01:13:19.000 So the big oil companies and big pharma companies and whatever can do lobbying and campaign budget support and whatever they can to affect government.
01:13:27.000 I think?
01:13:45.000 And relative to their own internal tech issues, right?
01:13:47.000 And their own internal domestic issues.
01:14:08.000 What's interesting is that Instagram doesn't have a share feature.
01:14:11.000 I just realized that.
01:14:13.000 The share feature isn't actually the key.
01:14:15.000 TikTok is not really emphasizing shares.
01:14:17.000 But Facebook is.
01:14:18.000 Facebook is, but the key is virality, right?
01:14:21.000 So share is one way to get virality.
01:14:23.000 TikTok is just looking at engagement and upregulating the things that get most engagement.
01:14:27.000 This is actually a key point, because let's say that we tried to make a piece of legislation based on thinking it was about shares.
01:14:32.000 Then Facebook would just move to the TikTok algorithm.
01:14:35.000 Just by what you look at the most, that thing gets re-shared to other people.
01:14:38.000 Maybe viral based on unconscious signals as opposed to explicit click signals.
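[Editor's note: the distinction here is between explicit signals (a share click) and implicit ones (how long you linger, whether you rewatch), either of which a feed can use to decide what to amplify. A toy sketch of an engagement-weighted score, with invented weights purely for illustration and not any platform's real formula:]

```python
def engagement_score(watch_seconds: float, rewatches: int,
                     likes: int, shares: int) -> float:
    """Toy ranking score: implicit signals (watch time, rewatches) can drive
    amplification even if explicit shares were regulated away."""
    return 0.5 * watch_seconds + 2.0 * rewatches + 1.0 * likes + 3.0 * shares

# Even with shares forced to zero, the item that holds attention longest
# still floats to the top of the feed.
candidates = [
    {"id": "calm_explainer", "watch": 12, "rewatch": 0, "likes": 3, "shares": 0},
    {"id": "outrage_clip",   "watch": 45, "rewatch": 2, "likes": 9, "shares": 0},
]
ranked = sorted(candidates, reverse=True,
                key=lambda c: engagement_score(c["watch"], c["rewatch"],
                                               c["likes"], c["shares"]))
```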
01:14:42.000 So there's a deeper point here, which is not, is there a piece of regulation that we can put in, even if we trust the government to do that?
01:14:48.000 Let's even say we had a very trustworthy government.
01:14:51.000 It's, can the government regulate at the speed that tech can outmaneuver it?
01:14:57.000 And here's the other question.
01:14:58.000 If you didn't have any algorithms whatsoever, wouldn't you be now open to being manipulated by troll farms just simply by volume?
01:15:07.000 You know, if they have 100,000 accounts at each individual location, they have 100,000 locations, and they're just pumping out different Instagram pages and TikTok.
01:15:17.000 And we don't really even know how many they actually have, because they discover them.
01:15:20.000 Facebook shuts down 2 billion fake accounts per quarter.
01:15:24.000 I'm sure they get all of them.
01:15:25.000 Holy shit!
01:15:26.000 And this is before AI. Yeah.
01:15:29.000 Before the GPT-3 stuff. And you're obviously being sarcastic by saying they get all of them.
01:15:33.000 They don't.
01:15:33.000 No, no, yeah, exactly.
01:15:34.000 Yeah, just to let people know they were totally paying attention.
01:15:37.000 That's insane.
01:15:38.000 Two billion per quarter.
01:15:39.000 Fake accounts.
01:15:40.000 Yep.
01:15:41.000 And did they...
01:15:43.000 Is there a centralized area where these are coming from?
01:15:46.000 Is it all Russian troll farms?
01:15:48.000 Are they political ones that are used against opponents?
01:15:51.000 One of the problems is that we don't know.
01:15:54.000 Actually, that's not true.
01:15:55.000 Facebook has been...
01:15:56.000 I really want to celebrate all the positive moves they make, by the way.
01:15:58.000 This is not...
01:15:59.000 Just so I'm clear and we're clear.
01:16:01.000 This is about finding what's an earnest solution to these problems.
01:16:03.000 All the times they make great decisions that are moving in the positive direction.
01:16:06.000 We need to celebrate that.
01:16:07.000 They do these quarterly reports, I think, called information quality reports.
01:16:12.000 They publish every quarter how many accounts they take down.
01:16:15.000 But it's just like a PDF. They're just putting out a post as opposed to letting external researchers know, for example, in each country, what are the top 10 stories that go viral?
01:16:24.000 How do they find out that someone's a troll post?
01:16:27.000 I don't know.
01:16:28.000 I mean, there's classifiers that they build.
01:16:30.000 There's activity.
01:16:30.000 You can tell.
01:16:31.000 Usually, if you've ever had this happen, you use Facebook and you click around a bunch, and then it says, like, you look like you're clicking around too much.
01:16:36.000 Have you ever gotten one of those messages?
01:16:37.000 No.
01:16:39.000 Occasionally, a person can trigger a thing that makes it...
01:16:42.000 They're like, are you real?
01:16:43.000 And they're trying to figure out if you're real.
01:16:45.000 If you click around a lot, you're not real?
01:16:47.000 Because these bots...
01:16:48.000 A lot of what they're doing is they're going around harvesting information.
01:16:50.000 So they want to click around various profiles, and they download the information, and there's ways...
01:16:54.000 More than a person could be able to do just by clicking.
01:16:58.000 Yeah.
01:16:58.000 It's a hard problem, right?
01:16:59.000 Because the tech is getting better at simulating the behavior of a human, and then simulating...
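[Editor's note: Facebook's actual fake-account classifiers aren't public. As a purely illustrative sketch of the kind of behavioral signal described above (paging through far more profiles per minute than a person plausibly could), here is the crudest version of a rate check; real systems would layer many such signals into learned models. The threshold and names are assumptions.]

```python
from collections import deque
import time

HUMAN_MAX_PROFILE_VIEWS_PER_MINUTE = 30   # assumed threshold, for illustration only

class ProfileViewMonitor:
    """Crude behavioral check: flag accounts that click through profiles
    faster than a human plausibly could."""

    def __init__(self):
        self.views = deque()   # timestamps of recent profile views

    def record_view(self, now=None) -> bool:
        """Record one profile view; return True if the account looks automated."""
        now = time.time() if now is None else now
        self.views.append(now)
        while self.views and now - self.views[0] > 60:   # keep a one-minute window
            self.views.popleft()
        return len(self.views) > HUMAN_MAX_PROFILE_VIEWS_PER_MINUTE
```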
01:17:04.000 There are people who are proposing that social media...
01:17:08.000 And the internet as a whole needs a rigorous identity layer.
01:17:11.000 I would say that's a requisite need for where we're going.
01:17:14.000 I was thinking that about Twitter a long time ago.
01:17:16.000 And we kind of have that with Facebook, but it's not rigorous, obviously.
01:17:22.000 Right.
01:17:23.000 But Twitter, you know, you can have jackmeoff69 and that's your Twitter handle with some weird Gmail account and then just post nonsense, you know?
01:17:32.000 And so there's no ability for justice in that system and accountability; there's also no ability for the user reading somebody else's thing to know who they are, right?
01:17:42.000 For the veterans group or whichever group to know, is this a Russian troll farm that is pretending to be a Christian or whatever?
01:17:48.000 And especially, is this even a human or is this an AI in the very near future?
01:18:06.000 Because if there's rigorous identity, who has access to that information?
01:18:09.000 And am I now centralizing data?
01:18:12.000 So how do you...
01:18:13.000 How do you become a whistleblower?
01:18:15.000 Right, which is huge.
01:18:16.000 Or how does something like Arab Spring happen?
01:18:19.000 Right.
01:18:20.000 How can you be safely anonymous?
01:18:21.000 There's a whole decentralized community.
01:18:24.000 There's a great movement called RadicalxChange, run by Glen Weyl, and they're trying to create part of this third attractor, this, like, what's the center of the bowling alley, a digital version of democracy.
01:18:32.000 What are decentralized ways of proof of personhood?
01:18:36.000 There's a project called IDENA. There's a bunch of things people can look up through RadicalxChange's work.
01:18:40.000 It's part of a whole movement of which Taiwan is included, which is that, I don't know if we really got to the Taiwan example, but- We didn't, I don't think.
01:18:47.000 Okay.
01:18:47.000 We showed John Cena instead.
01:18:49.000 Oh, that's right.
01:18:50.000 That's right.
01:18:51.000 People need to get it because it's an example of what is working.
01:18:54.000 It's a solution.
01:18:55.000 It's a direction of how we can go, which is, you know, you can only fit so many people into a town hall to deliberate, right?
01:19:02.000 So there's sort of a limit to our idea of democracy is kind of guided by ideas from 200 years ago.
01:19:07.000 They've created a system called POLIS, which is a way of gathering opinions about various ideas and then sort of seeking who wants more funding for various things, less funding for various things.
01:19:17.000 And whenever there's an unlikely agreement, so they sample a bunch of people, say, you sit over here, you sit over here, they get these clusters, like, these people kind of like this, these people kind of like these other things.
01:19:26.000 Whenever there's an unlikely agreement between those clusters, in other words, consensus, rough consensus, that's what they sort of boost to the top of that system.
01:19:34.000 So everyone's seeing areas of common ground, common ground, common ground, as opposed to fault line of society, fault line of society, be more angry, join the common thread, etc.
01:19:42.000 And then you're invited into a civic design process where you actually say, hey, I don't like the tax system.
01:19:48.000 And they're like, great, we're going to invite 30 of the people who were part of that rough consensus.
01:19:53.000 We're like, let's improve the tax system.
01:19:54.000 Let's talk about how we're going to do it.
01:19:56.000 And they do a combination of in-person stuff.
01:19:57.000 This is a little bit before COVID. And Zoom calls.
01:20:00.000 And then do like these mechanisms to kind of get an idea of where do people agree and then how do we make it better?
01:20:05.000 They've done this with air pollution.
01:20:06.000 They have a huge air pollution problem because of the lithography that they do with chips and things like this.
01:20:11.000 We're good to go.
01:20:30.000 And that was more accessible.
01:20:31.000 And there was a civic participatory process where more people could contribute and participate in identifying areas of inefficiency.
01:20:38.000 You could even imagine a place where citizens could get rewarded by saying, hey, this is inefficient.
01:20:43.000 We could do it better this way.
01:20:44.000 And if you identify places where it could get more efficient, you could get money or resources by making the whole system work better for everyone.
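[Editor's note: Polis publishes its method, but the sketch below is only a simplified caricature of the core idea as described in this exchange: group voters into opinion clusters, then surface the statements that do well across every cluster rather than the ones that split them. Cluster assignments are taken as given here; the real system derives them from voting patterns. All names are illustrative.]

```python
from collections import defaultdict

def rough_consensus(votes, min_cluster_agreement=0.6):
    """votes: list of (statement, cluster_id, agree: bool) tuples.

    Returns statements sorted by their *worst* cluster's agreement rate,
    so only ideas every opinion cluster supports float to the top --
    the opposite of boosting whatever divides the clusters most."""
    tally = defaultdict(lambda: defaultdict(lambda: [0, 0]))  # stmt -> cluster -> [agree, total]
    for statement, cluster, agree in votes:
        counts = tally[statement][cluster]
        counts[0] += int(agree)
        counts[1] += 1

    scored = []
    for statement, clusters in tally.items():
        rates = [a / t for a, t in clusters.values()]
        worst = min(rates)                      # agreement in the least-convinced cluster
        if worst >= min_cluster_agreement:
            scored.append((worst, statement))
    return [s for _, s in sorted(scored, reverse=True)]
```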
01:20:52.000 If you ran a current audit of the U.S. government through blockchain, you'd have a goddamn revolt.
01:20:57.000 They would go, holy shit, this whole thing is corrupt.
01:21:00.000 This is infested down to the roots.
01:21:04.000 And that would be a real problem.
01:21:06.000 And I think the Nancy Pelosi's of the world would really have a hard time with that.
01:21:11.000 I heard some clip that you did where you were talking about your pot thoughts of people being in big buildings and the pipes everywhere.
01:21:19.000 And just like how weird some aspects of civilization are.
01:21:23.000 So think about how weird democracy is.
01:21:25.000 Like as an idea, the idea that you can have some huge number of people who don't know each other, who all believe different stuff and want different stuff, figure out how to actually agree and work together as opposed to just tribalize against each other and do their own thing.
01:21:38.000 Like it's actually a wild idea.
01:21:40.000 To think that that would be possible at any scale, maybe a tiny scale.
01:21:43.000 Well, that's when it started.
01:21:44.000 It was a tiny scale.
01:21:45.000 Right.
01:21:45.000 And we've always had a scale issue, right?
01:21:47.000 In 1776, you could all go into the town hall and fit.
01:21:51.000 And so it wasn't just me hearing a proposition that a special interest group had created and getting a vote yes or no, which will inherently polarize the population, right?
01:21:59.000 Very few propositions get 99% of the vote.
01:22:02.000 They get 51% of the vote because they benefit something and they harm some other things.
01:22:06.000 And the people who care about what gets harmed are fighting against it.
01:22:08.000 That polarizes the population against each other.
01:22:11.000 Social media— This is a conversation about censorship or free speech.
01:22:16.000 And boom, you just split the population in half.
01:22:18.000 As opposed to, hey, we all agree we could do a little bit less virality.
01:22:21.000 We could stop the teenager use in these ways and we'd be better for everyone.
01:22:24.000 The proposition creation isn't designed to polarize the population.
01:22:27.000 It just does.
01:22:28.000 Because as soon as you get beyond the scale of we can all actually inform what a good proposition would be by being in the town hall and having a conversation.
01:22:36.000 Proposition like a suggestion.
01:22:36.000 Yeah.
01:22:36.000 Just define a proposition.
01:22:37.000 Not everybody knows what a proposition is.
01:22:38.000 Something you would vote on.
01:22:39.000 What's a good way to go forward?
01:22:41.000 Before we make a choice on what a good way to go forward is, we have to do some sense-making of what is even going on here.
01:22:47.000 Like what are the values?
01:22:48.000 What's going on here?
01:22:49.000 That was – so the point was a conversation.
01:22:51.000 That happened at a smaller scale.
01:22:53.000 Also, if you had a representative, the level of tech at the time was something that a very well-educated person could understand most of.
01:22:59.000 They could understand a lot of the tech landscape.
01:23:02.000 Obviously, we're in a situation now where the scale issue of democracy has been completely broken.
01:23:08.000 So almost nobody... we're supposed to have a government of, for, and by the people, but nobody really understands the energy grid issues or first-strike nuclear policy or monetary policy or anything like that.
01:23:18.000 And everyone's voice can't be heard, right?
01:23:21.000 Now what Taiwan was working on is, is the tech that is particularly in the West breaking democracy, could that same tech be employed differently to actually make 21st century democracy more real?
01:23:34.000 So the same AI that can mess up the information landscape for everyone, could we use that type of AI that understands language to be able to see what does everyone think and feel and actually be able to parse that into something we can understand?
01:23:45.000 So there's an online environment that says here's the distribution of people's values.
01:23:48.000 Here's the various values people care about.
01:23:50.000 Here's the emotions they have.
01:23:51.000 Here are the kind of facts about it.
01:23:53.000 And then is there a place where we can actually craft propositions together?
01:23:58.000 So there's a way to make it to be able to utilize these same tools To make democracy more realized, to make collective intelligence more realized.
01:24:08.000 But right now, as we were saying, autocracies are working on employing these tools.
01:24:13.000 Corporations are working on it, both of which are top-down.
01:24:16.000 Democracies really aren't.
01:24:18.000 Outside of Taiwan and Estonia and a few small examples.
01:24:21.000 What would be the incentive?
01:24:22.000 Who would be incentivized to use that other than the people?
01:24:26.000 And it's pretty clear that the people don't have control over Facebook, don't have control over Twitter, certainly don't have real control over the government.
01:24:33.000 You have control over elected officials who, it's almost universally agreed, will lie to you to get into office and then not do what they said they were going to do, which is the standard operational procedure.
01:24:44.000 So what's the incentive and how would these get implemented?
01:24:48.000 So again at a small scale, 1776, your representative couldn't lie all that much because everybody – they lived in the same town, right?
01:24:56.000 And you could all go see what was going on.
01:24:58.000 And so can we recreate things like as you were mentioning, people would freak out if they could actually see how government spending worked.
01:25:05.000 Can we create transparency at scale?
01:25:07.000 Can we – in a way that could create accountability at scale?
01:25:10.000 We could.
01:25:11.000 Could we have places where there's direct democracy and people can actually engage in the formation of what a good proposition is, not just voting yes or no on a proposition or yes or no on a – But can I stop you there?
01:25:22.000 How would – you say we could.
01:25:24.000 How would you do that?
01:25:25.000 How would you have that transparency and who would be incentivized to allow this transparency?
01:25:31.000 If the transparency has not existed up until now, why would they ever allow some sort of blockchain-type deep understanding of where everything's going?
01:25:43.000 I don't think they are incentivized, which is actually why this show is interesting.
01:25:47.000 Because if we're really talking about a government of, for, and by the people, where the consent of the governed is where the power of government comes from, ultimately... And the founding fathers said a lot of things about how the government will decay at a certain point,
01:26:03.000 particularly when people stop taking the responsibility to actively engage.
01:26:07.000 Right.
01:26:07.000 And so if tech has incentives to produce things that are catastrophically problematic for the world, then we need to regulate that somehow.
01:26:19.000 And the issues are too complex for individuals to understand.
01:26:21.000 So you need institutions.
01:26:22.000 But how do you make sure the institutions are trustworthy?
01:26:26.000 We have to create new 21st century institutions, but they have to arise of, for, and by the people, which means there's a cultural enlightenment that has to happen, right?
01:26:35.000 People actually taking responsibility to say we want – We want institutions we can trust and we want to engage in processes of recreating those.
01:26:44.000 How do you get people to be enthusiastic about some sort of a radical change like that other than some sort of catastrophic event like a 9-11?
01:26:53.000 Well, this is why we're talking about all the impending catastrophic events, is to say we don't want to wait until after they happen.
01:26:59.000 It would be, but it seems like that's the only way people really change the way they think about things.
01:27:05.000 It's something, some almost like cultural near-death experience has to take place.
01:27:09.000 Well, it's like the problem of humanity is paleolithic emotions, medieval institutions, and godlike tech.
01:27:15.000 One of the paleolithic emotions is it can't be real until, oh shit, it actually happened.
01:27:19.000 Right.
01:27:19.000 And so, like, but the test is, we are the one species who has the capacity to know this about ourselves, to know our Paleolithic emotions are limited in that way, and say, we're going to take the action, the leap of faith that we know we need to do.
01:27:33.000 Like, we're the only species that can do that.
01:27:34.000 If a lion was in this situation, or a gazelle, they can't, like, understand their own mind and realize they have the one marshmallow mind or the, you know, short-term bias or recency bias.
01:27:43.000 They're trapped inside of their meat suit.
01:27:45.000 This is a beautiful idealistic notion.
01:27:48.000 However, in real-world application, most people are just fucking lazy and they're not going to look into this and they're not going to follow through.
01:27:57.000 And this is why most people that really study tech-mediated catastrophic risk are not very optimistic.
01:28:04.000 And they think things like we have to chip human brains to be able to interface with the inevitable AIs or we have to have an AI overlord that runs everything because we're too irrational and nasty.
01:28:14.000 And the question is like there have always – there's always been a distribution of how rational people are and how kind of benevolent they are.
01:28:24.000 And – We have never with that distribution been very good stewards of our power.
01:28:30.000 We've always used our power for war and for environmental destruction and for kind of class subjugation.
01:28:35.000 But with the amount of power we have now, those issues become truly globally catastrophic.
01:28:39.000 And this is the thing is like – and this is what almost every ancient prophecy kind of speaks to.
01:28:45.000 As you get so much power that you can't keep being bad stewards of it, either the experiment self-terminates or you are forced to step up into being adequate stewards of it.
01:28:56.000 So the question is what would the wisdom to steward the power of exponential tech – what would the minimum required level be?
01:29:04.000 And that's like the experiment right now.
01:29:05.000 That's the opportunity for us.
01:29:08.000 The opportunity, but you're talking about a radical shift in human nature.
01:29:13.000 Well, it's a possibility.
01:29:14.000 In human conditioning.
01:29:16.000 Why don't you give some examples?
01:29:17.000 Okay.
01:29:18.000 So we can look at some cultures that have certain traits quite different than other cultures as a result of the conditioning of those cultures more than as a result of the genetics.
01:29:28.000 We can see that if you look at Jains… What are the Jainists?
01:29:32.000 The Jains are a religion that is highly emphasizing nonviolence, even more than the Buddhists.
01:29:36.000 Where are they?
01:29:37.000 … Asia.
01:29:56.000 They won't kill plants.
01:29:57.000 In lots of different environments, they mostly don't hurt anybody, including bugs, as a result of the way they condition people.
01:30:03.000 Education, conditioning, culture.
01:30:05.000 You can see that Jews have historically had a level of education that is higher than that of the embedding society around them, as a result of investing in that.
01:30:16.000 And so we're like, can cultures value certain things and invest in developing those in people?
01:30:23.000 It doesn't mean that everyone suddenly has the wisdom of gods to match the power of gods, but can we create a gradient?
01:30:29.000 This is where there used to be this concept of Bildung.
01:30:32.000 What's that?
01:30:32.000 I'm sorry, I'm hearing what you're saying, but I don't see it.
01:30:36.000 Yeah, I'm hearing what you're saying, like idealistically, yes, but I don't see the motivation for the shift.
01:30:43.000 I feel like this is – it's a big ask.
01:30:47.000 It's a big ask.
01:30:47.000 And a big ask has to come with some sort of a master plan to get people to shift their perspective.
01:30:54.000 Well, if you take a look at the attractor of catastrophes and the attractor of dystopias, those are the likely ones.
01:31:02.000 Right, but we don't see it.
01:31:05.000 Like, people don't give a shit until it's happening.
01:31:07.000 Which is why one of those two will probably happen.
01:31:09.000 Yeah.
01:31:10.000 Well, and with social media, they do see it.
01:31:12.000 I think there's a unique moment, and the reason I thought this would be an interesting conversation with the three of us, is that social media has become the case.
01:31:19.000 Like, we can now all see it.
01:31:20.000 We can now, I mean, it took, unfortunately for some people, seeing the receipts, which is what Frances has provided, to things that we all predicted back, you know, eight years ago. But now people understand that that is a consequence of unregulated exponential technologies that are steering people at scale,
01:31:36.000 making things go viral at scale and dangerous at scale.
01:31:39.000 So that's a case where we can now see that thing.
01:31:41.000 Can we leverage the understanding of that to realize what bigger thing needs to happen?
01:31:46.000 Before we get to the incentive, just imagine as a thought experiment for a minute that...
01:31:51.000 Facebook changed what it was optimizing for because Facebook is this 3 billion person AI optimized behavior machine, right?
01:31:59.000 Like that's a huge – it's not like normal companies and it's important to understand that.
01:32:03.000 And it's optimizing for engagement which usually ends up looking like outrage, desire, addiction, all those types of things.
01:32:09.000 But let's say that we – could we assess for whether people are being exposed to different ideas than the ones they're used to?
01:32:17.000 Are they actually uptaking those ideas?
01:32:19.000 Are people expressing ideas that have more nuance and complexity?
01:32:22.000 And you were actually upregulating for those things.
01:32:26.000 There's a lot of actually quite constructive content on the internet.
01:32:30.000 And imagine that you could actually personalize development and education.
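(To make the idea above concrete, here is a minimal, purely hypothetical sketch of the two ranking philosophies being contrasted – an engagement-only feed versus one that also upregulates nuance and exposure to unfamiliar viewpoints. The Post fields, scores, and weights are illustrative assumptions, not anything Facebook actually computes.)

```python
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    predicted_engagement: float  # expected clicks/reactions, 0..1 (assumed)
    nuance_score: float          # output of a hypothetical nuance classifier, 0..1
    viewpoint_novelty: float     # how far from the viewer's usual diet, 0..1

def rank_by_engagement(posts):
    # Status quo described above: whatever keeps people reacting floats to the top.
    return sorted(posts, key=lambda p: p.predicted_engagement, reverse=True)

def rank_for_bridging(posts, w_engage=0.2, w_nuance=0.4, w_novelty=0.4):
    # Alternative: blend engagement with nuance and exposure to unfamiliar views.
    def score(p):
        return (w_engage * p.predicted_engagement
                + w_nuance * p.nuance_score
                + w_novelty * p.viewpoint_novelty)
    return sorted(posts, key=score, reverse=True)

if __name__ == "__main__":
    feed = [
        Post("Outrage bait", 0.9, 0.1, 0.0),
        Post("Long-form piece from the 'other side'", 0.4, 0.8, 0.9),
        Post("Familiar meme", 0.7, 0.2, 0.1),
    ]
    print([p.text for p in rank_by_engagement(feed)])  # outrage bait ranks first
    print([p.text for p in rank_for_bridging(feed)])   # nuanced, unfamiliar content ranks first
```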
01:32:34.000 This is why, when Tristan was saying what China is doing – where the kids are seeing museums and science experiments and patriotism –
01:32:40.000 You're like, yeah, that actually kind of makes sense.
01:32:43.000 It makes sense but it only makes sense when you have an autocratic government that has complete control of the corporations and their motivations.
01:32:50.000 Like, if the corporation's motivations were specifically designed to rake in the most amount of profit, like Facebook's is, you'd never be able to trick them into doing that.
01:33:00.000 There's no way.
01:33:00.000 They'd be like, fuck you.
01:33:02.000 We're not gonna do it.
01:33:03.000 That infringes upon our rights as a business to maximize our profits.
01:33:07.000 We have an obligation to our stakeholders.
01:33:10.000 They would never do it.
01:33:11.000 And we can see how the government took major corporations that had such an… Yes.
01:33:34.000 That they can't be harming our citizens and harming our democracy.
01:33:38.000 We actually have to put some regulation not on who gets to speak but what gets radically upregulated.
01:33:44.000 But the problem is the way they would do it is the same way they do like the Build Back Better bill where it's 40,000 pages and no one can read the whole thing and inside of it there's a bunch of shit about how they can spy on your bank account.
01:33:56.000 And lock you down if you spend more than $600 and you have to go to a committee to decide whether or not you get your money back and make everybody scared and paranoid.
01:34:05.000 I mean, this is the kind of behavior that our government engages in on a regular basis.
01:34:09.000 This is not just a big ask for us to get people to be motivated to make this radical change, but it's a big ask to the government.
01:34:16.000 It's like, hey, you fucks have to start being honest now.
01:34:19.000 And that's not going to happen.
01:34:23.000 Yeah, it's a tricky proposition because the question is … It's not just tricky.
01:34:27.000 It changes the way the government has been treating human beings through every single day of our lifetime.
01:34:34.000 So do you trust Facebook to hold this power?
01:34:37.000 Do you trust the government to hold it?
01:34:39.000 No.
01:34:40.000 Do you trust individuals to be resilient against all of this power pointed at them that is so radically asymmetric?
01:34:45.000 More that.
01:34:47.000 More that.
01:34:48.000 More I trust people to wake up to the fact that you do have control over your news feed.
01:34:53.000 You don't have to look at it.
01:34:54.000 You do have control over what you share and retweet.
01:34:58.000 You should be more objective about the information that you consume.
01:35:02.000 You should try to find fact checkers that are independent and are unbiased and are not motivated by financial means.
01:35:11.000 There's fact checkers that are clearly connected to parties.
01:35:14.000 We know this.
01:35:15.000 So I could similarly argue you're trying to ask too much of human nature.
01:35:19.000 It's way easier than that.
01:35:21.000 Ask that of the government and ask that of corporations.
01:35:25.000 At least human beings don't – they have a personal understanding of the consequences of what they're doing and they don't have this diffusion of responsibility that both government and corporations have.
01:35:36.000 The thing about the diffusion of responsibility is one person in a corporation doesn't feel evil when the corporation dumps pollutants into some South American river.
01:35:46.000 But that is happening and it is a part of it.
01:35:50.000 But when an individual takes responsibility for their own actions, and if we can somehow or another coach or explain or educate individuals about their consumption and what kind of impact their consumption has on their psychology,
01:36:09.000 on their future, on the way they view and interface with the world.
01:36:12.000 That could change.
01:36:14.000 The reason why these algorithms are effective is because they play to a part of human nature that we don't necessarily have control over.
01:36:21.000 What we like to argue over.
01:36:23.000 What we like to engage with.
01:36:24.000 You know, I brought this up on your podcast before, but I'll bring it up again.
01:36:28.000 I have a good friend, Ari Shafir, and he ran an experiment on YouTube where he only looked up puppy videos.
01:36:35.000 And that's all it recommended to him.
01:36:37.000 The problem with the algorithm, except for what you were talking about before with the QAnon thing, that's fucked.
01:36:43.000 The problem with the algorithm on YouTube is it accentuates the things that people are actually interested in.
01:36:49.000 But when Facebook, those fucks, when they do something like that where someone just invites people into a group and you can mass invite, I'm assuming through some sort of a program, right?
01:36:59.000 They're not doing it individually, one by one.
01:37:02.000 So if some QAnon group mass invites a million people and then it's all of a sudden distributing disinformation to that million people, then you've got a problem with the company.
01:37:11.000 There's a problem with the way that company distributes information because you're not allowing people to make the decisions that they could make to clean up their algorithm, to clean up what they get influenced by, to clean up what their newsfeed looks like.
01:37:25.000 That's a problem.
01:37:26.000 That's a problem because it's not as simple as you're giving people choices.
01:37:31.000 This is what they choose.
01:37:32.000 No.
01:37:32.000 You're allowing someone to radicalize, like intentionally radicalize, people with disinformation, either willingly or unbeknownst to them.
01:37:44.000 Yeah, and we don't want the Nestle Coca-Cola vending machine in the preschools because do the kids actually have the ability to win the two marshmallow experiment in the presence of that much advertising?
01:37:55.000 Do they understand advertising?
01:37:57.000 Do they understand marketing?
01:37:58.000 We want to spot asymmetries of power.
01:38:00.000 And the challenge here is...
01:38:02.000 The asymmetry is I've got a trillion-dollar market cap company that has observed three billion people's behaviors and click patterns.
01:38:10.000 So I know more about people before they even know about themselves.
01:38:13.000 Yuval Harari gives this example.
01:38:14.000 His partner, Itzik, he's gay.
01:38:16.000 His partner, Itzik, makes two clicks on TikTok and it knows exactly what he wants, right?
01:38:21.000 When you have that degree of asymmetry and it's designed with that much power on one side of the table, I mean, a system is inhumane if the power is that asymmetric, right?
01:38:29.000 So if it's influencing me more than I understand my own brain, like a magician, we're not going to be able to get out of that.
01:38:36.000 Because if it's playing to my confirmation bias and I don't know that I have confirmation bias, I'm just run by confirmation bias, that's a form in which I'm essentially a foot soldier in someone else's culture war.
01:38:46.000 If it's playing to my social validation, I don't know that it's playing to my social validation, I don't even know I have a thing called social validation, that that's an exploitable part of me.
01:38:53.000 That's an asymmetric interaction.
01:38:55.000 So, I mean, you're right, by the way, as a part of an ecosystem of solutions, we do need a cultural, I mean, Daniel calls it a cultural enlightenment, but you can just simply say we need a mass education about how technology influences us that matches that asymmetry.
01:39:09.000 Everyone who uses social media deserves to know how it works.
01:39:11.000 And in the Carnegie Endowment, they did a sort of meta-analysis for the problem of misinformation.
01:39:16.000 If you look at, like, 100 organizations surveying how do we deal with this problem, like, I think, like, 98% of them said, like, the number two result was at least do digital literacy education for everyone, right?
01:39:27.000 Everyone should understand more about how this stuff works.
01:39:29.000 So, to your point about we should be educating everyone to be better stewards of their own attention, their own information consumption.
01:39:36.000 But when you have bad actors that are manipulating this stuff at scales and at levels that people don't understand, and we're about to have GPT-3 printing basically full research papers that justify everything you've ever believed, that's not going to be an adequate solution.
01:39:52.000 So we have to change something at the systemic level.
01:39:55.000 The question is just how do we get that to happen?
01:39:59.000 Yeah, no solutions, right?
01:40:02.000 Well, if we're willing to, I mean...
01:40:04.000 You know what I'm saying?
01:40:05.000 Like, we have these ideas of what needs to happen, but this is like, what we need to do is get everybody to stop eating processed food and exercise regularly and only drink water.
01:40:17.000 Well, you can get like 10 people to do that.
01:40:20.000 You can get highly motivated people to do that.
01:40:23.000 You can get really intelligent, really conscientious people that are considering the impact that their choices have on their future.
01:40:33.000 But that's not normal human nature.
01:40:36.000 And also you're dealing with the fact that most people are very unhappy with what they do for a living.
01:40:42.000 They're very unhappy with their lives, their personal lives.
01:40:46.000 There's like a good percentage of people that are not happy with most aspects of their existence.
01:40:53.000 Well, they seek distractions.
01:40:55.000 They seek distractions.
01:40:56.000 That might be the only comfort that they have, is arguing about global warming with people online.
01:41:02.000 Or, you know, arguing about Second Amendment rights.
01:41:06.000 Like, we've got to take into consideration the motivation for people to engage in these acts in the first place.
01:41:13.000 And a lot of it is just they're very, very unhappy with their existence.
01:41:17.000 So that's why they get trapped doing this.
01:41:19.000 When you talk to someone who is like, hey, I realize that I've got to get off of social media, I don't do anything anymore, I wake up in the morning, I have fresh squeezed vegetable juice, and then I go on a nice long hike, and those are rare fucking humans.
01:41:32.000 They do exist.
01:41:33.000 But the idea that we're going to change the course of the vast majority of our civilization and have most people behave in that manner is very unrealistic.
01:41:44.000 So this is why now add increasing technological automation and the radical technological unemployment to that.
01:41:52.000 Meaning automating more of our jobs, etc.
01:41:54.000 Right.
01:41:55.000 Is that good or bad?
01:41:56.000 Does that make people less happy or more happy?
01:41:58.000 Well, for the most part, it makes a radically unemployable underclass, a huge radically unemployable underclass, where at least in feudalism, you still needed the people to do labor.
01:42:07.000 Now you won't need the people to do labor.
01:42:10.000 So then this is why there are a number of people in the upper wealth class who believe in universal basic income because it's at least the cheapest way to deal with those people.
01:42:18.000 Now you add the metaverse to that and this is the entry into the matrix world, right?
01:42:23.000 Right.
01:42:23.000 Now this is where we have to get to because that's what I'm really worried about.
01:42:28.000 What I'm really worried about is the solution will be to disengage from the actual material world, and the solution will be to find yourself almost completely immersed and do whatever you can to just feed yourself and pay for the metaverse.
01:42:42.000 Or whatever it is.
01:42:43.000 Whether it's Zuckerberg's version of it, which by the way, I saw a fucking commercial, which is so strange.
01:42:48.000 It's a bunch of incredibly diverse, multiracial kids.
01:42:51.000 And they're sitting around bobbing their heads to a fucking tiger that's talking to them and dancing.
01:42:58.000 Have you seen that?
01:42:59.000 I haven't seen it.
01:43:00.000 Please find that.
01:43:01.000 Because it's like, what are you selling?
01:43:03.000 The fuck are you selling?
01:43:04.000 It doesn't even show what you're selling.
01:43:07.000 It's like this weird change from Facebook to Meta, right?
01:43:12.000 And so it's showing this ad.
01:43:13.000 It's very attractive.
01:43:14.000 It's interesting.
01:43:15.000 You see all these people.
01:43:16.000 They look cool.
01:43:17.000 And they're all bobbing their head.
01:43:18.000 And then, like, the tiger's talking to them, telling them anything is possible.
01:43:21.000 You're like, oh, cool.
01:43:22.000 Anything's possible.
01:43:24.000 But you're watching.
01:43:25.000 You're like, what are you saying?
01:43:26.000 Like, I don't even know what you're saying.
01:43:27.000 Like, what is this?
01:43:28.000 Like, watch this.
01:43:29.000 Because it's so fucking weird.
01:43:31.000 It's not loading.
01:43:32.000 Of course not.
01:43:33.000 Too many people are connected to the metaverse.
01:43:35.000 It's failing.
01:43:38.000 So it's the same thing, but different message.
01:43:41.000 I thought when the audio started that was weird.
01:43:42.000 Go ahead.
01:43:45.000 But see, the same thing.
01:43:46.000 It's like cultivated, multi-racial, multi-ethnic groups.
01:43:58.000 So the toucans are dancing.
01:44:01.000 Pelicans are dancing.
01:44:02.000 The tiger lets go of the buffalo.
01:44:06.000 Look, everybody's bobbing their head.
01:44:08.000 No one's going, what the fuck is going on?
01:44:18.000 Big, big…
01:44:27.000 What is this?
01:44:28.000 This is going to be fun.
01:44:29.000 No, it's not.
01:44:30.000 We're fucked.
01:44:31.000 This is not going to be fun.
01:44:33.000 This is a trap.
01:44:34.000 This is a trap.
01:44:35.000 They're going to lure you into this and you're not going to give a shit about your regular life anymore because it's going to be so much more exciting.
01:44:40.000 And next thing you know, they're going to say, listen, they'd be much more involving if you put a gasket in the back of your head and they could just connect you straight to a pipe.
01:44:50.000 And then you're in the Matrix.
01:44:52.000 So, a few things to say.
01:44:54.000 Um...
01:44:56.000 The competition for attention and the attention economy was always about the metaverse.
01:45:01.000 We've been living in the metaverse for a long time because it's about how do you capture and own and control people's personal reality.
01:45:08.000 That's what Instagram, Facebook, TikTok, YouTube, the whole thing, that's what these things are.
01:45:13.000 One of the things that this makes me think about, that's subtle actually, I know you've had Jonathan Haidt on the show talking about teenage mental health problems.
01:45:20.000 When you look at when self-harm and depression starts ticking up in the graph for the kids, 13 to 17-year-olds, there's a subtle point in a specific year period where that ticks up.
01:45:31.000 And you know what that year was?
01:45:32.000 It's like 2009 to 2011. What changed in that period?
01:45:35.000 The iPhone.
01:45:36.000 The iPhone.
01:45:37.000 And then social media.
01:45:37.000 We had social media before that.
01:45:39.000 Right, but we didn't.
01:45:40.000 What changed is when it went on mobile.
01:45:42.000 Right.
01:45:42.000 Now what changes when it's on mobile?
01:45:45.000 You have it all the time.
01:45:46.000 You have it all the time.
01:45:48.000 It becomes your new 24-7 metaverse.
01:45:51.000 I would say that it's when you virtualize people's experience so fully and that virtual world doesn't care about protecting and nurturing the real world.
01:46:00.000 When you virtualize people's relationships in a way that makes them not care about nurturing their offline relationships.
01:46:23.000 That's very similar to a virtual reality that's not protecting the social fabric that it depends on.
01:46:33.000 It depends on that thing. I don't know if you want to say anything to that.
01:46:37.000 Yeah.
01:46:38.000 So why did Facebook buy Instagram and WhatsApp and do the various things they did? Because a monopoly of attention is the play.
01:46:47.000 And a monopoly of attention is a really big deal to be able to get.
01:46:50.000 But as soon as new devices come out, you're going to get attention in different ways.
01:46:53.000 So AR and VR as new platforms, obviously, you've got to lead the way in having the monopoly of attention there and increasingly hypernormal stimuli.
01:47:02.000 And the cell phone took us up to something like 50% screen time from say 25% screen time on the laptop.
01:47:08.000 The AR can take us up to like approaching 100% screen time or engagement time.
01:47:14.000 And then persistent tracking of reality across all those domains.
01:47:17.000 So we can see why this is super problematic and pernicious.
01:47:27.000 That was just speaking to how the metaverse is a natural extension of what they've already been doing.
01:47:34.000 Right.
01:47:34.000 Where's the middle lane?
01:47:37.000 We've got the gutters on each side.
01:47:40.000 What's the middle lane?
01:47:42.000 There is...
01:47:43.000 Oh, I remember.
01:47:44.000 What Tristan was just saying about if the...
01:47:48.000 Daniel, get on that microphone.
01:47:49.000 If I have virtual relationships online, but they're actually debasing the integrity of my...
01:47:56.000 in-person relationships – so when we're talking, we're actually looking at our phones.
01:48:00.000 We would say from the Center for Humane Technology kind of perspective, what is humane tech?
01:48:04.000 One of the definitions would have to be – and he was mentioning this earlier – that tech plus democracy makes a better democracy.
01:48:11.000 Similarly, if you want to think about what does humane tech mean, tech plus any of the foundations of what it means to live a meaningful human life and a sustainable civilization, tech has to make those things better.
01:48:21.000 So tech plus families has to actually increase the integrity of families.
01:48:24.000 Otherwise, it's fundamentally not humane.
01:48:26.000 It's misaligned with what is foundational to being human.
01:48:29.000 Tech plus democracy has to make better democracy.
01:48:32.000 Tech plus individual human mental well-being.
01:48:34.000 Right, but it's not, right?
01:48:36.000 Tech plus democracy is...
01:48:38.000 Debasing it fundamentally.
01:48:40.000 Currently.
01:48:41.000 But there are ways of actually, first of all, aligning and choosing your business models to be in alignment with that thing.
01:48:48.000 So, I mean, not to give Apple too much praise, but when it says, hey, you know, we're going to...
01:48:56.000 And they're choosing to go into health because they could just say, hey, we're going to build our own maximize engagement machine metaverse thing.
01:49:02.000 I'm sure they're working on one, but they're choosing business models.
01:49:05.000 Their business model isn't maximizing attention.
01:49:07.000 That's why when you use FaceTime, it doesn't have like, here's comments, here's notifications, here's the hearts and likes and thumbs up floating across the screen as you're using FaceTime because you're the customer, not the product.
01:49:17.000 Well, Apple's a fantastic example of what is possible when a company does have the most superior product in the market, right?
01:49:28.000 It's kind of widely acknowledged that when it comes to the phones, when it comes to the operating system that exists in the phones, and when it comes to the operating system that exists on the computers, and then the fact that Apple controls all of the hardware.
01:49:41.000 So the problem that Windows has is you've got Lenovo and Dell and Razer, all these different companies.
01:49:49.000 They're all making different hardware, and then you have the operating system that engages with that hardware, but there's all these different drivers, you got different video cards, you have different...
01:49:57.000 There's so many different things that it's very difficult for them to make this one perfect experience.
01:50:04.000 Whereas Apple's like, you know what?
01:50:05.000 What we're going to do is we're going to control all the hardware.
01:50:08.000 So they make the best laptop they can possibly make, they make the best phone they could possibly make, and they've done such a good job with it that they have this massive, loyal fan base.
01:50:18.000 And then through that, they decided, you know what?
01:50:21.000 We're going to give you the option to not have advertisers track you and apps track you everywhere.
01:50:26.000 That is a wonderful solution.
01:50:28.000 And when Tim Cook announced that, he said, we cannot let a social dilemma become a social catastrophe.
01:50:34.000 They're going after the social media business model of surveillance advertising, and that's one of the steps.
01:50:39.000 And that's a good example.
01:50:40.000 Maybe they can do something with the social media.
01:50:42.000 Maybe Apple can use the same idea that they have and the same ethics.
01:50:55.000 Yes.
01:51:10.000 You know, we could just take it off the shelf.
01:51:12.000 We could say those things don't exist on our shelf.
01:51:14.000 Here's a crazy move for you.
01:51:15.000 It's a good time to do that now.
01:51:16.000 They could do that.
01:51:17.000 Now, here's the thing.
01:51:17.000 If they did that, people would be cynical.
01:51:20.000 They would say, wait, hold on.
01:51:21.000 Apple's doing that only so they can basically keep a bigger monopoly on their app store.
01:51:25.000 Notice there's this whole lawsuit right now with Epic Games.
01:52:27.000 And Facebook is trying to dial that up because they don't want Apple to be this big top-down actor taking control with their App Store.
01:51:33.000 But these social media apps are free apps.
01:51:35.000 Yeah.
01:51:35.000 If they decided to say, listen, we think there's a real problem with these apps, so we're not going to make them available.
01:51:41.000 They could simply do that.
01:51:42.000 They could simply do that and say, we're going to have something that's available that we don't have any kind of control over what your feed is.
01:51:49.000 Right.
01:51:49.000 And we were just talking about this last night.
01:51:54.000 One of the things we talk about is that there's always a cynical take when someone takes an action, and there's an optimistic good faith take.
01:52:04.000 The cynical take on Frances is that she's a whistleblower who's a secret operative, da-da-da-da-da.
01:52:04.000 The cynical take on the government wanting to regulate social media is it's just because they want to take control, or if the media is ever criticizing social media, it's just because the media is upset that they don't have a monopoly on truth anymore.
01:52:14.000 There's a partial truth in each of these things.
01:52:16.000 If Apple takes a strong move against these social media companies… And the privacy thing that you just mentioned – they're now protecting people's privacy.
01:52:25.000 They prevent cross-app tracking.
01:52:26.000 There's an article that they make an extra billion per year out of that change.
01:52:31.000 They make a billion per year.
01:52:32.000 So the cynical person says, oh, they're just doing that so they can get more money for them.
01:52:37.000 How do they make an extra billion per year?
01:52:39.000 Because somehow the extra advertising goes through their network or something like that because you're not using cross-app tracking through the other companies.
01:52:45.000 Somehow people start spending more money on their system.
01:52:48.000 So now there's a cynical take there.
01:52:49.000 But here's the move.
01:52:51.000 If they wanted to prove that they're actually a good faith actor, this is your idea last night, they could take the billion dollars or even just a large chunk of it that's not legal fees and say, we're going to spend that billion dollars on solving the rest of the social dilemma.
01:53:04.000 We're going to fund nonprofits that are doing digital literacy education.
01:53:07.000 We're going to put $500 million into nonprofits that are doing digital literacy education and other sort of humane tech.
01:53:14.000 We're also going to put another $500 million into R&D to do more features that help address the social dilemma and actually move our whole product ecosystem further in the direction of protecting society and not manipulating society.
01:53:26.000 That might be the only solution.
01:53:28.000 If a company that's as massive as Apple, that has so much capital, I mean, they are literally one of the most wealthy corporations that's ever existed.
01:53:37.000 But we would need a massive...
01:53:37.000 If not the, right?
01:53:38.000 Yeah.
01:53:39.000 I think they're...
01:53:40.000 Up there with Saudi Aramco, they may be the most, you know, they keep going up and down.
01:53:44.000 But they would need a public will and support base of people.
01:53:47.000 And that's why your audience is really interesting also.
01:53:50.000 Because, like, this is going to take, as Daniel said, this is a we the people type moment.
01:53:53.000 Like, we have to actually ask, what is the best way out of this thing?
01:53:56.000 There isn't an easy answer, right?
01:53:57.000 It's not like, hey, we're going to just tell you, just do X and it's just all over.
01:54:01.000 We fix it all.
01:54:01.000 We have to navigate through this thing.
01:54:03.000 So we have to find levers that are at least a little bit more attractive than other levers.
01:54:06.000 This is one of them.
01:54:08.000 Taiwan is another one.
01:54:09.000 It's an example that works.
01:54:11.000 Biden could invite Audrey Tang to come to the United States and actually say, we're going to build a civic tech ecosystem.
01:54:16.000 The decentralized web community that's building these Ethereum-based, like new Web3 things, could actually say, we're going to take the central design imperative.
01:54:23.000 We're going to do digital democracy that helps us do the bowling alley and get that thin tightrope that we've got to walk.
01:54:29.000 These are the kinds of things that could happen, but we would need there to be a public zeitgeist that this has to happen.
01:54:35.000 And I know it sounds dystopian if we don't do that.
01:54:37.000 It's not an easy problem.
01:54:39.000 It's not an easy problem, but one thing we can show is that people are happier and more successful if they follow this path than the path of wanton destruction.
01:54:48.000 Because we know that about alcoholics and gambling addicts, right?
01:54:52.000 If you have an uncle that is an alcoholic and you see him, you're like, wow, I don't want to be like that guy.
01:54:56.000 You learn.
01:54:57.000 If you see someone just ruin their life with gambling, you go, wow, that's scary.
01:55:01.000 I know a lot of people that are ruining their lives with social media.
01:55:04.000 I know people that it's radically exacerbated their mental health problems.
01:55:08.000 And I personally have had a great increase in my peace of mind by never engaging with people online.
01:55:18.000 I know you told me that.
01:55:18.000 Don't look at the YouTube comments.
01:55:20.000 I don't look at any comments.
01:55:21.000 And I really don't.
01:55:23.000 And it's so much healthier for you.
01:55:24.000 Oh my god, I'm so much happier.
01:55:26.000 I've told that to friends, and occasionally they dip their toes back in the water, and then they go, fuck, why did I do that?
01:55:32.000 And they'll do an episode of maybe they don't like something that they said, and then they go read, and I'm like, my god, man, get out of there.
01:55:39.000 And I don't engage on Twitter, I don't engage in the comments of Instagram, or I don't even look at Facebook.
01:55:46.000 And because of that, what I take in is my choice.
01:55:50.000 Like, I look at things that I'm interested in.
01:55:53.000 And most of my social media, it's not really social media consumption, but most of it is YouTube.
01:55:59.000 And most of it is like educational stuff or complete distractions.
01:56:03.000 What was the Thanksgiving study?
01:56:05.000 I was going to say, I was just thinking the same thing.
01:56:08.000 There's a Thanksgiving study that after 2016, the more per...
01:56:12.000 They looked at zip codes that had the most media advertising, political advertising.
01:56:17.000 And the more of that media you had, the shorter the Thanksgiving dinners were.
01:56:22.000 They did this mass study looking at tracking people's locations and how long they were in their Thanksgiving dinner location.
01:56:27.000 And basically the places that were most bombarded with polarizing media, Thanksgiving dinner was shorter.
01:56:33.000 Because they argued?
01:56:34.000 Yeah, and people I think stood further apart or something like this.
01:56:37.000 It actually had the geolocation on their phones too, right?
01:56:39.000 The people who had right versus left views interacted less at dinner.
01:56:43.000 Exactly.
01:56:43.000 That was what it was.
01:56:44.000 People with right versus left views interacted less at dinner.
01:56:46.000 And we're about to head into Thanksgiving.
01:56:48.000 And I actually would say that Facebook and Twitter, their business model has been ruining Thanksgiving dinner because their business model is personalizing confirmation bias for everyone so that when you show up...
01:56:59.000 So in the same way that that's an epitome of the problem, that's your personal version of the social dilemma, we could also say, what would be the first step for each person listening to this that we can do during Thanksgiving dinner that's putting our phones at the door and actually trying to have a conversation about the mind warp that's taking place?
01:57:16.000 It's hard because when people get together and they haven't seen each other for a while, they want to argue about things that they feel the other person's wrong about.
01:57:24.000 Because they've got so much of their time invested in these echo chambers.
01:57:30.000 But you just mentioned something that was so interesting, which was if people started to understand that the echo chamber was affecting them and affecting the integrity of their family.
01:57:38.000 So rather than try to...
01:57:49.000 Yeah.
01:57:50.000 Yeah.
01:57:51.000 Yeah.
01:58:02.000 And it's got to be with your own health.
01:58:05.000 It's got to be with relationships.
01:58:06.000 It's got to be with honesty.
01:58:08.000 There's got to be a lot of things that you do that you change.
01:58:12.000 If we can influence people in any way that's positive, it's to understand where the pitfalls are.
01:58:20.000 Where's the traps?
01:58:21.000 There's a lot of them out there.
01:58:22.000 Now, when we think about the social media issue to a degree, we can take the solution that you propose and just say maybe the individual can just remove themselves from it.
01:58:30.000 We would argue that this is actually...
01:58:32.000 It's impossible population-wide currently because there are companies that just can't succeed if they don't market on there compared to their competitors.
01:58:40.000 I'm not saying remove yourself from it.
01:58:42.000 That's not what I said.
01:58:43.000 What I said is don't engage in anything personal.
01:58:48.000 You can read people's thoughts on things.
01:58:50.000 You can go and watch a YouTube video.
01:58:52.000 You can stare at people's butts on Instagram.
01:58:55.000 But if you get involved in engagement, that's when things get fucked up.
01:59:00.000 The problem is, that is the only form of self-expression that a lot of people have when you deal with...
01:59:06.000 If you're talking about something that people think it's a critical issue, how do you express yourself?
01:59:12.000 How do you get your point of view if you think your point of view is significant?
01:59:15.000 How do you get it across?
01:59:16.000 Well, you have to engage.
01:59:17.000 That's a problem.
01:59:18.000 One thing I wanted to share, we interviewed Dan Vallone from an organization called More In Common on our podcast, and he does this work on what he calls the perception gap.
01:59:27.000 What they found in their work is the more time someone spends on social media, the more likely they are to actually misperceive what the other tribes believe.
01:59:37.000 So first of all, we get hit by a double whammy because you're talking about participation on social media.
01:59:42.000 And you could sit there looking at stuff but not participating.
01:59:45.000 Well, it turns out the people who participate the most, the extreme voices, participate more often than the moderate voices.
01:59:51.000 That's what they find in their work.
01:59:52.000 And when they participate, they share more extreme views, so their stuff goes more viral.
01:59:57.000 So we're looking at this weird funhouse mirror when we think, like, oh, we're getting a sample of what everybody believes.
02:00:01.000 We're not getting a sample of what everybody believes.
02:00:03.000 We're getting a sample of what the most extreme people believe.
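(A toy, purely illustrative simulation of the funhouse-mirror effect just described: if a small extreme minority posts more often and gets amplified more, the feed overrepresents them even when most people are moderate. All numbers below are made-up assumptions, not figures from More In Common's research.)

```python
import random

random.seed(42)

EXTREME_SHARE = 0.10   # assume only 10% of the population holds extreme views
POSTS_MODERATE = 1     # assumed posts per day for a moderate user
POSTS_EXTREME = 5      # assumed posts per day for an extreme user
VIRALITY_MODERATE = 1  # assumed relative chance a moderate post gets amplified
VIRALITY_EXTREME = 3   # assumed relative chance an extreme post gets amplified

def feed_sample(n_items=100_000):
    """Draw feed items with probability proportional to posting rate x virality."""
    extreme_weight = EXTREME_SHARE * POSTS_EXTREME * VIRALITY_EXTREME
    moderate_weight = (1 - EXTREME_SHARE) * POSTS_MODERATE * VIRALITY_MODERATE
    p_extreme_item = extreme_weight / (extreme_weight + moderate_weight)
    hits = sum(random.random() < p_extreme_item for _ in range(n_items))
    return hits / n_items

if __name__ == "__main__":
    print(f"Share of population with extreme views: {EXTREME_SHARE:.0%}")
    print(f"Share of feed items from extreme voices: {feed_sample():.0%}")
    # With these assumptions the feed is roughly 60% extreme content,
    # even though only 10% of people hold extreme views.
```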
02:00:06.000 So if you actually ask in their research – like, ask Democrats to estimate what percentage of Republicans believe racism is still a problem in the U.S. – I think they estimate like 40% or something like that, and the answer is closer to like 65% or 70%.
02:00:22.000 So we are misperceiving because we're seeing through the stereotypes and the straw men and the bad faith examples of everyone.
02:00:29.000 So part of this mind warp is we have to actually, again, understand that we're seeing a very specific view.
02:00:39.000 If, like, 50% of people on Facebook stopped participating per what you just said earlier, the problem is that the small remaining group, the most extreme voices there, would be identified by the algorithm and it would just maximally upregulate them.
02:00:51.000 So we just have to realize what game we're in, what unfair fight we're in, so that we can unplug ourselves from the matrix.
02:00:57.000 You called me Morpheus last time I think I was here.
02:01:00.000 Well, there's also a problem with tribal identity.
02:01:03.000 And it's fucking silly that we only have two groups in this country.
02:01:08.000 And because of the fact that we really have broken it down to two political groups, we are so polarized.
02:01:16.000 We don't have this broad spectrum of choices that we can, you know, well, I like a little bit of this, I like a little bit of that, and to be in the center is to be a fence-sitter and to inspire the ire of both sides.
02:01:32.000 Most people are in the center.
02:01:33.000 That's why the show works.
02:01:34.000 Exactly.
02:01:35.000 But most people don't think that because when they look on social media, they just see people at the extremes and they're like, am I going crazy?
02:01:40.000 Has the world gone crazy?
02:01:41.000 And the answer is, you're not wrong.
02:01:43.000 Your mammalian instincts that things are upside down, that's not wrong.
02:01:47.000 But it's not because of some master global conspiracy; it's because social media is just showing us the craziest of the craziest voices on all sides.
02:01:52.000 I just realized how much you look like Terence McKenna.
02:01:55.000 Look at that.
02:01:56.000 For real.
02:01:56.000 It's kind of creepy.
02:01:57.000 We both got the white and beard thing going on.
02:01:59.000 Do you have the glasses?
02:02:00.000 Yeah, it's true.
02:02:01.000 For real.
02:02:02.000 It's a real problem.
02:02:03.000 You're like Terence if he was a little more buff.
02:02:05.000 Sorry.
02:02:05.000 Go ahead.
02:02:06.000 So let's say that we could have a bunch of people get off social media.
02:02:10.000 Yes.
02:02:10.000 That's one of the exponential tech risks that we've talked about.
02:02:13.000 But that doesn't actually do much about the fragility of decentralized drones.
02:02:18.000 It doesn't do much about the fragility of decentralized cyber weapons.
02:02:21.000 Oh, you're a bummer, man.
02:02:23.000 You had a little bit of a solution.
02:02:25.000 Debbie Downer over here.
02:02:27.000 Well, no.
02:02:27.000 The reason I'm bringing it up is because an individualistic-only answer doesn't work when other individuals and other small and large groups have the capacity to affect everything so significantly.
02:02:39.000 Right.
02:02:39.000 But it does significantly impact the health of the overall population.
02:02:43.000 If we're more healthy mentally and physically, we can make better choices.
02:02:48.000 But the next step is not just that we make better individual choices but that those who can work to make new better systems.
02:02:53.000 Yes.
02:02:53.000 And so when you think about the founders of this country, they didn't just remove themselves from believing in whatever the dominant thought of the British Empire was at the time.
02:03:01.000 They removed themselves from that and then said, we actually need to build a more perfect union and they invested themselves radically to do so.
02:03:07.000 And it wasn't a huge percentage of the population but it was working to build something that could apply to a much larger percentage of the population.
02:03:14.000 So we need some sort of a radical solution in terms of the way we interface with each other, the way we do business, the way we govern, the way we do everything.
02:03:24.000 Yes.
02:03:25.000 And so let's say you have people who start pulling themselves off social media and saying, I actually want to engage with other people where I really seek to understand their ideas.
02:03:32.000 Before I just jump and criticize, I want to make sure I get their values and what it's like to be them.
02:03:36.000 And so they first, they remove themselves from the toxicity.
02:03:40.000 Second, they work to actually start making sense of the world better and being in better relationship with each other.
02:03:44.000 Next, they say, I want to make a platform that facilitates this for other people.
02:03:48.000 Yeah.
02:03:49.000 And then I want to come on Joe's podcast and talk about the platform and get a lot of people on there so we start to actually get the beginning of a new attractor, a new possibility.
02:03:57.000 Don't you put that out there.
02:03:59.000 Don't you do it.
02:04:00.000 Because then there's a lot of people that think they have the solution.
02:04:04.000 What this sounds like is kind of a radical...
02:04:07.000 You know, reboot of the US, but there's the January 6th version of that, which we don't want.
02:04:11.000 Right.
02:04:12.000 There is a different version.
02:04:13.000 I want to tell you, in the Taiwan example, the way that that happened is actually that a bunch of activists stormed the parliament, except they didn't try to break the glass and the windows and break through everything.
02:04:22.000 They sat outside the parliament.
02:04:24.000 They brought in all these Ethernet cables and they set up a Wi-Fi network and they had a bunch of hackers build this alternative civic engagement platform where people could debate ideas right there using technology.
02:04:35.000 So they did storm the parliament, but they didn't storm it to hurt people.
02:04:41.000 They did it to create the better form of government.
02:04:43.000 But to debate ideas in a way where, when unlikely consensus is found, that's what gets upregulated.
02:04:49.000 So they were designing that the better angels of our nature are appealed to rather than the lower angels of our nature.
02:04:54.000 And it's possible to do that.
02:04:55.000 That's a real working example.
02:04:56.000 I want people to really check that out.
02:04:57.000 It's a real thing.
02:04:58.000 We're not just, you know, pointing at a random idea.
02:05:02.000 People do say it's obviously a much smaller country.
02:05:04.000 It's not as homogenous as people think.
02:05:06.000 They think Taiwan, they think everyone's the same.
02:05:07.000 There's 20, I think, indigenous cultures or languages there.
02:05:10.000 So they actually have quite a lot of plurality.
02:05:13.000 Democracy has plurality, deliberation, and compromise.
02:05:16.000 You have to have those three things work.
02:05:18.000 Are you aware of the agent provocateur aspect of January 6th?
02:05:23.000 Same word.
02:05:25.000 I don't exactly know what the reality is, but what people are insinuating is that there were federal agents that were involved in instigating the violence.
02:05:39.000 Instigating the entering into the Capitol and that there's this one guy in specific that they've got him isolated on video.
02:05:46.000 They've shown him over and over again.
02:05:48.000 He's faced no legal consequences.
02:05:51.000 They know this guy's name.
02:05:53.000 They know exactly who he is.
02:05:54.000 All these other guys are in jail.
02:05:56.000 All these other guys who got into the Capitol, I mean, so many of them are facing like these massive federal charges and four years plus in jail.
02:06:05.000 This one guy is like, we have to go in there.
02:06:08.000 We have to take back.
02:06:09.000 We have to get inside there.
02:06:11.000 And people start calling him a fed in one of these videos.
02:06:14.000 And I think he takes off and runs away.
02:06:16.000 But this is what it seems like.
02:06:18.000 It seems like – and this is something that governments have done forever.
02:06:22.000 You take a peaceful protest.
02:06:25.000 What's the best way to break up a peaceful protest?
02:06:27.000 You bring in agent provocateurs to turn it into a non-peaceful, a violent protest, smash windows, light things on fire, then you can send in the troops and you can clean up the mess and then you don't have any protest anymore.
02:06:40.000 This was the World Trade Organization in, what was it, in Seattle in 99 or whatever it was?
02:06:46.000 That's what they did.
02:06:46.000 It's been documented that that is what happened.
02:06:49.000 I mean, literal government agents went in wearing Antifa outfits, and this is pre-Antifa, right?
02:06:56.000 Smashing windows, lighting things on fire, and they were all eventually released, conveniently.
02:07:02.000 Well, this guy, do you know about this, Jamie?
02:07:04.000 See if you can find it.
02:07:06.000 Because it's a curious case of this one particular individual who's like yelling in these various groups that we have to get in there, and like he did it pre-January 6th, he did it during the January 6th thing, and then these guys face no legal charges whatsoever.
02:07:23.000 And people are like, well, what the fuck is going on here?
02:07:26.000 Because when you see some kind of organized debacle like that, and then you see people insisting that we have to take this further and we have to go inside, and then if you find out that those people are actually federal agents that are doing that,
02:07:42.000 you're like, well, what is happening here?
02:07:43.000 And how is that possible?
02:07:45.000 And how is this legal?
02:07:46.000 That's a problem.
02:07:48.000 Yeah.
02:07:49.000 I haven't seen this one.
02:07:50.000 I remember the umbrella man who was breaking windows at the George Floyd riots.
02:07:56.000 I think they found out that that guy was a cop and that I think that was like a rogue human.
02:08:03.000 But I'm not sure if that's true.
02:08:04.000 So this is where it's interesting in this case.
02:08:07.000 I don't know the case at all.
02:08:08.000 Is it that somebody in government actually initiated him doing it as an agent provocateur to shut down the protest, or was he someone who happened to be in government, who had himself been radicalized, and who, acting on his own because of that radicalization, did the thing?
02:08:22.000 Or is he an agent provocateur but he's doing so independently just because he's a fucking psycho?
02:08:28.000 Some firemen start fires.
02:08:30.000 Right, but notice that whichever view you have, you probably had a motivated interest to see it that way, right?
02:08:36.000 Yeah, I didn't have any view on it.
02:08:38.000 That's the thing.
02:08:38.000 I'm looking at it like this, like, what is this video?
02:08:41.000 I'm watching this guy, like this one big, beefy-looking federal agent guy, telling them they gotta go inside, and I think he was wearing a MAGA hat.
02:08:49.000 And, you know, he's like a guy in his 50s, and he's like, I'll tell you what we gotta do.
02:08:53.000 We gotta get inside there.
02:08:54.000 We gotta go inside the Capitol.
02:08:56.000 And these people are like, inside?
02:08:58.000 Isn't that illegal?
02:08:58.000 Like, what the fuck?
02:08:59.000 This guy's taking it to the next level.
02:09:01.000 But he's doing it, like, multiple times.
02:09:06.000 There is a real problem with intelligence agencies doing that kind of shit.
02:09:11.000 Totally.
02:09:11.000 Because they do do it.
02:09:12.000 And I think they do it thinking that, look, these group of fucking psychos, we've got to stop this from escalating, so here's the way.
02:09:20.000 We get them to do something really stupid, then we can put fences up and create a green zone, and then we lock this down.
02:09:27.000 Meet Ray Epps.
02:09:29.000 Meet Ray Epps, the Fed-protected provocateur who appears to have led the very first January 6th attack on the U.S. Capitol.
02:09:41.000 So let's watch some of this, because it's fucking crazy.
02:09:43.000 It's really weird.
02:09:45.000 This guy is doing this over and over and over again.
02:09:49.000 There's a video of it, but this is an article about...
02:09:52.000 Oh, so this is an article that's in Revolver.
02:09:55.000 Hold on, I'll get the video.
02:09:56.000 We'll find the video, because the video is fucking strange.
02:10:00.000 Ray Epps video.
02:10:03.000 Here it is.
02:10:03.000 Well, that's 20 minutes long.
02:10:05.000 Well, just watch.
02:10:06.000 We'll see some of it.
02:10:09.000 Oh, these are guys that are watching it.
02:10:12.000 What about that one?
02:10:14.000 It goes to a website.
02:10:17.000 These are on Twitter.
02:10:20.000 Arrest Ray Epps.
02:10:23.000 Some people are hip to it.
02:10:24.000 But most people, including you guys, have no idea that this is a person, right?
02:10:29.000 You've never heard of this before.
02:10:31.000 I don't know why it's not playing a video.
02:10:34.000 Oh, these fucks with the clicks.
02:10:36.000 Oh, my God.
02:10:39.000 Please log in.
02:10:40.000 Log in.
02:10:41.000 I want you to log in.
02:10:42.000 We need to track you.
02:10:44.000 God.
02:10:45.000 Fucking Twitter.
02:10:46.000 One of the things that was so cool about the C-SPAN was the idea of being able to actually see what was happening inside of proceedings.
02:10:54.000 And we know that the idea of a modern liberal democracy is that we want to leave markets to do most of the innovation and provisioning of resources because they do a good job.
02:11:03.000 But we still want rule of law because there are places where markets will have a financial incentive for things that really harm everybody like complete destruction of environments or organ trades or whatever it is.
02:11:12.000 And so rule of law is intended to be a way that, if you have a government that is of, for, and by the people, and it's given a monopoly on violence, it can check the predatory aspects of markets, where the basis of the law, because of voting, is the collective values of the people.
02:11:28.000 But the state only has integrity and can check the markets if the people check the state.
02:11:35.000 Again, at a much smaller scale, it was easier to have transparency and being able to see what was happening.
02:11:55.000 What terrifies me is the solution of this is an autocratic government that controls all aspects of society so none of this ever happens.
02:12:03.000 That scares the shit out of me.
02:12:04.000 Because that seems to be where- there's that fuck.
02:12:07.000 Let's play this.
02:12:08.000 The Capitol!
02:12:10.000 Tomorrow?
02:12:11.000 But do it from the beginning.
02:12:12.000 I don't even like to say it.
02:12:13.000 Tomorrow, we need to go into the Capitol.
02:12:16.000 Into the Capitol.
02:12:18.000 What?
02:12:20.000 I don't even like to say it because I'll be arrested.
02:12:22.000 Well, let's not say it.
02:12:23.000 We need to go.
02:12:24.000 I'll say it.
02:12:25.000 All right.
02:12:26.000 We need to go in.
02:12:27.000 Shut the fuck up, Boomer.
02:12:28.000 To the Capitol.
02:12:33.000 We are going to the Capitol where our problems are.
02:12:37.000 It's that direction.
02:12:40.000 We spread the word.
02:12:42.000 All right, no David, one more thing.
02:12:44.000 Yeah, so can we go up there?
02:12:46.000 No?
02:12:46.000 When we go in, leave this year.
02:12:47.000 Are we going to get arrested if we go up there?
02:12:49.000 Yeah, you don't need to get shot.
02:12:50.000 Are you going to arrest us all?
02:13:02.000 Okay, I think we've seen enough.
02:13:04.000 There's a lot of instances.
02:13:05.000 It goes on for quite a while.
02:13:06.000 There's a lot of videos of this guy, which is really fascinating because I think these methods that they've used forever are kind of subverted by social media because you have 100,000 different cameras pointed at this guy.
02:13:20.000 When someone starts screaming loudly, people start filming it, and then you get a collection of these, and you can go, oh, what is happening here?
02:13:28.000 Like, I don't think they've realized that people would be so cynical that they would go over all these various videos and find this one guy who's not being prosecuted or arrested.
02:13:38.000 He's not being prosecuted or arrested.
02:13:40.000 Ding!
02:13:42.000 Congratulations.
02:13:42.000 I don't know.
02:13:43.000 No, he's not.
02:13:44.000 Look at that guy.
02:13:45.000 Yeah.
02:13:45.000 I mean, if you had a guess, if you had like 50 bucks, what are you going to put your chips on, red or black?
02:13:53.000 I might put my chips on the result of stochastic terrorism.
02:13:57.000 If I was China, I would have wanted to infiltrate the Facebook group that guys like him were in and just radicalize as much as possible so that some of them were motivated to do it earnestly.
02:14:06.000 So it was like some patsy, but I don't even know who it is.
02:14:09.000 Oh, for sure there's some of that going on there.
02:14:11.000 There's a lot of stuff going on with January 6th.
02:14:14.000 And it's a lot of sad humans who don't have a lot going on in their life.
02:14:19.000 Did you see the, what is it, Into the Storm?
02:14:23.000 Is that what it was, the HBO documentary on QAnon?
02:14:26.000 No.
02:14:26.000 Did you see it?
02:14:27.000 No.
02:14:27.000 It's fascinating.
02:14:28.000 It's really good.
02:14:30.000 And it's a multi-part documentary series about QAnon and the people that are involved.
02:14:36.000 And one thing you get out of it is that these people found meaning in this nonsense.
02:14:42.000 They found meaning.
02:14:43.000 And they really thought they were part of something bigger than them.
02:14:46.000 And it gave them hope and happiness.
02:14:48.000 And what I got out of that is, well, this is exactly what we were talking about earlier, the people that are getting sucked into this.
02:14:55.000 Totally.
02:15:01.000 This distraction life – most people don't feel like they live a meaningful existence.
02:15:01.000 So when something like this comes up and you get radicalized, whether it's by China or Russia or that guy and he's saying, you know, that guy's just basically incendiary, right?
02:15:10.000 He's just throwing gasoline on the fire.
02:15:12.000 You're saying, is there something out there that you can connect to that's bigger than you?
02:15:20.000 And they're saying, yes, there is.
02:15:21.000 You can be a part of this group.
02:15:23.000 You can be a patriot.
02:15:24.000 Are you a patriot?
02:15:26.000 Do you want to storm the Capitol?
02:15:27.000 And then you've got the fucking president.
02:15:29.000 Who's saying, you know, we have to make a big movement.
02:15:31.000 We have to do a big thing.
02:15:32.000 They stole this election.
02:15:34.000 You're like, holy shit.
02:15:35.000 You know, we have to go there and it's a show of force.
02:15:37.000 And then they pull them off of Twitter and like, oh my God, the conspiracy is even bigger than I thought.
02:15:42.000 Twitter's involved and it becomes something that is all encompassing.
02:15:47.000 It involves every aspect of their life.
02:15:50.000 They wake up in the middle of the night to check Twitter.
02:15:52.000 They take a leak and they check it and make sure that, you know, the movement is going in the right direction.
02:15:56.000 Has Q released a new drop?
02:15:58.000 And these fucking people get completely locked into it.
02:16:01.000 And at the end of this documentary on HBO, which is really excellent, I can't recommend it enough, you see a lot of them are realizing, like, this is all bullshit.
02:16:11.000 And they're like, what have I done with my life?
02:16:13.000 There's a Reddit channel called QAnon Casualties, which is like people, especially who struggle with family members, who've fallen down different rabbit holes, and I guess that's one of them.
02:16:21.000 And as people come out of it, just what happens?
02:16:23.000 I have a friend who just reached out about that, about his own wife.
02:16:26.000 He asked me, like, what can he do?
02:16:28.000 Yeah.
02:16:29.000 Well, I mean, I think what you're pointing to, our friend Jamie Wheel, who's here in Austin, we had him on our podcast to talk about this.
02:16:35.000 When we think about social media, a lot of times people think about it as an information problem, misinformation, disinformation.
02:16:40.000 It's actually about meaning and identity, which is what you're pointing to.
02:16:44.000 People are getting meaning and purpose from a thing.
02:16:46.000 And therefore, it's not a matter of like, well, let's just like tell people the true thing or the fact check thing.
02:16:51.000 There's a sense of meaning, purpose, narrative, what I'm participating in that's bigger than myself that people are seeking.
02:16:59.000 And part of that is exacerbated by social media, because it's mass alienation and loneliness.
02:17:04.000 And those are exactly the kinds of people that can be pulled in various directions, which includes also some of the decentralized ways that they can use those tools to cause havoc.
02:17:14.000 Something I was thinking is, in the founding of this country, it was...
02:17:18.000 It was understood that both high-quality education and a fourth estate, right –
02:17:22.000 a kind of free and open press – were considered prerequisite institutions for democracy to work.
02:17:27.000 You had that— You know what a fourth estate is?
02:17:29.000 Journalism, right?
02:17:30.000 Some kind of—but at that time—so both education and newspaper were the result of a printing press where you didn't just have a nobility class who had access to books when books were really hard to get, but we could print a newspaper so everybody could know what was going on.
02:17:44.000 We could print textbooks so everyone could get educated.
02:17:46.000 If you could have – at least that was the idea, right?
02:17:48.000 If we have a population where everyone can make sense of the world, like they've learned how to make sense of the world.
02:17:54.000 They've got history and civics and science and like that and they know what's going on currently, then they can all go to the town hall and participate in government.
02:18:02.000 So it was – it was acknowledged that without something like a fourth estate, a shared way to make sense of the world together, democracy doesn't work.
02:18:10.000 Facebook in particular is not just a destruction of the fourth estate.
02:18:14.000 It's like an anti-fourth estate: rather than sharing something where everybody gets the same information to then be able to go debate,
02:18:20.000 right now two different people will have Facebook feeds that have almost nothing in common and that are polarized, right, identifying your fellow countrymen as your most significant enemy, so that everything they think is wrong and a conspiracy and a lie or something like that.
02:18:33.000 Right.
02:18:33.000 But how do you rectify that and still have independent media?
02:18:39.000 Right.
02:18:40.000 So one of the things I was going to say that's interesting is that as we started to scale more, one of the things that newspaper and then with TV and broadcast became able to do was scaled propaganda, give the same message to everybody.
02:18:53.000 And there was this whole big debate in World War I and then going into World War II that democracy requires propaganda because people are too dumb to make sense of the world adequately.
02:19:02.000 So we have to propagandize them so they aren't fighting the war effort while we're in war.
02:19:07.000 One of the things that is interesting, just from a potential, and you'll say, yeah, but how do we get there because how do you incentivize the Zuckerbergs or whatever?
02:19:14.000 And the enactment is a real tricky thing.
02:19:19.000 You could use the tools of social media, which is the ability to personalize a huge amount of content to the individual, to make real democracy possible, where you don't need to give everyone propaganda because they're dumb.
02:19:32.000 You can actually help people understand the issue progressively better in a personalized way.
02:19:37.000 How are they already leaning?
02:19:38.000 Expose them to the other ideas.
02:19:39.000 See that they're understanding it.
02:19:41.000 And you can imagine that, like, real democracy could actually be possible at scale if you could personalize the kinds of education and civic virtue that would be necessary for people to engage in a democracy.
02:19:53.000 Let me add on to that, because this example you just showed me, right, with this guy, I had never seen that video.
02:19:58.000 Imagine a Thanksgiving dinner happening a few weeks from now where one set of people had been all exposed to this guy, and this is like the central way that they see January 6th, is through the lens of that guy.
02:20:09.000 If you're in one of the other filter bubbles, all you see is just the violent, crazy, whatever.
02:20:15.000 You are not even operating on a shared reality.
02:20:17.000 So when you talk about January 6th, normally if we have a shared printing press or we have a shared fourth estate, we've at least been exposed to some of the same things.
02:20:25.000 Yeah.
02:20:26.000 But when you show up at that Thanksgiving dinner table, when we argue about January 6th...
02:20:29.000 And you haven't seen something.
02:20:30.000 You haven't seen...
02:20:31.000 Right, but you assume our brains are not built...
02:20:33.000 Part of our paleolithic emotions is that we were built to assume...
02:20:37.000 My brain constructs reality from my eyeballs.
02:20:39.000 So, like, I have to assume...
02:20:40.000 I was built evolutionarily to assume that your brain is constructing reality from some of the shared stuff that I'm seeing with my eyeballs.
02:20:45.000 So all my biases are to assume other people are talking about the same reality.
02:20:49.000 And there's a little bit of a magic trick optical illusion because we both saw, quote unquote, January 6th, but we were exposed to completely different media sets.
02:20:57.000 So now when we get in a conversation, it completely breaks down, not because we're actually even talking about the same thing.
02:21:03.000 But because we don't even get to that layer of detail.
02:21:06.000 And one of the things in a humane technology world, I think I mentioned to you the More in Common research: they found that the more you use social media, the more likely it is that you are not able to predict what someone else's views are on a topic.
02:21:19.000 You think all Republicans are racist or something like that if you're on the Democrat side.
02:21:22.000 Or if you're on the Republican side, you believe that all Democrats are LGBTQ and only 6% of Democrats are LGBTQ. So we are far off in terms of our estimations of other people's beliefs.
02:21:32.000 And in the current world, the more you use social media, the worse your estimations are.
02:21:36.000 In a humane future, the more you use social media, the better our shared understanding and my understanding of your understanding would be.
02:21:44.000 And so you can imagine there's some sense maker out there who's showing both sides of these different filter bubbles and helping us bridge build.
02:21:51.000 So we're actually even able to have a shared conversation.
02:21:53.000 Those are the kinds of people that Daniel was just talking about would get kind of upregulated to be at the center of our shared undivided attention.
02:22:00.000 Let's say I wanted to say, how do I increase trust in our institutions that are processing things too complex for an individual to figure out on their own, like the reality of climate change or COVID? Well, let's say that C-SPAN-like, I had debates happen inside those institutions where people who had real expertise but had conflicting views had a long-form facilitated debate but not the type of debate that is just oriented towards rhetoric and gotchas to try to win but that is earnestly
02:22:31.000 trying to seek better understanding.
02:22:33.000 And there's a facilitated process and the people agree to it.
02:22:36.000 One of the things they agree to is what would I need to change my mind about this?
02:22:39.000 If the answer is nothing, then you don't even engage in the debate.
02:22:42.000 If we can't even say what would change our mind, we're not really responsible participants of a democracy because we're not really open to it.
02:22:48.000 And each of the debaters has to read each other's content first and agree to a facilitation process that's long form and we start with what do we all agree on?
02:22:57.000 Say we're looking at climate change.
02:22:58.000 What do we all agree on?
02:23:00.000 That means now, both around the values that are relevant and the facts of the matter, that where we go to what we disagree on, we know what our shared basis to derive a solution looks like.
02:23:10.000 Then we try to formalize our disagreement.
02:23:12.000 I believe X... I believe not X. And we say, what would it take to solve this?
02:23:16.000 Do we need to do a new piece of science?
02:23:17.000 Do we disagree because of an intuitive hunch or a different value?
02:23:21.000 We could do a much...
02:23:21.000 Or because Al Gore flies a private plane?
02:23:23.000 There's these common cynical narratives also that get in the way, right?
02:23:26.000 Because we're all just like, oh, well, it's just this one thing.
02:23:28.000 Or, you know, Ben Shapiro says, oh, the media doesn't like social media because it's losing its monopoly on truth.
02:23:32.000 Partial truth, but not complete.
02:23:33.000 Go on.
02:23:34.000 But we could have people who had...
02:23:36.000 Different views but were earnest and wanted to know what was true more than hold their own view, be able to engage in a process that could bring us to what is shared knowledge?
02:23:46.000 Where are there disagreements?
02:23:47.000 What is the basis of the disagreement?
02:23:49.000 What would it take to solve that?
02:24:04.000 I think I got the name right.
02:24:10.000 I think?
02:24:35.000 Because you can imagine a world where Facebook's like, oh, do you want to see more bridge building between January 6th?
02:24:40.000 Do you want to see more bridge building stuff on climate change?
02:24:42.000 And you could imagine sorting for that thing, right?
02:24:45.000 They could design it very differently.
02:24:47.000 Right, but you would have to change people's intuitions and change human nature because human nature is to seek conflict.
02:24:55.000 Most people seek conflict.
02:24:57.000 This is a fundamental question about how we view human nature.
02:25:01.000 It is true that the worse vices, the worse devils or whatever you call them, the worse angels of human nature are there within us.
02:25:08.000 Right.
02:25:08.000 But if that's what we assume is true, that that is the full story of who we are when we look in the mirror, then this story is over and we should just go do something else.
02:25:16.000 No, no...
02:25:18.000 I just think most people live these sort of unrealized lives.
02:25:24.000 You have a giant percentage of the population that is disenfranchised.
02:25:29.000 Totally.
02:25:29.000 And they're angry.
02:25:30.000 Yeah.
02:25:30.000 And they look for things that make them angry.
02:25:33.000 Yes.
02:25:34.000 So I think we have to address it at the root level before we address it even at a social media level.
02:25:39.000 That's why you had Johann Hari saying the opposite of addiction is not sobriety, it's connection.
02:25:43.000 People need meaning, purpose, connection.
02:25:46.000 And you can imagine a world where social media is like, hey, here's some drum circles or dance events or obviously post-COVID or whatever, but just...
02:25:52.000 Social media could be steering us.
02:25:54.000 We're making life choices every day.
02:25:55.000 When you look at a screen, it's basically allocating decisions of where time blocks are going to land in your calendar.
02:26:00.000 Most of those time blocks are like, spend another five seconds doing this with your phone.
02:26:04.000 But imagine social media becomes a GPS router for life choices, where it's actually directing you to the kinds of things that create more meaning.
02:26:10.000 Now, of course, the deeper thing is work inequality, meaning not existing in a lot of the work that people do.
02:26:16.000 Agreed.
02:26:17.000 And relationships.
02:26:18.000 Yeah, absolutely.
02:26:19.000 Right.
02:26:20.000 I mean, who's the angriest people?
02:26:21.000 Incels, right?
02:26:22.000 When people get really angry, you accuse them of being incels.
02:26:24.000 But we can imagine a world that facilitates ways for people to, you know, go to dance events together where they meet other people in a more, like, facilitated environment as opposed to you're going to sit there at home and, like, let's just get you swiping and Tinder's profiting from the attention of casinos so you match and then you never message someone,
02:26:40.000 right?
02:26:40.000 Yeah.
02:26:40.000 They profit from just like that machine working that way.
02:26:43.000 Right.
02:26:43.000 And then we also have the emergence of the metaverse where people are just going to be more incentivized to go into that because it's going to be very exciting.
02:26:49.000 Which is why a humane future is the online world has to care about actually like regenerating the connective tissues of the offline world.
02:26:56.000 If it doesn't do that, it's not going to work.
02:26:58.000 Apple could be in a position to do that.
02:27:00.000 You take it back to similar to people exercising and not eating too much sugar because those are...
02:27:05.000 The too much sugar is a hypernormal stimuli, right?
02:27:08.000 Remove the sugar, fat and salt from evolutionary food, which are the parts that create the dopamine hit and just make fast food out of it.
02:27:14.000 And in the same way of like what is fast food to real food is just the hypernormal stimuli.
02:27:20.000 That's what porn is to real intimacy.
02:27:22.000 That's what dating apps are to actually meeting people.
02:27:25.000 It's what social media is to real human relationships.
02:27:27.000 It's kind of just the hypernormal stimuli.
02:27:29.000 Right.
02:27:31.000 We know that GDP is not a good metric of the health of a society because GDP goes up when war goes up.
02:27:36.000 It goes up when addiction goes up.
02:27:38.000 The question of what is a good measure of the health of a society – one metric that I like, and no one is applying this metric, it's just a thought experiment – is that the inverse of the addiction in the society as a whole is a good measure of its health.
02:27:50.000 A healthy society produces less addiction, meaning more sovereign individuals because addiction creates a spike in pleasure and then an erosion of their baseline of pleasure.
02:27:59.000 And baseline of health fulfillment in general.
02:28:02.000 One of the reasons we're so susceptible to hypernormal stimuli is what you're saying is because we live in environments that are hyponormal, like not enough of the type of stimuli that we really need, which is mostly human connection, creativity, and meaning.
02:28:15.000 Right.
02:28:15.000 And so at the basis of it is like, how do we actually increase those is the only way that we become unsusceptible to the supply side marketing that appeals to...
02:28:26.000 Yes.
02:28:27.000 And it's interesting to think about if Apple were to take the, you know, small percentage of people who opt into tracking their usage statistics, and they could actually measure for a given country, hey, this is the percentage of people that are addicted based on usage patterns.
02:28:39.000 Again, it's privacy respecting and everything, and reporting that back to society so there's a better feedback loop between...
02:28:50.000 I mean, again, Apple's in this really unique position where their business model is not addicting people, polarizing people.
02:28:55.000 You know, they could actually make their whole system about how do we have deeper choices in the real world.
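To make the thought experiment above a bit more concrete, here is a minimal sketch, assuming a purely hypothetical opt-in usage sample, of how a country-level "addiction prevalence" figure could be computed and turned into the inverse-of-addiction health index Daniel describes. The field names, thresholds, and numbers are illustrative assumptions, not anything Apple measures or anything stated in the conversation.

```python
from dataclasses import dataclass

# Hypothetical opt-in usage record; the field names and thresholds below are
# illustrative assumptions, not anything Apple actually collects.
@dataclass
class UsageStats:
    country: str
    daily_hours: float   # average hours of phone use per day
    night_checks: int    # average phone checks between midnight and 6 a.m.

ADDICTION_HOURS_THRESHOLD = 6.0  # made-up cutoff for this sketch
ADDICTION_NIGHT_CHECKS = 3       # made-up cutoff for this sketch

def is_addicted(u: UsageStats) -> bool:
    """Crude, assumed classification of an 'addictive' usage pattern."""
    return u.daily_hours >= ADDICTION_HOURS_THRESHOLD or u.night_checks >= ADDICTION_NIGHT_CHECKS

def health_index(sample: list[UsageStats], country: str) -> float:
    """The inverse-of-addiction thought experiment: 1.0 means no addictive usage
    in the opt-in sample, 0.0 means everyone in the sample qualifies."""
    in_country = [u for u in sample if u.country == country]
    if not in_country:
        return 1.0
    prevalence = sum(is_addicted(u) for u in in_country) / len(in_country)
    return 1.0 - prevalence

# Toy, purely hypothetical sample; only the aggregate figure would ever be reported.
sample = [
    UsageStats("US", 7.5, 4),
    UsageStats("US", 2.0, 0),
    UsageStats("US", 5.0, 1),
]
print(round(health_index(sample, "US"), 2))  # 0.67: one of three opt-in users is flagged
```

The privacy-respecting piece Tristan mentions would amount to only the aggregate index ever leaving the sample, never the individual records.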
02:29:01.000 Well, there is a movement in society currently to try to get people to recognize, through radically induced introspective thought brought on by psychedelics, what the problems of our society are. Not necessarily the problems of these choices,
02:29:20.000 but the problems you're talking about, like indulging primarily in these choices, whether it's porn or fast food or gambling or alcohol or whatever these problems are that people have. There are certain psychedelic compounds that allow you to see yourself in an incredibly ruthlessly introspective way that'll allow you to make radical changes.
02:29:42.000 And those are all illegal right now.
02:29:44.000 And there's a lot of great work being done right now with MAPS, where Rick Doblin's organization has worked to try to introduce these compounds to specifically help soldiers deal with PTSD. It's a big one.
02:30:00.000 And I think through that and through their advocacy and the understanding that this stuff is very effective, whether it's through MDMA or whether it's through psilocybin, through some of the research they're doing with that, that there's a way to get a view outside of the pattern,
02:30:18.000 this deeply cut groove that you're stuck in.
02:30:22.000 The default mode.
02:30:23.000 Yes.
02:30:23.000 And I think if we're dealing with anything that is a possible potential real solution for radically re-engaging thought, for changing the way we interface with each other and with society in general, I think that's it.
02:30:39.000 And I think the fact that that is illegal currently is one of the big problems, one of the big barriers between us changing the way our culture operates and what we find to be important.
02:30:56.000 Yeah, I mean, you remember so many of the, like, founding writings of the country said we need freedom of religion, but we actually need a religious people.
02:31:04.000 And what they were saying is, like, we don't care if it's Confucianism or whatever, but you need a people that have some transcendent values and morals that bind them to more than just their own self-interest.
02:31:13.000 Yes.
02:31:16.000 Love thy neighbor and give the benefit of the doubt and things like that.
02:31:52.000 Yeah.
02:32:00.000 I have seen narcissists and sociopaths get more severe that way using psychedelics because it just creates reinforcement.
02:32:07.000 Yeah, and gurus.
02:32:08.000 So it has to be like psychedelics in a community of practice where there is checks and balances on each other.
02:32:17.000 And ethics.
02:32:17.000 Exactly.
02:32:17.000 And I think also it can...
02:32:20.000 Move us away from this concept, to use Terence McKenna's words, a dominator culture.
02:32:25.000 That you can have advancement without having a dominator culture.
02:32:29.000 And you can have advancement where you seek to engage in the greater good of the whole.
02:32:35.000 And the choices can be made that way.
02:32:37.000 And I think, in many ways, it's one of the things that Apple does.
02:32:40.000 When Apple is talking about this world where they're creating less impact of advertisement by not having you tracked across all apps and allowing you to choose whether or not apps track you.
02:32:56.000 That's a bold move in this world where everybody is trying to accentuate the influence that apps have and the amount of engagement they have and to be able to use advertiser money and to be able to generate more of it through that.
02:33:13.000 It's so attractive to people.
02:33:15.000 Look, Android is just a big data-scooping machine, right?
02:33:20.000 I mean, they're tracking everything and anything.
02:33:22.000 And it's one of the things they said about TikTok, when software engineers first looked at it, they're like, Jesus Christ, this is tracking everything.
02:33:30.000 And it's one of the most invasive of the applications.
02:33:34.000 Why TikTok is not considered a major immediate national security threat, I still don't understand.
02:33:39.000 I mean, if Russia in the Cold War was running the media programming for the United States for all of its youth, like, that's insane.
02:33:48.000 There's actually a specific strategy the CCP uses called borrowing mouths to speak.
02:33:53.000 So you can imagine when anyone says, any Western voice in the U.S. speaks positively of the CCP, you can just add a little inflation.
02:34:00.000 They just get a little bit more boost than someone, because you're more trusting of someone who's a Western voice than of someone who's from, say, the CCP or China.
02:34:08.000 And so that's one of the invisible ways you can steer culture.
02:34:12.000 But going on back to the Apple point, we all sound like we're promoting Apple in this podcast, and I just wanted to say this.
02:34:17.000 Well, we're kind of promoting good faith companies that are moving in the right direction.
02:34:22.000 Yeah, and you had John Mackey on Whole Foods.
02:34:23.000 We went to Whole Foods last night and talked about how that's creating an environment for health and trying to at least couple better.
02:34:29.000 It's not perfect.
02:34:30.000 It's just coupling better towards we can make money by helping things be healthier.
02:34:34.000 Apple could say, we're going to couple our business model, put on the Johnson & Johnson, whatever you think of Johnson & Johnson, but you can...
02:34:39.000 We're going to orient our business towards long-term health of people.
02:34:42.000 We're going to do Apple Fitness.
02:34:43.000 We're going to do things like that.
02:34:44.000 We're going to change the app stores to put down in the shelf space all the toxic social media stuff, if not take it off completely, and put back on what are the things that help people connect with each other and connect with each other in person.
02:34:55.000 Part of that is it's actually kind of hard to host.
02:34:58.000 There's certain people in a community who are kind of the event hosters.
02:35:01.000 They're the people that bring people together.
02:35:03.000 And right now, I mean, they're good at it, but imagine that was just like a hundred times easier.
02:35:06.000 I'm not trying to sell anything.
02:35:07.000 I don't have any product or anything like here in thinking about this, but there are people who work on how do we make it easier to bring people together in the physical world?
02:35:15.000 And if we made that a lot easier than it is today, so that was happening more often, so that when you thought about what you wanted to do, instead of I could open up this app or that app, I felt in my own community, in my physical community, I think we're good to go.
02:35:45.000 Again, this is part of a longer-term trend and transition of how you get out of this.
02:35:49.000 But I do think that we have to make the choices that are fulfilling as easy to make as the choices that are not fulfilling but have the hypernormal stimuli instant hit.
02:35:57.000 I was thinking about something, Joe, when you were asking, like, what are the solutions?
02:36:01.000 And jumping quickly to why some proposed solutions don't work, which is true.
02:36:06.000 It's like you think about what are the nutrients the body needs?
02:36:09.000 You can die just from a vitamin C deficiency even if you have all the B vitamins, vitamin D, etc.
02:36:15.000 And so it's like the body doesn't need a nutrient.
02:36:17.000 It needs minimum levels of lots of nutrients.
02:36:19.000 The same as like how do you get buff?
02:36:21.000 Which muscle do you work?
02:36:22.000 Well, you have to work all the muscles and you have to work them in lots of different ways.
02:36:25.000 How do you make a functional civilization?
02:36:27.000 Do you do that through culture?
02:36:29.000 Do you do it through collective efforts or individual efforts?
02:36:32.000 Do you do it through technology?
02:36:33.000 Do you do it through changing markets or states?
02:36:36.000 We propose that there's some stuff in all those areas that has to happen simultaneously that drives virtuous cycles.
02:36:43.000 And otherwise, it's kind of like answering the question of like which nutrient do you need or which muscle do you need to work out?
02:36:47.000 Like the problems are complex.
02:36:48.000 They have many different causes.
02:36:50.000 And all of the kind of single solutions might do something but end up failing.
02:36:55.000 And so we have to also – and this is again something that's very hard to do when attention spans are getting shorter and shorter – is how do we actually look at a whole ecosystem of solutions that collectively can start to address it even though any of them individually can't?
02:37:10.000 Yeah.
02:37:11.000 So we're going to hit the brakes and go backwards or go in a completely different direction than the culture seems to be going in.
02:37:18.000 So we, you know what I mean?
02:37:19.000 We have to not just stop our forward, it's not even forward momentum, the general direction that we're going in.
02:37:26.000 Well, I think with things like The Social Dilemma, which was seen by 150 million people, and Frances Haugen's stuff coming out, and people having a negative reaction to the metaverse.
02:37:36.000 I don't know that many people who saw it and was like, yeah, let's totally do that.
02:37:38.000 Obviously, they have asymmetric marketing power.
02:37:40.000 They're going to put billions of dollars into funding this thing.
02:37:43.000 They're hiring 10,000.
02:37:44.000 Let's talk about that.
02:37:45.000 What are they doing?
02:37:46.000 Because that commercial where the tiger is talking to the buffalo and then all the kids are dancing, I don't know what the fuck is happening.
02:37:52.000 I mean, I don't know what's happening in that example, but it's a race to control the whole experience.
02:37:58.000 I mean, the reason that Facebook is doing the metaverse, Zuckerberg doesn't like the fact that Apple has controlled his destiny by controlling the operating system inside of which Facebook has to sit.
02:38:08.000 And then all the various ways that whether they make advertising like they did recently, the privacy tracking stuff, it makes him not have control over his destiny.
02:38:16.000 And so if you own the whole platform, bottom to top, it's a classic vertical integration.
02:38:20.000 If I own the entire stack, I can control the entire thing.
02:38:24.000 And then I own my own destiny.
02:38:26.000 That's what this is really about.
02:38:27.000 And it's going to become a land grab between these companies for who can sort of own the next metaverse platform.
02:38:33.000 It's a fascinating thing to observe when you're watching someone who has ungodly amounts of wealth clearly, ambitiously pursuing more in a very transparent way.
02:38:51.000 What's interesting to psychoanalyze him a bit is that he has 55% of the ownership and voting structure shares of Facebook.
02:38:58.000 He has a very unique position.
02:38:59.000 There's never been, I don't think, a company as... He's a young guy.
02:39:17.000 Yeah.
02:39:17.000 How old is he?
02:39:19.000 Is he even 40?
02:39:23.000 So he, and then I think what he cares the most about is being seen as an innovator.
02:39:28.000 Like if I had to name it, it's not, like you said, it's not the money.
02:39:31.000 I think people always say, oh, it's just he's greedy.
02:39:33.000 He just wants the money.
02:39:34.000 No, I think it's, he wants to be seen as an innovator.
02:39:37.000 And if the world said the way you can be an innovator is not by...
02:39:42.000 Building more stuff that basically hollows out the physical world so we can make this unsustainable virtual world that collapses society.
02:39:50.000 You can be an innovator by actually fixing and helping to support the planet that we're on, the actual world that we're living in, the social fabric that needs to be strengthened.
02:39:58.000 I think?
02:40:17.000 About the gap between what his incentives are and what the world needs for basically sustaining it.
02:40:23.000 Well, also imagine if you've created something that says, whether or not he created it is a different debate, but you're the controller of something that's so massively influential on a global scale, and maybe he thinks that at least he's not evil.
02:40:38.000 Like, he may be trying to make money, and he may be trying to come off as an innovator, but he's not an evil person.
02:40:47.000 I don't get an evil sense off of Mark Zuckerberg.
02:40:50.000 He's kind of robotic.
02:40:51.000 He's odd.
02:40:52.000 He's odd in the way he communicates, but maybe that's like a social awkwardness in dealing with his own public image being broadcast to the world and comes off clunky.
02:41:01.000 People come off clunky when they're concerned with how people view them.
02:41:07.000 Maybe that's it.
02:41:08.000 But imagine just giving that up.
02:41:11.000 I'm gonna back out now like, you know, Jeff Bezos is leaving Amazon and he's gonna like hand it over to another CEO. Imagine him handing over Facebook to some other person and watching them fuck it up or watching them take this insanely powerful thing and actually make it more evil or make it more destructive but more profitable.
02:41:34.000 That's a Total possibility, right?
02:41:38.000 I mean, if they just went full capitalist and some really ruthless CEO got a hold of Facebook and they said, listen, our bottom line is, like, we're trying to increase the amount of money that our shareholders get off of this, and what we're going to do is we're going to make these choices.
02:41:53.000 And these choices might not be that popular with analysts and with people that are, you know, sort of trying to examine culture and the impact that social media has on it, but for us, it's going to be a windfall.
02:42:08.000 We were speaking with a friend who is in a senior position at Google working on AI and has come to the conclusion that a lot of people in senior positions in AI have come to: that something like artificial general intelligence is inevitable, and inevitable near term.
02:42:23.000 Near term like how many years?
02:42:25.000 Depends upon who you're talking to, but this was forms of it that are inevitable in the five-year time period.
02:42:31.000 Jesus.
02:42:32.000 Well, how come that's debatable?
02:42:33.000 Because some really intelligent people think it's off by 50 years.
02:42:37.000 Partly it has to do with how you define artificial general intelligence.
02:42:41.000 Are you defining it as something that is truly sentient and self-aware or simply that can beat us at all games?
02:42:47.000 Oh, okay.
02:42:48.000 Beat us at all games is already here, isn't it?
02:42:50.000 Well, pretty much.
02:42:51.000 And so then let's say you start applying that to beating us at the games of how to concentrate all the capital, right?
02:42:59.000 Because ultimately, the market is a game.
02:43:02.000 And most of the market is high-frequency trading run by AI now already.
02:43:06.000 If you do the super one, then you concentrate all capital into one thing.
02:43:10.000 Yeah.
02:43:10.000 It's actually worth noting.
02:43:14.000 It's a chess game, and you can out-compete.
02:43:17.000 If you can see more moves ahead on the chessboard than the other AIs, you beat the other AIs, and then you just move faster.
02:43:21.000 I know, but that's what's terrifying, is that it moves completely into the realm of AI, and it's outside of human comprehension.
02:43:30.000 We're not even in the game anymore.
02:43:32.000 So, specifically, his thinking was, it's inevitable that that will happen.
02:43:38.000 It will be dystopic.
02:43:39.000 There's no way for it to not be dystopic.
02:43:53.000 Actually, the only answer is to jack our brains in so that the meat suit is somewhat useful to the AGI. So now we're in a race to do that.
02:44:03.000 When people understand the catastrophic risks and they don't see any good possibility out, then oftentimes they will actually accelerate some version of a catastrophe as the only reasonable solution.
02:44:15.000 And this is why...
02:44:17.000 It's so important to actually define the design criteria right and have people committed to find solutions even though they're really hard.
02:44:23.000 And it's why I think something like this is interesting is truly a belief that a lot more people focused on what we need to be trying to solve is actually useful.
02:44:32.000 We think there's a lot of super smart people at a lot of universities and in institutions and… So let's start with the right design criteria,
02:44:50.000 right?
02:44:51.000 The design criteria of...
02:44:53.000 If you're adding tech that affects society, it has to actually be increasing the quality of democracy.
02:44:59.000 It has to be increasing the integrity of markets.
02:45:01.000 It has to be increasing the quality of families and mental health.
02:45:04.000 You look at what are the foundational things.
02:45:05.000 If it's not doing that, it failed the design criteria.
02:45:08.000 Similarly, the idea that we have these dystopias and these catastrophes, we need – and the catastrophes come from not being able to create order in the presence of the success of tech.
02:45:19.000 The dystopias come from top-down order.
02:45:21.000 So what that means is rather than have imposed order or no order, we need emergent order, which is what democracies and markets are supposed to do.
02:45:28.000 But they haven't – they have to be upregulated, a new, more perfect union that's upregulated to the level of tech we have because the tech has advanced so far.
02:45:36.000 We need new versions of it.
02:45:37.000 So how do we bring about emergent order of or by the people that can direct the tech to not be catastrophic but isn't dystopic?
02:45:46.000 I just want a lot more people thinking about that.
02:45:49.000 I want a lot more smart people at MIT and Stanford and the State Department and wherever and in Ethereum working on those issues, proposing things, finding out what's wrong with them so that the collective intelligence of the world is centrally focused on How do we make it through the metacrisis?
02:46:06.000 How do we make it through the fact that we are emerging into the power of gods without the ability to steward that well?
02:46:13.000 What would it take to steward it well?
02:46:16.000 What will it take?
02:46:20.000 Well, a lot of people working on what it will take and coming up with partial answers is part of the answer, right?
02:46:26.000 That's what we're kind of saying right now.
02:46:27.000 When we started the Manhattan Project, we didn't know all the ways that it was going to come together, right?
02:46:32.000 So there is a leap of faith.
02:46:34.000 We have to be comfortable with the uncertainty.
02:46:35.000 It's one of the developmental qualities that's needed.
02:46:37.000 It's like we don't know how to make it through this.
02:46:39.000 Like, got it.
02:46:40.000 Step into that reality.
02:46:42.000 Take a breath into that.
02:46:43.000 Yeah.
02:46:44.000 And we have to figure out how we're going to do this.
02:46:46.000 We have to refresh the way people felt after they saw The Social Dilemma.
02:46:51.000 Because the problem is they waited about two weeks and then they got right back to their normal consumption.
02:46:55.000 Maybe not even two weeks.
02:46:57.000 100%.
02:46:57.000 And I think that we have a very short memory when it comes to things that are impactful and really sort of...
02:47:06.000 Jog your view of reality, you know, it's so easy to slide right back into it, and I think there has to be an effort where we remind people, we remind each other,
02:47:22.000 we remind ourselves, whether it's a hashtag, whether it's some sort of an ethic, a movement, an understanding, like we're moving in the wrong direction.
02:47:31.000 And we need to establish that as a real clear parameter, like we've got a problem here.
02:47:40.000 I think people do get that.
02:47:41.000 A lot of people.
02:47:42.000 They do.
02:47:42.000 The Social Dilemma.
02:47:43.000 People got it.
02:47:44.000 A lot of people.
02:47:44.000 The Facebook Files stuff.
02:47:46.000 People's negative reaction to the metaverse.
02:47:47.000 Yes.
02:47:48.000 Yes, there's a lot of power on the other side of the table, right?
02:47:51.000 We've got trillions of dollars of market power.
02:47:53.000 The question is, are we the people, the culture, going to be able to identify what we don't want and then steer ourselves in the direction of what we do?
02:47:59.000 But are we operating in an echo chamber where we're talking to a lot of people that are aware of it?
02:48:03.000 So when you say people are aware of it, like what percentage are we talking about?
02:48:07.000 Is it even 20?
02:48:08.000 Most people who watch The Social Dilemma walked away with something like tech is a problem.
02:48:13.000 It's kind of generally scary and it seems to be bad for teenagers and families.
02:48:18.000 What they didn't get is that it's fundamentally incompatible with democracy, because it polarizes the population, polarizes the representative class, creates gridlock, and makes it less effective relative to other forms of government.
02:48:29.000 This is also like...
02:48:30.000 Here we are.
02:48:31.000 We're at the three-hour mark here.
02:48:33.000 So we've been having this conversation, and even though we're doing our best to try to pin down solutions, and it's like...
02:48:42.000 This is a very ethereal thing.
02:48:44.000 It's a very...
02:48:45.000 It's just like...
02:48:46.000 It almost seems ungraspable, you know?
02:48:52.000 It just seems like you can nail down all these...
02:48:57.000 Problem issues, but then when it comes to real-world application, I'm like, what the fuck do you do?
02:49:03.000 Well, this show is going to air by bouncing off of satellites that are in outer space to be able to go to people's phones and computers using the most unbelievably advanced technology.
02:49:18.000 That's pretty ethereal.
02:49:19.000 It's actually very hard for people to grasp the whole scope of the technological complexity.
02:49:24.000 When you have that much technological complexity and that much technological power, we also have to be able to work on complex social systems that can make us able to wield it.
02:49:33.000 And we just haven't had the incentive and motive to do that.
02:49:37.000 But hopefully, recognizing where it goes if we don't is incentive for enough people to start working on innovation more.
02:49:43.000 But this technology, this fascinating and super complex technology is disconnected from human nature, from these thousands and thousands of years of human reward systems that are baked into our DNA. That's part of the problem.
02:49:57.000 Only because we have a whole system, trillion dollar market cap system, dependent on hacking and mining from those human vulnerabilities.
02:50:04.000 Because they've already done that.
02:50:07.000 And we instead reflect back in the mirror not the worst angels of our nature but the better angels of our nature.
02:50:12.000 We see examples of people doing the hard thing over the easy thing.
02:50:15.000 We see examples of people hosting events for each other and being better to each other rather than being nasty to each other.
02:50:20.000 We're just not reflecting the right things back in the mirror.
02:50:23.000 We do reward when people do those things, right?
02:50:26.000 Occasionally, but the social media algorithms don't reward them by and large, right?
02:50:30.000 They take a couple examples where the positive thing happens, but mostly we see the most engaging, outrageous, controversial thing.
02:50:35.000 And so we have to reflect back something else in the mirror.
02:50:37.000 I think it's like, if you remember the 1984 ad, and bring it back to Apple, and if you remember the ad for the Macintosh, the famous thing where there was a woman running down the...
02:50:45.000 Like, there's the booming big brother on the screen, and the woman's running down, and she takes this hammer, and she's wearing a Macintosh t-shirt, and she takes this hammer, and she throws the hammer at the screen, and it blows up.
02:50:56.000 And it says, on January 24, 1984, Apple will introduce Macintosh, and you will see why 1984 won't be like 1984. Yeah.
02:51:06.000 Was it 1984 that Apple came up with that computer?
02:51:09.000 Yeah.
02:51:10.000 And the reference was, we have to defeat Orwell.
02:51:13.000 We have to defeat that fiction.
02:51:14.000 That can't come true.
02:51:16.000 And we, I mean, it was specifically against IBM as being Orwell.
02:51:19.000 I don't think I've seen that.
02:51:19.000 Oh, you haven't?
02:51:19.000 It's worth seeing for people to check it out.
02:51:21.000 So this is pre-internet.
02:51:23.000 This is it.
02:51:23.000 Let's watch it.
02:51:26.000 This is one army on Earth.
02:51:28.000 We are one people.
02:51:30.000 With one will, one resolve, one cause.
02:51:35.000 Our enemies shall talk themselves to death, and we will bury them with their own confusion.
02:51:48.000 On January 24th, Apple Computer will introduce Macintosh.
02:51:53.000 And you'll see why 1984 won't be like 1984. Wow.
02:52:00.000 It's powerful.
02:52:01.000 That is powerful.
02:52:02.000 You know, we ended our last conversation with me giving you...
02:52:05.000 I was in high school.
02:52:05.000 Yeah.
02:52:06.000 I was actually born that year, which is crazy.
02:52:08.000 That's nuts.
02:52:09.000 That was aired during the Super Bowl, by the way.
02:52:13.000 It was rated the most successful television advertisement in TV history.
02:52:17.000 Steve Jobs had a direct role in it.
02:52:20.000 You can imagine...
02:52:22.000 We have to not let the Orwell-Huxley two gutters thing happen.
02:52:27.000 We have to throw a hammer down the middle and create a future where technology is actually humane and cares about protecting the things that matter to us.
02:52:36.000 One thing that gives me hope is that these kind of conversations are very popular.
02:52:40.000 You know, like The Social Dilemma is very popular.
02:52:43.000 Last one we did got like 9 million views or something like that.
02:52:45.000 Yeah, this one will probably be similar.
02:52:47.000 It's like people are interested in this conversation because they know it's a real issue, at least the kind of people that are tuned into this podcast.
02:52:55.000 And I think it's going to be like little baby steps into the correct direction.
02:53:03.000 And, you know, what I said about psychedelics is it's one of those...
02:53:06.000 Seemingly frivolous discussions.
02:53:10.000 People that don't have any psychedelic experiences, they don't realize the dramatic transformative impact that those things can have on cultures.
02:53:19.000 But we don't have much time.
02:53:21.000 No one has much time.
02:53:22.000 Not a fucking human that's ever lived.
02:53:24.000 A hundred years ain't shit.
02:53:26.000 And during this time of this lifespan that we've experienced, we've seen so much change.
02:53:33.000 And so much almost unstoppable momentum in the general direction, and it doesn't seem good.
02:53:39.000 But recognizing it, discussing it, and having documentaries like The Social Dilemma, having folks like you guys come on and discuss, like, what is really going on?
02:53:49.000 And we didn't even really get into a lot of the real technological dilemmas that we have.
02:53:54.000 You know, we basically glossed over the idea of drones and the idea of CRISPR and many of these other problems. Just watching that text-to-code thing going, oh my god, the barrier of entry has been eliminated.
02:54:10.000 Can you type now?
02:54:12.000 And you can code.
02:54:14.000 It's wild.
02:54:15.000 But hopefully through conversations like this and you putting attention on it, I mean, you are part of the sense-making world.
02:54:21.000 You are helping people make sense of the world.
02:54:24.000 And when you put your attention on it, I mean, I'm grateful for you creating this opportunity to talk about these things because, you know, they're heavy conversations and they're hard to look at.
02:54:33.000 Well, and it's actually important that you have these long-form podcasts, right, that are two-plus hours as opposed to...
02:54:40.000 Five-second clips or tweets. When we talk about tech, it has to enhance the fundamentally important things.
02:54:47.000 So we saw how tech, kind of specifically social media tech with the polarization algorithms, messed up the fourth estate; it's also messing up education.
02:54:54.000 It doesn't matter what you learn if you can't remember anything and you have no control of your attention.
02:54:59.000 And so one of the things is that tech has to actually be increasing people's attention.
02:55:05.000 Right?
02:55:05.000 Their attention span, their control over their own attention, and their memory.
02:55:09.000 If we were to be able to measure those things and say, let's upregulate that.
02:55:12.000 If you want a democracy to work, the tech should upregulate people's perspective seeking.
02:55:16.000 How much are they actually seeking other people's perspective?
02:55:18.000 And if I have a short attention span, I can't hold multiple perspectives simultaneously because you just can't fit that in a tweet.
02:55:24.000 It takes a while.
02:55:24.000 You just say one cynical perspective and I'm right and that's the only thing I'm going to think.
02:55:28.000 And so imagine that, like, instead of the short clickbait thing, because otherwise I'll bounce, if I actually read the longer thing and if my post had more nuance, that actually got upregulated.
02:55:37.000 So it created an incentive to take in other perspectives, to try to parse them and synthesize them.
02:55:42.000 That would be really different.
02:55:44.000 We've got to incentivize kindness, too.
02:55:46.000 You know, this willingness to engage in nonsensical arguments, it's just so common.
02:55:53.000 Twitter is the best example of that.
02:55:54.000 It's like a mental institution where people are just throwing shit at each other.
02:55:58.000 It's so wild to watch when you don't see examples like this in real life.
02:56:03.000 It's like accentuating the worst examples of the way people can communicate, but doing so in this weird public square.
02:56:11.000 It itself is a virtual reality.
02:56:12.000 If you think about just what it does, it's ranked by what's most engaging.
02:56:15.000 So it's like every dramatic event that happened with anyone anywhere, like little drive-by like, oh, you just cut me off on the freeway and I'm upset for a second.
02:56:23.000 Anywhere that happens anywhere, it just collects it all into this one efficient feed.
02:56:28.000 And then people are responding as if it's all happening at the same time.
02:56:31.000 It's already this weird chronologically distorted reality because it's pulling from all these different moments and making it seem as if it's all one moment.
02:56:39.000 So people need to just see.
02:56:41.000 Twitter is just bad.
02:56:42.000 And it affects people's minds where they think that this is the world that they live in, where it is this concentrated form of it.
02:56:53.000 You know, my friend Peter Attia was on the other day, and he was talking about how bad juice is for you.
02:56:59.000 Like, people think that juice is good for you, like orange juice.
02:57:01.000 He's like, it is such a sugar rush to your liver that your liver is like, what the fuck is all this?
02:57:08.000 Like, you're drinking like...
02:57:10.000 11 oranges' worth of juice, and it just goes straight to your liver, and your liver has a really hard time processing it.
02:57:17.000 This is almost like that, the social media version of that.
02:57:22.000 Like, your brain gets all these...
02:57:26.000 Impactful moments without all the regular life space in between them if you live a normal existence.
02:57:34.000 Instead, it concentrates it from billions of people all around the world and shoves it right through your fucking brain.
02:57:40.000 And we're not designed for that.
02:57:42.000 I had a period where I intentionally went and curated my Facebook algorithm where I followed all of the groups that look at police violence, cop block and those ones.
02:57:52.000 And so my feed just became filled with cops killing black guys and killing – escalating violence in ways that didn't look useful.
02:58:02.000 Now, of course, those videos also didn't show what happened beforehand to possibly justify it or not.
02:58:07.000 So like they were selected for a reason.
02:58:10.000 But even where they were egregiously wrong, they might be a very small statistical percentage of all police interactions.
02:58:18.000 But even though I knew I was curating my feed for this purpose, it emotionally affected me intensely just watching that many in a row.
02:58:25.000 But by the time I've watched 12, it feels like this is everything that's happening.
02:58:29.000 Right, right.
02:58:30.000 That's the whole world.
02:58:31.000 And then I got rid of those and I curated ones that were like pro-police, thin blue line kind of ones.
02:58:38.000 And you saw people aggressing against cops and you saw what they have to deal with.
02:58:42.000 And I was like, man, these guys are heroes.
02:58:44.000 And again, it only took like 12 videos.
02:58:46.000 And even though I was knowingly doing it to myself… It was that emotionally compelling because we are used to evolutionarily seeing a world that is representative of the world.
02:58:56.000 But when there's so much data that 0.01% of it is more information than I can possibly take in and it can be totally statistically unrepresentative but it still affects what I feel the world is, you can see how… Earnest people can get completely polarized.
02:59:13.000 That's such a good point.
02:59:14.000 And it is earnest people.
02:59:16.000 And the fact that you are consciously curating it and still having this effect on you, but you at least can objectively express it to other people.
02:59:23.000 And, you know, hopefully that gets into some people's brains and they see how dangerous this stuff is.
02:59:31.000 And this is also why these troll farms exist.
02:59:35.000 Because they can really influence the way our culture moves and behaves and the way it thinks about itself.
02:59:43.000 Gentlemen, thank you very much for being here.
02:59:45.000 This was...
02:59:47.000 Terrifying and daunting.
02:59:49.000 I mean, I feel good, but I also don't.
02:59:53.000 It's hard.
02:59:54.000 I get it.
02:59:54.000 I feel like we should another time come back and talk about more concrete pathways for this stuff.
03:00:00.000 Yeah, let's do it.
03:00:01.000 Let's do it.
03:00:01.000 Let's give it a couple of months and hope things don't turn to nuclear war.
03:00:06.000 I'll tell you why it feels inspiring to me, and thank you for having us here, is...
03:00:10.000 There's a lot of people who are focused on systemic injustice or climate change or economics or AI issues, but how do all these issues fit together and how do we actually deal with the fact that we've created so much technological power and we've had such a huge impact on our environment through the whole industrial use of technology that the world's increasingly fragile.
03:00:33.000 These aren't just separate issues.
03:00:35.000 They are related.
03:00:35.000 There is a kind of global meta-crisis and there is a need for real innovative solutions in how we think about it.
03:00:41.000 And I think because of this, more people will at least be thinking about that and then be talking about it.
03:00:46.000 And that means more of the collective intelligence of the world hopefully being able to start to work on solutions.
03:00:51.000 And I am hopeful about that.
03:00:53.000 I'm hopeful about that, too.
03:00:54.000 Let's end on a positive note.
03:00:56.000 Gentlemen, thank you very much.
03:00:57.000 I really appreciate you.
03:00:59.000 Thank you.
03:01:00.000 Bye, everybody.