In this episode, Tristan Harris, a former design ethicist at Google, returns alongside Daniel Schmachtenberger, who studies catastrophic risk. They talk about what it means to ethically influence other people's thoughts, concerns, and behaviors, the framework Tristan developed to address that question, the role of asymmetries of power, and why it's important to understand the relationship between what technology knows about us and what we don't know about ourselves. It's a really important conversation, and one that I think a lot of people need to hear. If you haven't seen it before, you should definitely check out the documentary "The Social Dilemma."
00:00:14.000I keep doing these podcasts where I just talk to people, so please introduce yourself and tell people what you do.
00:00:21.000I am Tristan Harris and came on the show about a year ago after The Social Dilemma came out.
00:00:27.000That's probably where most people know me.
00:00:29.000And I used to be a design ethicist at Google, studying how you ethically influence people's attention and thoughts and behaviors.
00:00:37.000And really enjoyed the conversation last year.
00:00:40.000The reason is that today I'm here with Daniel Schmachtenberger, who's really a person I've learned so much from over the last few years, and why I thought it'd be a good through line. What a daunting task.
00:01:11.000And what a weird thing that this industry that didn't exist 20 years ago has such a... I mean, think about life on Earth, and then 20 years ago all of a sudden this social media thing sort of evolves, and now you have to wonder how much of an effect it has on our day-to-day lives, and how to ethically influence people. Yeah.
00:01:40.000How do those thoughts even get, you know, how does that get worked out?
00:01:44.000Actually, I should first say that there wasn't at Google a department that said, how do we ethically influence people?
00:01:49.000I actually sort of, as was shown in the film The Social Dilemma, wrote this presentation worried about how technology was influencing people's thoughts, concerns, behaviors, etc.
00:01:59.000And I studied persuasive technology at Stanford, which is a whole discipline and field, the idea that technology can influence people.
00:02:06.000And it was out of my own personal concern that when that presentation went viral at Google, I kind of worked my way into this position that never existed before, which was how could we create a framework for what it means to ethically influence other people?
00:02:21.000And a lot of that has to do with asymmetries of power.
00:02:23.000I mean, when I was a kid, I was a magician.
00:02:34.000And actually across some of these things we're going to talk about today are ways that there is an asymmetric relationship between what technology knows about us and what we don't know about ourselves.
00:02:44.000When you were studying at Stanford, what year was this?
00:02:48.000This was 2002 to 2006. I was an undergrad, and then in 2006 I got involved.
00:02:54.000With Professor B.J. Fogg, who, again, actually studied ways that persuasive technology could be used for positive purposes.
00:03:01.000Like, how do you help people be healthier?
00:03:08.000But I got concerned because it was all of...
00:03:12.000This increasing arms race to use persuasive tools to harvest and capture people's attention, now known as the race to the bottom of the brainstem, to go down the brainstem into more social validation, more social narcissism, all of that.
00:03:25.000And that's one of the arms races we see everywhere, which is like in every single thing.
00:03:29.000If one oil company doesn't drill for that oil well, the other one will.
00:03:33.000If one attention company doesn't add the beautification filter, the other one will.
00:03:37.000If one company doesn't do narcissism, social validation hacking, and likes and variable rewards, the other one will.
00:03:44.000And it's true across so many of the other issues that we're facing, whether it's like, if I don't build the drone for everyone, then someone else is going to build the drone for everyone.
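The dynamic described here is a classic multipolar trap. Below is a minimal sketch, with entirely made-up payoff numbers, of why each company defects (drills the well, ships the filter) no matter what the other does:

```python
# Illustrative sketch of the "if I don't do it, the other guy will" trap,
# modeled as a two-player game. All payoff numbers are invented.

PAYOFFS = {
    # (my_action, their_action): (my_payoff, their_payoff)
    ("hold back", "hold back"): (3, 3),  # both abstain: healthier ecosystem
    ("hold back", "defect"):    (0, 5),  # they capture my attention share
    ("defect",    "hold back"): (5, 0),
    ("defect",    "defect"):    (1, 1),  # the race to the bottom
}

def best_response(their_action: str) -> str:
    """Pick the action that maximizes my payoff, given the other company's move."""
    return max(("hold back", "defect"),
               key=lambda mine: PAYOFFS[(mine, their_action)][0])

# Defecting dominates either way, so both companies end up defecting:
print(best_response("hold back"))  # -> defect
print(best_response("defect"))     # -> defect
```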
00:04:03.000iPhone came out in 2007. We were studying persuasive technology and I was, as I've said in the past, partners with the co-founder of Instagram in the persuasive technology class.
00:04:13.000So we were actually studying how would you apply persuasive technology to people before the iPhone even existed.
00:04:20.000And, you know, what bothered me is that I think when people think about how do you ethically persuade people, you just get into a whole bunch of ethical cop-outs.
00:04:29.000Like, well, we're just giving people what they want.
00:04:32.000You know, or if they don't want this, they'll use something else.
00:04:35.000There's these very simple ways that the minds of people in technology, the tech industry, I think defend what they're doing.
00:04:41.000And what concerned me was that the ethical framework wasn't really there.
00:04:44.000Not that I had one at the time, by the way.
00:04:45.000I studied at Google for three years to try to...
00:04:47.000Like, what does it mean to ethically influence three billion people who are jacked into the system?
00:04:52.000And this is before Cambridge Analytica, before the Facebook files and Frances Haugen talking about that, you know, we now have the receipts for all these things.
00:04:59.000So we talked about all these things in The Social Dilemma.
00:05:01.000But now there's the evidence with Frances Haugen's whistleblowing that, you know, Instagram makes body image issues worse for one in three teenage girls.
00:05:09.000I know I'm going fast, but that's the broad strokes.
00:05:12.000Do you know the conspiracy theory about her?
00:05:15.000The conspiracy theory amongst the tinfoil hat folk is, first of all, she started a Twitter account like right before she went there and was immediately verified.
00:05:29.000And then instantaneously was on all these major media outlets, major network television shows and being interviewed.
00:05:37.000And she was saying something that a lot of people felt like was a call to authoritarian intervention into social media.
00:05:47.000That government censorship was the solution, and regulation was the solution to dealing with this problem, and that it seemed like she was a sanctioned whistleblower.
00:05:58.000She was saying all the things that they wanted to hear, and that's why they put her in the position to make a big loud noise.
00:06:06.000What did you think about that when it came up?
00:06:09.000I always have to do this, you know, when something like that happens, like, hmm, maybe, maybe, because you know the government would do that.
00:06:17.000Most certainly, they would love to have control over social media.
00:06:20.000They would love to be able to censor things like the Hunter Biden laptop story.
00:06:24.000They would love to be able to hide Joe Biden's medical records or Kamala Harris's time as a prosecuting attorney.
00:06:34.000There's a lot of stuff they would like to do.
00:06:38.000There's a lot of stuff they would like to do with access to information.
00:06:42.000I mean, you're seeing it right now in terms of one of the things that's been fascinating about COVID. During this pandemic and during this terrible time of paranoia and dealing with this disease and fear and anxiety, you're seeing this narrative from social media networks that absolutely walk step-in-step with the government,
00:07:05.000where if the government wants certain information censored, it's being censored across major social media platforms that has to be coordinated.
00:07:16.000There's no way it's not, and there's no way they're incentivized to not have people discuss certain things, because we've said before, you know, it's one of the major points of the social dilemma, is that things that are controversial,
00:07:32.000whether they're true or not, are the things that are the most clicked on, the most shared, and that's where the money is.
00:07:41.000So there's got to be some sort of incentive for them to not do what they do with every other subject, whether it's immigration or gun control or abortion or anything.
00:08:01.000I mean, the border crisis is a great example of that.
00:08:06.000The government would probably like us to not see all those Haitian immigrants storming across the border, but my God, those were shared like crazy.
00:08:19.000Well, because there was a narrative that they could say, well, this is dangerous misinformation and we can protect people, even though some of it turned out to actually be accurate, like the lab leak hypothesis.
00:08:33.000It's a hypothesis that at least is being considered by virologists.
00:08:37.000But the point is, who the fuck are they to decide what can and can't be discussed?
00:08:44.000And when they're doing something step-in-step with the government, I get concerned.
00:08:49.000So when someone comes along and this person who's a whistleblower says, something needs to be done, you know, we're endangering young girls' lives, we're doing this, we're doing that, we need some sort of government intervention.
00:09:01.000I mean, this is essentially calling for censorship and calling for government control of social media, which freaks people.
00:09:08.000So she's pretty clear that she's not calling for censorship.
00:09:11.000But the reason I asked you was I was curious how it came across your radar, because I happened to know and hear a little bit about this from her.
00:09:21.000And the story that goes viral about her saying that she's a psyop or that she's a plant, that's an incendiary, inflammatory, controversial story.
00:09:31.000So when that gets suggested, is it going to go, is it just going to fizzle out or is it going to go viral?
00:09:39.000And in fact, when you kind of realize everything...
00:09:43.000I mean, there's some things that are real conspiracy theories, and there's some things that are real psyops, and that's a real thing.
00:09:48.000But notice how many things we think of as psyops, conspiracies, etc.
00:09:53.000now, and it's because anything that has that incendiary quality goes viral.
00:09:58.000And I happen to know, for example, I think one of the claims in there is that she's funded by this billionaire, Pierre Omidyar. But I happened to know from talking to her that that happened at the very, very end of what she was doing.
00:10:09.000And it was a tiny grant of like $150,000 for us in the nonprofit world.
00:10:13.000That's like a tiny amount of money basically just to support her flight costs.
00:10:16.000And I happened to also sort of hear from her how much of the media was constructed at the very last minute.
00:10:23.000Like she was working with this one newspaper, the Wall Street Journal, to do this sort of procedural rollout of specific stuff that she thought was concerning.
00:10:31.000I guess what I'll just say is like, What if she's just a good-faith person who saw that virality was driving people crazy and that it was harmful to teenage girls?
00:10:41.000And it's true that the government would see some of that and say, hey, we could use that for something else.
00:10:49.000She could be a tool for us to do something else.
00:10:51.000But I guess in the aim of complexity and nuance and not jumping to conclusions and this sort of thing, my perception from talking to her now extensively is she's a very good-faith actor who was concerned that this was going to drive the world apart.
00:11:05.000I should be really clear that this is not my position.
00:11:21.000Young girls are a point of focus for...
00:11:25.000Why they're a point of focus more than young boys, I'm not entirely sure.
00:11:29.000I guess it has to do with their emotional makeup, and there's higher risk of self-harm due to social media, and Jonathan Haidt talked about that in his book, The Coddling of the American Mind.
00:11:42.000And my kids, you know, my 13-year-old does have interactions with friends, and I do see how they bully each other and talk shit about each other, and they get so angry and mad at each other.
00:11:56.000It is a factor, but it's an algorithm issue, right?
00:12:01.000So the first thing is, just to kind of set the stage a little bit...
00:12:06.000I always use E.O. Wilson, the sociobiologist who sort of defined what the problem statement for humanity is.
00:12:14.000He said the fundamental problem of humanity is we have paleolithic emotions and brains, easy brains that are hackable by magicians.
00:12:21.000We have medieval institutions, you know, government that's not really good at seeing the latest tech, whether it was railroads or now social media or AI or deepfakes or whatever's coming next.
00:13:00.000Meaning, when you're on a news feed, like, I don't want to just show you any news.
00:13:04.000I want to show you the most viral, engaging, like, longest argumentative comment threads.
00:13:10.000So that's like pointing a trillion-dollar market cap AI at your brain, saying, I'm going to show you the next perfect boogeyman for your nervous system, the thing that's going to make you upset, angry, whether it's masks, vaccines, Francis Haugen, whatever the thing is, it will just drive that over and over again and then repeat that thing.
00:13:27.000And that's one of the tools in the arsenal to get attention, is that the algorithms.
00:13:31.000Another one is technology making design decisions, like how do we inflate people's sense of beautification filters?
00:13:38.000In fact, just recently, since we talked last time, I think it's an MIT Tech Review article showing that they're all competing, first of all, to inflate your sense of beauty.
00:13:51.000But they're competing for who can give you a nicer filter, right?
00:13:54.000And then now, instead of waiting for you to actually add one, TikTok was actually found to apply like a 2%, just a bare beautification filter, on the no-filter mode.
00:14:04.000Because the thing is, once they do that, the other guys have to do it too.
00:14:07.000So I just want to name that all of this is taking place in this race to capture human attention, because if I don't do it, the other guy will.
00:14:14.000And then it's happening with design decisions, like the beautification filters and the follow you, and if you follow me, I'll follow you back, and the like button, and check, pull, refresh, the dopamine stuff.
00:14:23.000Then there's the algorithms, which is I'm pointing a thing at your brain to figure out how can I show you an infinite feed that just maximally enrages you?
00:14:31.000And we should talk about that because that thing drives polarization, which breaks democracy.
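What such an engagement-optimized ranking loop looks like can be sketched minimally; the Post fields and scoring weights below are invented for illustration and are not Facebook's actual model:

```python
# A minimal sketch (not any platform's real code) of engagement-optimized
# feed ranking: score every candidate post by predicted engagement and
# show the highest scorers first. Outrage-adjacent content tends to win
# because it reliably produces clicks, comments, and long threads.

from dataclasses import dataclass

@dataclass
class Post:
    text: str
    predicted_clicks: float
    predicted_comments: float
    predicted_dwell_seconds: float

def engagement_score(post: Post) -> float:
    # Hypothetical weights; the point is only that the objective is
    # engagement, not accuracy or well-being.
    return (1.0 * post.predicted_clicks
            + 2.0 * post.predicted_comments
            + 0.1 * post.predicted_dwell_seconds)

def rank_feed(candidates: list[Post]) -> list[Post]:
    return sorted(candidates, key=engagement_score, reverse=True)
```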
00:14:38.000So how did you guys meet and how did this sort of dynamic duo come about?
00:14:45.000Yeah, I was working on studying kind of catastrophic risks writ large.
00:14:49.000You've had people on the show talking about risks associated with AI and with CRISPR and genetic engineering and with climate change and environmental issues.
00:15:18.000Are there any kind of societal generator functions of all the catastrophic risks that we can address to make a more resilient civilization writ large?
00:15:27.000Tristan was working on the social media issues. And when you had Eric on, he talked about the twin nuclei problem of atomic energy and genetic engineering, basically saying these are extremely powerful technologies that we don't have the wisdom to steward well.
00:15:44.000Well, in addition to that, it's all things computation does, right?
00:16:14.000You're impacting a billion people in deeper ways much faster, which means that if you're blind to something, if you don't know what you might be doing, the consequences show up faster than you can actually remediate them.
00:16:23.000When we say exponential tech, we mean a number of things.
00:16:26.000We mean tech that makes more powerful versions of itself, so it can use computer chips to model how to make better computer chips, and then those better computer chips can recursively do that.
00:16:33.000We also mean exponential speed of impact, exponential scale of impact, exponentially more capital returns, exponentially smaller numbers of people capable of achieving a scale of impact.
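The first of those meanings, recursive self-improvement, can be shown with a toy loop; the improvement factor here is a made-up constant:

```python
# Toy illustration of "tech that makes more powerful versions of itself":
# each generation's capability is used to design the next, so with a
# constant improvement factor, capability compounds exponentially.

capability = 1.0
improvement_factor = 1.5  # assumed: each generation yields a 50% better successor

for generation in range(1, 11):
    capability *= improvement_factor
    print(f"generation {generation}: capability {capability:.1f}")
# capability after n generations = improvement_factor ** n
```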
00:16:45.000And so when he's mentioning godlike powers and kind of medieval institutions, the speed at which our tech is having influences in the world and not just first order influences, the obvious stuff, but the second and third order ones.
00:16:57.000Facebook isn't trying to polarize the population.
00:17:01.000Polarization is a side effect of the thing they're trying to do, which is to optimize ad revenue.
00:17:05.000But the speed at which new technologies are having effects on the world and the total amount of consequence is way faster than regulation can keep up with.
00:17:12.000And just by that alone, we should be skeptical of any government's ability to regulate something that's moving faster than it.
00:17:19.000Faster than it can even appraise what the hell is happening in the first place.
00:17:22.000Not only that, you need someone who really understands the technology and you're not going to get that from elected officials.
00:17:29.000You're going to need someone who's working on it and has a comprehensive understanding Of how this stuff works, how it's engineered, where it goes.
00:17:38.000I mean, I'm skeptical of the government being able to regulate almost everything.
00:17:49.000Like where the nuclear pathways of escalation or the way a satellite or GPS could get knocked out triggers a nuke somewhere, that's also really complex.
00:17:58.000CRISPR, you know, bio stuff is complex.
00:18:00.000So in general, like one of the ways to summarize the kind of problem from our friend Zach Stein's kind of work is that the complexity of humanity's problems is going up like this, but the capacity to meet them is like not really meeting it.
00:18:12.000And then you add in social media and you polarize people and divide them into like they don't even know what's true because everyone's got their own personalized version of reality.
00:18:38.000And when that goes viral, everybody saw that.
00:18:40.000And they didn't see that, you know, the five senators who I talked to, who actually do really get these things pretty decently.
00:18:45.000Now, I'm not going to say, like, let's just, like, regulate it, but just to notice, right?
00:18:48.000So the cynical take about every time an institution makes a mistake, that thing goes viral, which means we lose trust in so many things, because no matter what the issue is.
00:18:58.000You were bringing up the conspiracy theory of, might the government have an incentive to make a plant like Frances?
00:19:05.000And so it's plausible, but plausible doesn't automatically mean is.
00:19:09.000One of the challenges is when someone has a confirmation bias, they hear something that's plausible and they just assume that it is without doing the due diligence of saying, what would I need to know?
00:19:17.000And you do a good job of checking that.
00:19:18.000We could also say, would Facebook have an incentive to say that she's a plant and try to hire a bunch of PR to do that?
00:19:24.000And they were helping to spread that story, by the way.
00:19:27.000I'm not saying they're responsible for it.
00:19:29.000I actually think that what happens organically, again, the cynical take goes viral.
00:19:34.000And then if you're Russia or China or you're Facebook in this case, you can be like, hmm, that's a really helpful cynical take from my perspective.
00:19:40.000In fact, one of the things that Facebook does try to do is turn the social media debate into a censorship or free speech debate because they know that divides the political class because they know that the right doesn't want censorship, obviously.
00:19:54.000And so they say the more they can spin whatever Frances is doing as she's claiming censorship, the more they can divide any possibility for actual action.
00:20:03.000In fact, I'll tell you just a quick story, really quick, is during the three-hour testimony that Frances gave, if you watch the full three hours, she had both people on the left and the right.
00:20:13.000And I've been working on this for eight years.
00:20:14.000I have never seen someone create a bipartisan consensus the way that she did.
00:20:18.000She actually did if you watch the video.
00:20:32.000Because the story went viral saying that she was a democratic operative and he said, my base will hate me if I meet with you.
00:20:38.000So the very thing we're talking about, which is the ability to regulate anything, is being broken and shattered because the incendiary controversial take on everything goes viral.
00:20:49.000Now, again, I'm not saying that because of this we should therefore just regulate.
00:21:42.000It just shows you that so much of what ideology is, is tribal.
00:21:47.000It's like you find a group that agrees to a certain pattern of behavior and thought, and you subscribe to that.
00:21:57.000Now, I am a right-wing conservative, I am a left-wing progressive, and then you just follow the playbook.
00:22:04.000And it makes it so much easier than having your own individual nuanced thoughts on complex and difficult issues like this.
00:22:11.000But the fact that he couldn't talk to her because his base would somehow or another think that she actually is a democratic operative and she does work for the government and is some sort of an attempt at censorship.
00:22:22.000And I'm sure not only is Facebook amplifying that, but All of the different Russian troll pages on Facebook are amplifying that, which confuses the water.
00:23:36.00085% of the Christians who saw that stuff in their feed, they didn't actually accept an invitation from the group or the page to say, yes, I want to subscribe to you.
00:23:47.000Facebook, because they're optimizing for growth, they changed the way the system works so that if a page invites you, that's enough for it to start putting the content in your feed.
00:23:56.000So there's an example in Frances's work where there was a QAnon person who invited 300,000 people in one day.
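The design choice being described, whether a mere invitation is enough to inject a group's content into your feed, can be sketched as a single flag; the names and structure here are hypothetical, not Facebook's code:

```python
# Hedged sketch of the growth-vs-consent design choice described above.

from dataclasses import dataclass, field

@dataclass
class User:
    accepted_groups: set = field(default_factory=set)
    pending_invites: set = field(default_factory=set)

def should_inject_group_content(user: User, group: str,
                                require_acceptance: bool) -> bool:
    if require_acceptance:
        # Consent-first design: content flows only after you opt in.
        return group in user.accepted_groups
    # Growth-first design (the change described above): a pending
    # invitation alone is enough for the group's posts to enter your feed.
    return group in user.accepted_groups or group in user.pending_invites
```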
00:24:53.000It's just Facebook makes that amazingly easy because it automatically already puts people into tribal groups That whatever the content is in that group is going to keep getting upregulated, optimizes for inflammation and tribal identity and those types of things.
00:25:08.000And so you don't have to kinetically attack a country to make the country so turned against itself that the polarized population supports a polarized
00:25:18.000representative class, which means you get gridlock on everything, which means you can't do effective governance, which means another country that does autocratic governance just wins geopolitically.
00:25:26.000It seems absolutely insane that they could, through one page, inviting people, instantaneously start to distribute all of their information on those people that they invited.
00:25:37.000So why would Facebook even allow that?
00:25:40.000So if I'm designing Facebook, you would probably say, Wait, wait.
00:25:43.000You just said the government should regulate social media.
00:25:49.000Well, this is – I don't think the government should regulate.
00:25:52.000But I do think there should be rules in terms of like – if you're a regular person that, say, has a specific group of interests – Say you only like motor cars.
00:26:07.000You like hot rods or whatever, and that's what you're interested in.
00:26:11.000You use Facebook when you're off duty at work and you just want to check some stuff out, and all of a sudden you get QAnon shit because they invited you into this QAnon group, and you start getting all this information.
00:26:24.000It seems like And again, I don't know what we should do in terms of regulation.
00:26:32.000But I don't think that social media groups should be able to just distribute information to people based on this concept of universal growth.
00:26:46.000I mean, if we were designing Facebook with a feature called groups, and groups had a feature called invitations, and you could invite people.
00:26:52.000Wouldn't you design it so that people have to accept the invitation for the group before it shows up in your feed?
00:26:57.000Why would Facebook not do it that way?
00:26:59.000Because what happened is, starting in 2018, people stopped posting as much on Facebook.
00:27:04.000So you and I, and maybe we used to post a lot more in 2016, 2017. If we stopped posting as much, oh shit, we can't harvest all that attention from people.
00:27:15.000Oh, just like people being more skeptical maybe of Facebook or just realizing they don't want to share as much or just usage burning out, more people moving to Instagram or TikTok.
00:27:22.000People are getting older as well, right?
00:27:35.000Oh, I've got this thing called Facebook groups where people are posting all the time.
00:27:39.000So I'm going to start putting that stuff in people's feeds to just – so now I'm fracking for attention.
00:27:44.000I'm going lower into all these other places to backfill this attention harvesting, we are the product machine.
00:27:50.000And how do you know, since there isn't rigorous identity, if a user that says they're a user is really who they are or if they're a troll farm or if pretty soon they're an AI GPT-3 algorithm?
00:29:06.000The ability to make arguments for vaccines or against vaccines using only real data, and then be able to show the financial vested interests of anyone arguing on the other side, and just have it create more data than people can parse in any reasonable amount of time.
00:29:25.000Like an academic-looking paper that's 10 pages long saying why the vaccine is not safe, citing real charts, real graphs, real statistics, and the real vested interests of people who are, say, pointing out that the vaccine is safe, who maybe have some connection to Pfizer or something like that.
00:29:39.000And it'll generate that full 10-page or 20-page document, and it'll take a team of statisticians a while to decode that thing.
00:29:46.000And you can flood the Internet with that kind of text.
00:29:51.000Through OpenAI and the GPT-3 algorithm, the ability to pass the Turing test in many areas.
00:29:55.000You should explain what the Turing test is.
00:29:57.000Meaning that if you're reading the text, you can't tell that it wasn't produced by a human.
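The flooding dynamic might look like the sketch below; generate() is purely a stand-in for a large language model call, not a real API, and the whole thing is illustrative:

```python
# Illustrative only: a hypothetical text-generation function being used
# to mass-produce superficially distinct, plausible-sounding documents.
# The asymmetry is the point: generating costs seconds, while debunking
# each document costs a team of statisticians days.

def generate(prompt: str, seed: int) -> str:
    """Stand-in for a large language model call (hypothetical)."""
    return f"[variant {seed}] plausible-sounding document responding to: {prompt}"

prompt = "Write a 10-page, chart-laden paper arguing against the mainstream view."
flood = [generate(prompt, seed) for seed in range(10_000)]  # 10,000 unique-looking documents
```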
00:30:48.000Because if you just read this and you didn't know, I don't really care if you disagree with my opinion as long as you don't call me a pedophile.
00:30:55.000If you were a real man, you would be with a young girl and take care of her and you would be a sex offender.
00:31:05.000One of the things people don't know, it actually was just developed over the summer.
00:31:08.000They announced it, just to track since we came and talked about some of these things last time: in August 2020, OpenAI released a video of using the same technology of machines generating stuff.
00:32:21.000And it writes the code in the right-hand side.
00:32:22.000So this video that Jamie pulled up on YouTube is OpenAI Codex Live Demo, and you can see this all happening while this person types in the data, and they're actually explaining it now, how this is going to work.
00:32:35.000Once you see it later, set its position to 500 pixels down and 400 pixels from the left, and then it just does that.
00:32:43.000Oh my god, look how quick it codes it.
00:33:02.000This is an example of a kind of deep point to think about for the state of the world as a whole is one of the things that exponential tech means is exponentially more powerful.
00:33:11.000I hate to tell you this, but get this thing right up in your face.
00:33:14.000Exponentially more powerful tech that's also exponentially cheaper, which also means more distributed.
00:33:19.000And so pretty soon this level of tech will not only be getting better but available to everybody.
00:33:25.000So what happens when you have an internet where not only do you have an AI that is curating the Facebook feed for the most sticky stuff, which usually means the most toxic stuff, and that's an AI that is curating human-made content.
00:33:37.000But now you have AIs that are creating content that also get to maximize for stickiness.
00:33:41.000And then you have the relationship between the curation and the creation AIs.
00:33:45.000Like how does anyone ever know what is true about anything again?
00:33:48.000So AI can create fake stories and the fake stories can be boosted up by these troll farms.
00:33:56.000Which themselves could be run by fake accounts and fake logic.
00:35:14.000Is the regulation that you have to have some specific level of clearance before you have access to it?
00:35:20.000But if that's the case, then you put it in control of the government and then also bad actors and other governments are going to just distribute it wildly.
00:35:30.000And how do you control that someone...
00:36:09.000Okay, you've got your coding problem, and you have this biology problem with CRISPR. How does a civilization navigate this without killing itself?
00:36:19.000Well, Daniel's going to be able to speak to a lot more of this.
00:36:22.000I just wanted to connect it first to social media so people see the through line.
00:36:25.000So I actually think that social media is its other kind of...
00:36:37.000And in the same way that that dangerous capacity, we're now seeing what...
00:36:41.000That dangerous godlike power was doing of steering three billion people's thoughts, personalized to them the thing that would most outrage, you know, boogeyman their lizard brain and their nervous system.
00:39:08.000I mean, and I'm not going to claim that everyone is just...
00:39:10.000That's an example, I think I would say, of I can basically go into a group of the Boogaloo Boys or, you know, Stop the Steal groups or something like that, and I can just see stuff that's like, hey, let's get our guns out.
00:39:38.000One gutter is like, let's lock it down with surveillance.
00:39:41.000Let's lock it down with Mark Zuckerberg controls everything.
00:39:43.000Let's lock it down with the government, tells us what we can and can't do on computers.
00:39:47.000And the other gutter, which is decentralized power for everyone, without people having the wisdom to wield that godlike power, or at least there's no evidence of that wisdom in people's own usage of it right now.
00:39:58.000Also, we've incentivized people to do destructive things just for likes.
00:40:11.000It's a population that is getting continuously more radicalized on all sides that simultaneously has continuously more powerful tools available to them in a world that's increasingly fragile.
00:40:26.000And so if you have an increasingly fragile world, meaning more interconnected global supply chains where a collapse somewhere leads to collapse everywhere, more sensitive infrastructure, things like that.
00:40:37.000If you have an increasingly fragile world, you have more and more radicalized people and you have those radicalized people having access to more and more powerful tech.
00:40:45.000That's just fragility across lots of different dynamics.
00:40:48.000And this is why the social media thing is so central is it's a major part of the radicalization process.
00:40:53.000It's both a major part of the radicalization process and itself an example of the centralized control censorship, which we don't want, and the decentralized viral means for everyone, which radicalize and enrage people and polarize democracies into not working.
00:41:06.000The thing is, in those two gutters, the gutters are getting bigger every day, like on each side.
00:41:11.000You've got more potential for centralized control.
00:41:14.000You've got China basically doing full control over its internet, you know, doing a bunch of stuff to top-down control.
00:41:20.000And the other side, you have more and more decentralized power in more hands, and that gutter is growing.
00:41:26.000The question is, how do you basically—we have to bowl a strike down the center of that alley, but it's getting thinner and thinner every day.
00:41:33.000And the goal is, how do we actually sort of—it's almost like a test, right?
00:41:37.000We are given these godlike powers, but we have to have the wisdom, love, and prudence of gods to match that set of capacities.
00:41:43.000You were just mentioning what China's doing to kind of regulate its internet.
00:41:47.000That's worth speaking about.
00:42:10.000This idea that they're slowly working their way into our everyday lives in this sort of inexorable way where you have to have some sort of paperwork or some sort of a QR code or something on your phone.
00:42:23.000That scares the shit out of me because you're never going to get that back.
00:42:28.000Once the government has that kind of power and control, they're going to be able to exercise it whenever they want with all sorts of reasons to institute it.
00:42:37.000But I will say also, just to also notice that everywhere there's a way in which a small move in a direction can be shown to lead to another big boogeyman and that boogeyman makes us angry, social media is upregulating the meaning of everything to be its worst possible conclusion.
00:42:53.000So like a small move by the government to do X might be seen as this is the first step in this total thing.
00:42:57.000I'm not saying that they're not going to go do that.
00:43:22.000So it's quite literally as if Xi Jinping saw The Social Dilemma, because they've, in the last two months, rolled out a bunch of sweeping reforms that include things like: if you're under the age of 14 and you use Douyin,
00:43:38.000which is their version of TikTok, when you swipe the videos, instead of getting like the influencer dancing videos and soft pornography, you get science experiments you can do at home, museum exhibits, and patriotism videos.
00:43:51.000So you're scrolling and you're getting stuff that's educating because they want their kids to grow up and want to be astronauts and scientists.
00:43:57.000They don't want them to grow up and be influencers.
00:43:59.000And when I say this, by the way, I'm not, just to be clear, I'm not praising that model, just noticing all the things that they're doing.
00:44:06.000If you're going to influence people, that's a great way to do it.
00:44:08.000They also limit it to three hours, sorry, 40 minutes a day on TikTok.
00:44:13.000For gaming, let me actually do the TikTok example.
00:44:16.000So they do 40 minutes a day for TikTok.
00:44:18.000They also, when you scroll a few times, they actually do a mandatory five-second delay saying, hey, do you want to get up and do something else?
00:44:26.000Because when people sit there, infinitely scroll.
00:44:29.000Even Tim Cook recently said, mindless scrolling, which was actually invented by my co-founder of the Center for Humane Technology, Aza Raskin.
00:44:50.000until 6 in the morning, if you're under 14, it's like it's closed.
00:44:54.000Meaning one of the problems of social media for teenagers is if I'm not on at one in the morning but all my friends are on and they're still commenting on my stuff, I feel the social pressure.
00:45:03.000I'm going to be ostracized if I don't participate.
00:45:05.000And if your notifications are on, your phone keeps buzzing.
00:45:23.000Because they're all competing for attention.
00:45:25.000So when you do this mandatory thing where you say we're going to close from 10 p.m.
00:45:29.000to 6 in the morning, suddenly everyone, if you're in the same time zone, it's another important side effect, can't use it at the same time.
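The policy mechanics just described can be approximated in a few lines; this is illustrative, not Douyin's actual implementation, and the constants come from the conversation above (the swipe count is assumed, since the transcript only says "a few times"):

```python
# Sketch of the described restrictions: a hard daily time budget and a
# curfew window for under-14 users, plus a mandatory pause interstitial.

from datetime import time

DAILY_LIMIT_MINUTES = 40                         # 40 minutes a day
CURFEW_START, CURFEW_END = time(22, 0), time(6, 0)  # 10 p.m. to 6 a.m.
SWIPES_BEFORE_PAUSE = 5                          # assumed: "a few times"

def may_use_app(age: int, minutes_used_today: int, now: time) -> bool:
    if age < 14:
        if minutes_used_today >= DAILY_LIMIT_MINUTES:
            return False
        # Curfew wraps midnight: 22:00-24:00 or 00:00-06:00.
        if now >= CURFEW_START or now < CURFEW_END:
            return False
    return True

def needs_pause(consecutive_swipes: int) -> bool:
    # Mandatory five-second "do you want to do something else?" prompt.
    return consecutive_swipes > 0 and consecutive_swipes % SWIPES_BEFORE_PAUSE == 0
```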
00:46:22.000So while we're spending all this money building physical borders, building walls, or spending $50 billion a year on the passport controls and the Department of Homeland Security and the physical...
00:46:32.000You know, if Russia tries to fly a plane into the United States, we've got Patriot missiles to shoot it down.
00:46:37.000But when they try to fly in a precision-guided information bomb, instead of responding with Patriot missiles, we respond with a white-glove Facebook algorithm that says, which zip code or Facebook group would you like to target?
00:46:52.000Typically, what made the U.S. powerful was the geographic...
00:46:55.000We had these huge oceans on both sides.
00:46:58.000It gives us a unique place in the world.
00:46:59.000When you move to the digital world, it erases that geographic asymmetry of power.
00:47:03.000So this is an imminent national security threat.
00:47:06.000This is not just like, hey, social media is adding some subtle pollution in the form of mental health, or hey, it's adding a little bit of polarization, but we can still get things done.
00:47:13.000It's an imminent national security threat to our continuity of our model of governance, which we want to keep.
00:47:20.000Have you spoken to congresspeople about this?
00:47:23.000Yes, but I'm hoping many more of them watch this, because I think people need to see the full scope.
00:47:27.000And I really do want to make sure we're not sounding like just full disaster porn, because we want to get to the point— Don't worry about that.
00:47:43.000It should scare people, because we're so far behind the eight ball.
00:47:47.000There's a really important point Tristan was just at that we actually need to double-click on, which is that democracies are more affected by what's happening with social media than authoritarian nations are, for a number of reasons. But do you want to... And we sort of hinted at it earlier,
00:48:03.000but when social media's business model is showing each tribe their boogeyman, their extreme reality, it forces a more polarized political base, which means to get elected, you have to say something that's going to appeal to a base that's more divided.
00:48:18.000And in the Facebook files that Frances Haugen put out, they showed that when Facebook changed the way its ranking system worked in 2018 to something called meaningful social interactions, I won't go into the details, they talked to political parties in Europe.
00:48:32.000It's 2018. They do an interview with political parties in Poland and Hungary and Taiwan and India.
00:48:38.000And these political parties say, Facebook, we know you changed your ranking system.
00:48:42.000And Facebook like smugly responds, yeah, everyone has a conspiracy theory about how we change our ranking system because those stories go viral.
00:48:50.000And they're like, no, no, no, we know that you changed how your ranking system works.
00:48:54.000Because we used to be able to publish, here's a white paper on our agriculture policy to deal with, like, soil degradation.
00:48:59.000And now, when we publish the white paper, we get crickets.
00:49:03.000And we tested it, and the only thing that we get traffic and attention on is when we say negative things about the other political parties.
00:49:13.000We don't want to run our campaign that's about saying negative things about the other party.
00:49:16.000But when you change the algorithm, that's the only thing we can do to get attention.
00:49:20.000It shows how central the algorithm is to everything else.
00:49:23.000If I'm Tucker Carlson or Rachel Maddow or anybody who's a political personality...
00:49:27.000Are they really saying things just for their TV audience?
00:49:30.000Are they also appealing to the algorithm?
00:49:32.000Because more and more of their attention is going to happen downstream in these little clips that get filtered around.
00:49:37.000So they also need to appeal to how the algorithm is rewarding saying negative things about the other party.
00:49:42.000So what that does is it means you elect a more polarized representative class that's based on disagreeing with the other side and being divided about the other side, which means that it throws a wrench into the gears of democracy and means that democracy stops delivering results.
00:49:56.000In a time where we have more crisis, we have more supply chain stuff and inflation and all these other things to respond to, and instead of responding effectively, it's just division all the way down.
00:50:05.000But it's been division from the jump even long before there was social media.
00:50:44.000If I'm a small business, I have to appeal to the algorithm.
00:50:46.000If I'm a newspaper, do I just like write the articles I want to write or the investigative stories, the fourth estate that we need for democracy to work?
00:50:52.000No, I have to write the clickbait title that's going to get attention.
00:50:55.000So I have to exaggerate and say Joe Rogan just takes horse dewormer because that's going to get more attention than saying he took ivermectin.
00:51:01.000Particularly in this world where no one's buying paper anymore.
00:51:05.000Everyone's buying everything, clicking online, so you really – and very few people are even subscribing, so you have to give them these articles and then have these ads in the articles.
00:51:14.000And those publishers – and that's also driven by the business models of these central tech companies, especially Facebook, Twitter, and Google.
00:51:21.000There's two feedback loops that he just mentioned.
00:51:24.000Politically, if you have Facebook and other platforms like this polarizing the population, then the population supports a more polarized representative class.
00:51:33.000But the representatives to be elected are doing political ads and so the political ads then further polarize the population.
00:51:41.000So now you have this feedback loop and then the same is also true with media.
00:51:44.000The media has to – meaning newspapers, television.
00:51:49.000Still has to do well on the Facebook algorithm because more and more there's a monopoly of attention happening there and it's someone seeing a clip there that has them decide to subscribe to that paper or keep subscribing to it or whatever it is.
00:52:00.000So you end up having the algorithm radicalizing what people want to pay attention to where then the sources of broadcast have to appeal to that, which then in turn further radicalizes the population.
00:52:39.000There's obviously many steps to this, right?
00:52:41.000So once you've kind of let this cancer sort of spread, if you take out the thing that was causing the cancer, we've now already pre-polarized everyone's beliefs.
00:52:51.000Like when you say, what's the solution to all this?
00:52:52.000Like all of our minds are running malware.
00:53:00.000Everyone thinks the other ones are, but not me.
00:53:02.000But the point is that all of us, like, it doesn't matter, like, people on all sides of the political aisles and all tribes, we've all been shown our version of the boogeyman, our version of the inflated thing that got our attention, and then made us focus on that and then make us double down and go into those habits of those topics being the most important.
00:53:22.000I almost think we need a shared moment for that.
00:53:24.000I wish The Social Dilemma was a little bit more of a...
00:53:26.000It was a shared moment, but I think there's almost like a truth and reconciliation moment that we need to unwind our minds from the cult factory.
00:53:35.000Because it's a cult factory that found each of the little tribes and then just sucked them together and made them in a self-reinforcing chamber.
00:53:42.000Let's say we take any issue that some people care about and think is central, whether we take social justice or climate change or US-China relations.
00:53:51.000If half of the population thinks that whatever – half the population has a solution they want to implement, carbon taxes or whatever.
00:53:59.000The other half of the population is polarized to think that that is going to mess everything up.
00:54:05.000So that other half are still political actors and they're going to escalate how they counter that.
00:54:11.000How do you get enough cooperation to get anything done especially where there are real issues and not just have all the energy become waste heat?
00:54:18.000In an autocracy, let's take China as an example, you don't have so much internal dissent.
00:54:26.000So you can actually do long-term planning.
00:54:29.000So one of the things that we see is we have decreasing ability to make shared sense of the world.
00:54:35.000In any kind of democratic society, if you can't make shared sense of the world, you can't act effectively on issues.
00:54:40.000But the tech – the types of tech that are decreasing our ability to make shared sense of the world are also increasing the speed at which tech is changing the world and the total consequentiality of it.
00:54:53.000That's one way to start to think about like this bowling alley example is We're having faster and faster, more and more profound consequential effects and less and less ability to make sense of it or do anything about it.
00:55:04.000So underneath the AI issue, the CRISPR issue, the US-China issue, the how do we regulate markets issue, the how do we fix the financial crisis issue is can we make sense of anything collectively, adequately to be able to make choices effectively in the environment we're in and that's underlying it.
00:55:24.000Tristan was laying out that you got these two gutters, right?
00:55:27.000You've got decentralized catastrophe weapons for everyone if we don't try to regulate the tech in some ways and that world breaks or to say if we don't want decentralized catastrophe weapons for everyone, maybe we do something like the China model but where you have ubiquitous surveillance and that's a dystopia of some kind.
00:55:44.000So either you centralize the power and you get dystopias or it's decentralized and you get catastrophes and right now – The future looks like one of those two attractor states, most likely.
00:55:56.000How do you have a world that has exponential tech that doesn't go catastrophic, where the control mechanisms to keep it from going catastrophic aren't dystopic?
00:56:04.000And by the way, we're not here saying like, go buy our thing or we've got a new platform.
00:56:08.000This is just about describing what is that center of that bowling alley that's not the gutters that we can skate down.
00:56:16.000The closest manifesting example of this so far, although we'll need one more construction after this, I think, is Taiwan.
00:56:24.000Because Taiwan, actually, I think I talked about it last time we were here, is a...
00:56:30.000They've got this digital minister, Audrey Tang, who has been saying, how do you take a democracy and then use technology to make a stronger democracy?
00:56:40.000So you can look right now at the landscape.
00:56:42.000We can notice that countries like China, autocratic countries, are consciously saying, how do we take all of this tech and make a stronger autocracy?
00:57:06.000By contrast, open societies, democracies, Western democracies, are not consciously saying, hey, how do we take all of this tech and make a stronger democracy?
00:57:17.000How do we have tech plus democracy equals stronger democracy?
00:57:20.000One of the other reasons I wanted to talk to you.
00:57:22.000So far, I think the tech reform conversation is like, how do we make social media like 20% less toxic and then call it a day?
00:57:29.000Or like take a mallet and break it up and then call it a day?
00:57:31.000That's not enough when you understand the full situation assessment that we're kind of laying out here of the skating down the middle of the bowling alley.
00:57:38.000The thing that we need that competes with that thing, because we can't just also allow, that thing is going to outperform.
00:57:43.000The China autocratic model is going to outcompete a, you know, democracy plus social media that, like, is 20% less toxic; that isn't going to outcompete that thing.
00:57:53.000Well, ultimately in the long run it's going to.
00:57:55.000But what's fascinating is they're willing to forego any sort of profits that they would have from these children from 10 p.m.
00:58:34.000We were talking with a sitting senator, or at some national security conference, talking to a foreign minister of a major EU country, who said: Who do you think the CCP, the Chinese Communist Party, considers to be the greatest rival to its power?
00:58:50.000You would say the United States, right?
00:59:18.000That would open up our military to foreign hacking.
00:59:20.000So they see correctly that technology is the new source of power of basically what guides societies.
00:59:27.000It is the pen that is writing human history.
00:59:29.000And it doesn't have, if you let just for-profit motives, again, coupled with like, how do I get as much attention out of people as possible in the race to the bottom of the brainstem to suck it out of people, that thing doesn't work with societies.
00:59:49.000But a post-cynical perspective is they're also appropriately recognizing that there's a certain threat that comes with allowing unregulated technology.
01:00:00.000One way to think about this, Tristan was just saying that they recognize the power of new technologies and the need to be able to employ them if they want to be effective.
01:00:10.000We can see how much the world responded, how much the US responded to the possibility of a nuclear bomb with the Manhattan Project, just even the possibility that the Germans would get it and how that would change everything asymmetrically.
01:00:22.000And so we make basically an indefinite black budget, find all the smartest scientists in the world because that much asymmetry of tech will determine who runs the world.
01:00:31.000It's important to also say there are some people who will have just a knee-jerk reaction that says, oh, you guys are just catastrophizing.
01:00:39.000Yeah, you guys are just trying to scare us, disaster porn.
01:00:54.000Like, did go extinct for different reasons.
01:00:57.000But World War II was the first time we had truly globally catastrophic tech, and we had to build an entire world system, mutually assured destruction, the Bretton Woods world, the IGO world, to basically not use that tech.
01:01:10.000Well, now, that was basically the first catastrophe weapon, and then we had only two superpowers that had it, so you could do mutually assured destruction, and it's really hard to...
01:01:36.000And the exponential technologies with kind of computation at the center, AI and these other ones we're talking about, are so much more powerful than all forms of legacy power that only the groups that are developing and deploying exponential tech will influence the future.
01:02:00.000Facebook is, Google is, like major corporations that are also top-down, non-democratic systems are, and they're becoming... like, Facebook has 3 billion people.
01:02:17.000So you either have corporations that are wielding the power of all this technology for mass behavior modification, surveillance of everyone, perfect sort of understanding of their psychological traits, and then moving them that scale.
01:02:28.000But in the big tech corporation model, they're doing it for a for-profit motive, whereas in the CCP model, they're doing it for...
01:02:38.000Neither of them has some kind of participatory governance, jurisprudence of, for, and by the people. And the open societies are not innovating on how do we develop and deploy exponential tech in an open-society way.
01:02:50.000That's fundamentally what we're saying has to be like the central imperative of the world right now.
01:03:07.000Well, the simple way is you don't, right?
01:03:14.000The simple way is you lock things down and become an autocratic – yeah.
01:03:18.000So you either beat China by becoming China or you figure out a third way.
01:03:22.000We'd like to see there be a third way.
01:03:24.000I'd like to see a third way too, but I don't see it.
01:03:46.000John Cena, there was an opening weekend for Fast and the Furious 9, I believe, and John Cena accidentally or inadvertently said that Taiwan is going to be the first country that sees the movie.
01:04:02.000Well, China doesn't recognize Taiwan as a country, and if you want to do business with China, you can't say that.
01:04:08.000That was on full display, and it made people very skeptical of the World Health Organization when one of their spokespeople was having a conversation with a journalist.
01:04:17.000When she brought up Taiwan's response, and other countries have done it like this, but Taiwan's response, he disconnected his line.
01:05:02.000So this is the perfect example of a kind of dystopia that we don't want to go to a future where people are all accommodating or can't feel or think their actual thoughts because they have to appeal to some source of power.
01:05:14.000Exactly, and the source of power is financial, because $160 million was the opening weekend for Fast and Furious 9, and $134 million of it came from China.
01:05:31.000Saying I had many, many interviews, and one of them...
01:07:05.000I don't know, but I was working out sometime.
01:07:08.000So this is a good example of: we don't want to live in dystopias where our thought and our ideas and our free expression and our ability to figure out what's true in an open-ended way are lost, because we don't know what's true.
01:07:22.000Remember last time I ended our conversation talking about Orwellian dystopias and Huxleyian dystopias, that quote about amusing ourselves to death?
01:07:29.000Orwell feared a world where we would ban books and censor information.
01:07:33.000Huxley feared a world where we'd be drowned in irrelevance and distraction.
01:07:38.000So that's kind of another version of the two gutters.
01:07:41.000Right now, we're kind of getting a little bit of both, right?
01:07:43.000We're getting a little bit of, hey, we don't like the way that the companies are doing this sort of censorship or platforming, deplatforming of people.
01:07:51.000We also don't want the unregulated, like, virality machines where the craziest stuff and the most controversial stuff that confirms our biases goes viral because both those things break society.
01:08:04.000Well, let me just offer one narrow solution for that one, because it's funny, because Frances in her own testimony says Facebook wants you to believe in false choices between free speech and censorship.
01:08:13.000There is a solution that Facebook themselves knows about for this particular problem.
01:08:18.000Which is actually just to remove the reshare button, basically the retweet button, the reshare button.
01:08:25.000What they found in their own research, Facebook spent something like a billion dollars or something, multibillion dollars, on integrity, content moderation, all that stuff.
01:08:34.000And they said in their own research it would be more effective than the billions of dollars they spent on content moderation to just remove the reshare button after people click it twice.
01:08:46.000In other words, you can hit reshare on a thing and it goes to all your friends.
01:08:49.000And then all those friends, they still see a reshare button and they can click reshare and it goes to all their friends.
01:08:56.000After that, there's no reshare button.
01:08:58.000If you just remove the instant frictionless, like, make my nervous system twitch, and then boom, I'm resharing it to everybody.
01:09:05.000If you just remove that one thing, you keep freedom of speech, but you kill irresponsible reach, that instant reach for everyone.
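That depth-limited reshare idea reduces to a counter on each post; the class and constant names below are hypothetical, a minimal sketch of the mechanism rather than anything Facebook shipped:

```python
# Minimal sketch of depth-limited resharing: a post carries a share-depth
# counter, and once it has been reshared twice, the one-click reshare
# button disappears. Anyone can still copy and paste a link manually.

MAX_RESHARE_DEPTH = 2

class SharedPost:
    def __init__(self, content: str, depth: int = 0):
        self.content = content
        self.depth = depth  # number of one-click hops from the original post

    def can_reshare(self) -> bool:
        return self.depth < MAX_RESHARE_DEPTH

    def reshare(self) -> "SharedPost":
        if not self.can_reshare():
            raise PermissionError("Reshare button removed; copy the link to share.")
        return SharedPost(self.content, self.depth + 1)

original = SharedPost("incendiary claim")
hop1 = original.reshare()   # reaches the author's friends
hop2 = hop1.reshare()       # reaches friends of friends
print(hop2.can_reshare())   # False: frictionless spread stops here
```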
01:09:13.000But you also kill the ability to retweet something or reshare something that's interesting.
01:09:18.000You could still copy and paste a thing and share it.
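To make the mechanism above concrete, here is a minimal Python sketch of the two-hop reshare limit. The Post class, the field names, and the depth cutoff are illustrative assumptions, not Facebook's actual implementation.

```python
# Sketch of the "two-hop reshare limit": after a post has been reshared
# twice, downstream viewers no longer get a one-click reshare button
# (they can still copy and paste a link). All names are hypothetical.

MAX_RESHARE_DEPTH = 2

class Post:
    def __init__(self, author, reshare_depth=0):
        self.author = author
        self.reshare_depth = reshare_depth  # 0 = original post

def show_reshare_button(post):
    """Only show the one-click button within two hops of the original."""
    return post.reshare_depth < MAX_RESHARE_DEPTH

def reshare(post, new_author):
    """Create a reshared copy one hop deeper; refuse past the limit."""
    if not show_reshare_button(post):
        raise ValueError("No reshare button here; copy the link instead.")
    return Post(new_author, post.reshare_depth + 1)

original = Post("alice")          # depth 0: friends see a reshare button
hop1 = reshare(original, "bob")   # depth 1: their friends still see one
hop2 = reshare(hop1, "carol")     # depth 2: button disappears for viewers
assert not show_reshare_button(hop2)
```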
01:09:36.000And they're like, what if we made it so in the podcast app we had a reshare button and you could see a feed of all the stuff that your friends were looking at?
01:09:52.000If something is truly that important and truly that meaningful, someone will copy and paste it as a link and be like, you got to check out this interview with Joe Rogan, which I hope people do with this episode, because it's crossing a threshold of significance.
01:10:07.000Of what is truly worth our undivided attention, which is also the name of our podcast.
01:10:11.000As opposed to just publicizing something and spending a bunch of money or doing a bunch of PR work.
01:10:16.000Or creating influencer culture or just rewarding, again, the controversy and the conspiracy theory.
01:10:21.000And again, I shouldn't use that phrase because it sounds like you're always one-sided on it.
01:10:24.000But just the most kind of aggressive take on anything, the most cynical take on everything being rewarded.
01:10:30.000Nowhere is it written that we have to have a virality-based information ecosystem where you have the...
01:10:35.000People are familiar now with the metaphor of like a lab that's doing gain-of-function research and the idea that something could get leaked out of the lab.
01:11:02.000Especially the ones that most excite your nervous system and your lizard brain go as viral as possible.
01:11:06.000And then you can even say that the Zuckerberg Institute of Virology released this memetic virus, and it shut down democracies globally, because now we don't have shared sense-making on anything.
01:11:17.000But we do if everyone's intelligent and objective and they just, you know, they don't use the reshare button for nonsense.
01:11:25.000The problem is that people are, you know, we're impulsive and we also don't spend a lot of time researching a lot of things that we read.
01:12:07.000If someone has kind of an emotional bias towards what they already generally think is true, even if they're following the scientific method well, what experiment they decide to do, what they go looking for will be influenced because there's a lot of things to look at, right?
01:12:22.000Am I trying to do science on natural supplements versus on drugs versus on vaccines versus like I'll have an intuition that is the basis of a hypothesis or a conjecture that then I'll do the scientific method on.
01:12:38.000So a point that Tristan was making earlier that I think is really important is that this model of Facebook, and it's not just Facebook...
01:12:55.000It's a second-order effect, an unintended consequence.
01:12:58.000But in the way that, like, cigarettes had an externality, which was lung cancer, and oil companies had oil spills, and so then government had to regulate it.
01:13:09.000These companies are fundamentally different because their externality is a polarized population which in a democracy decreases the capacity of government directly.
01:13:19.000So the big oil companies and big pharma companies and whatever can do lobbying and campaign budget support and whatever they can to affect government.
01:14:23.000TikTok is just looking at engagement and upregulating the things that get most engagement.
01:14:27.000This is actually a key point, because let's say that we tried to make a piece of legislation based on thinking it was about shares.
01:14:32.000Then Facebook would just move to the TikTok algorithm.
01:14:35.000Just by what you look at the most, that thing gets re-shared to other people.
01:14:38.000Making things viral based on unconscious signals as opposed to explicit click signals.
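A toy sketch of the distinction being drawn here: a feed ranked on explicit signals (shares, likes) versus a TikTok-style feed ranked on implicit signals (watch time, rewatches). The field names, weights, and numbers are invented for illustration; neither formula is any platform's real scoring function.

```python
# Explicit-signal ranking vs implicit-signal ranking, with made-up weights.

def explicit_score(item):
    return item["shares"] * 3 + item["likes"]

def implicit_score(item):
    # No click required: dwell time and replays alone drive distribution.
    return item["watch_seconds"] + 10 * item["rewatches"]

feed = [
    {"id": "calm-explainer", "shares": 40, "likes": 200, "watch_seconds": 30, "rewatches": 0},
    {"id": "outrage-clip",   "shares": 5,  "likes": 50,  "watch_seconds": 55, "rewatches": 3},
]

# The same content ranks differently under the two regimes, which is why
# legislation written around "shares" alone wouldn't touch a
# watch-time-driven algorithm.
print(sorted(feed, key=explicit_score, reverse=True)[0]["id"])  # calm-explainer
print(sorted(feed, key=implicit_score, reverse=True)[0]["id"])  # outrage-clip
```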
01:14:42.000So there's a deeper point here, which is not, is there a piece of regulation that we can put in, even if we trust the government to do that?
01:14:48.000Let's even say we had a very trustworthy government.
01:14:51.000It's, can the government regulate at the speed that tech can outmaneuver it?
01:14:58.000If you didn't have any algorithms whatsoever, wouldn't you be now open to being manipulated by troll farms just simply by volume?
01:15:07.000You know, if they have 100,000 accounts at each individual location, they have 100,000 locations, and they're just pumping out different Instagram pages and TikTok.
01:15:17.000And we don't really even know how many they actually have, because we only know about the ones they discover.
01:15:20.000Facebook shuts down 2 billion fake accounts per quarter.
01:16:07.000They do these quarterly reports, I think, called information quality reports.
01:16:12.000They publish every quarter how many accounts they take down.
01:16:15.000But it's just like a PDF. They're just putting out a post as opposed to letting external researchers know, for example, in each country, what are the top 10 stories that go viral?
01:16:24.000How do they find out that someone's a troll post?
01:16:31.000Usually, if you've ever had this happen, you use Facebook and you click around a bunch, and then it says, like, you look like you're clicking around too much.
01:16:36.000Have you ever gotten one of those messages?
01:17:23.000But Twitter, you know, you can have jackmeoff69 and that's your Twitter handle with some weird Gmail account and then just post nonsense, you know?
01:17:32.000And so there's no ability for justice in that system and accountability; there's also no ability for the user reading somebody else's thing to know who they are, right?
01:17:42.000For the veterans group or whichever group to know, is this a Russian troll farm that is pretending to be a Christian or whatever?
01:17:48.000And especially, is this even a human or is this an AI in the very near future?
01:18:06.000Because if there's rigorous identity, who has access to that information?
01:18:21.000There's a whole decentralized community.
01:18:24.000There's a great movement called RadicalxChange, run by Glen Weyl, and they're trying to create part of this third attractor, this, like, what's the center of the bowling alley, a digital version of democracy.
01:18:32.000What are decentralized ways of proof of personhood?
01:18:36.000There's a project called IDENA. There's a bunch of things people can look up via RadicalxChange's work.
01:18:40.000It's part of a whole movement that Taiwan is included in, which is, I don't know if we really got to the Taiwan example, but... We didn't, I don't think.
01:18:55.000It's a direction of how we can go, which is, you know, you can only fit so many people into a town hall to deliberate, right?
01:19:02.000So there's sort of a limit; our idea of democracy is kind of guided by ideas from 200 years ago.
01:19:07.000They've created a system called POLIS, which is a way of gathering opinions about various ideas and then sort of seeking who wants more funding for various things, less funding for various things.
01:19:17.000And whenever there's an unlikely agreement, so they sample a bunch of people, say, you sit over here, you sit over here, they get these clusters, like, these people kind of like this, these people kind of like these other things.
01:19:26.000Whenever there's an unlikely agreement between those clusters, in other words, consensus, rough consensus, that's what they sort of boost to the top of that system.
01:19:34.000So everyone's seeing areas of common ground, common ground, common ground, as opposed to fault line of society, fault line of society, be more angry, join the common thread, etc.
01:19:42.000And then you're invited into a civic design process where you actually say, hey, I don't like the tax system.
01:19:48.000And they're like, great, we're going to invite 30 of the people who were part of that rough consensus.
01:19:53.000We're like, let's improve the tax system.
01:19:54.000Let's talk about how we're going to do it.
01:19:56.000And they do a combination of in-person stuff.
01:19:57.000This is a little bit before COVID. And Zoom calls.
01:20:00.000And then do like these mechanisms to kind of get an idea of where do people agree and then how do we make it better?
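A minimal sketch of the unlikely-agreement mechanism described above, assuming opinion clusters have already been computed. The real POLIS system derives clusters from vote matrices with dimensionality reduction and k-means over thousands of voters; the statements, cluster names, and numbers here are made up.

```python
# Surface statements that win support across *different* clusters
# ("unlikely agreement") instead of statements that rally one side.

votes = {
    # statement_id -> {cluster_name: fraction of that cluster agreeing}
    "raise-platform-transparency": {"cluster_a": 0.81, "cluster_b": 0.77},
    "ban-the-other-party":         {"cluster_a": 0.92, "cluster_b": 0.04},
    "limit-teen-night-usage":      {"cluster_a": 0.70, "cluster_b": 0.66},
}

def consensus_score(by_cluster):
    # Reward statements whose *weakest* cluster support is still high,
    # so a statement loved by one tribe and hated by the other scores
    # near zero instead of averaging out to fifty percent.
    return min(by_cluster.values())

ranked = sorted(votes, key=lambda s: consensus_score(votes[s]), reverse=True)
print(ranked)  # transparency and teen-usage bubble up; the divisive one sinks
```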
01:20:44.000And if you identify places where it could get more efficient, you could get money or resources by making the whole system work better for everyone.
01:20:52.000If you ran a current audit of the U.S. government through blockchain, you'd have a goddamn revolt.
01:20:57.000They would go, holy shit, this whole thing is corrupt.
01:21:06.000And I think the Nancy Pelosi's of the world would really have a hard time with that.
01:21:11.000I heard some clip that you did where you were talking about your pot thoughts of people being in big buildings and the pipes everywhere.
01:21:19.000And just like how weird some aspects of civilization are.
01:21:23.000So think about how weird democracy is.
01:21:25.000Like as an idea, the idea that you can have some huge number of people who don't know each other, who all believe different stuff and want different stuff, figure out how to actually agree and work together as opposed to just tribalize against each other and do their own thing.
01:21:45.000And we've always had a scale issue, right?
01:21:47.000In 1776, you could all go into the town hall and fit.
01:21:51.000And so you weren't just hearing a proposition that a special interest group had created, where you get to vote yes or no, which will inherently polarize the population, right?
01:21:59.000Very few propositions get 99% of the vote.
01:22:02.000They get 51% of the vote because they benefit something and they harm some other things.
01:22:06.000And the people who care about what gets harmed are fighting against it.
01:22:08.000That polarizes the population against each other.
01:22:11.000Social media does the same thing: this is a conversation about censorship or free speech.
01:22:16.000And boom, you just split the population in half.
01:22:18.000As opposed to, hey, we all agree we could do a little bit less virality.
01:22:21.000We could curb teenage use in these ways, and it would be better for everyone.
01:22:24.000The proposition creation isn't designed to polarize the population.
01:22:28.000Because as soon as you get beyond the scale of we can all actually inform what a good proposition would be by being in the town hall and having a conversation.
01:22:53.000Also, if you had a representative, the level of tech at the time was something that a very well-educated person could understand most of.
01:22:59.000They could understand a lot of the tech landscape.
01:23:02.000Obviously, we're in a situation now where the scale issue of democracy has been completely broken.
01:23:08.000So almost nobody... we're supposed to have a government of, for, and by the people, but nobody really understands the energy grid issues or first-strike nuclear policy or monetary policy or anything like that.
01:23:18.000And everyone's voice can't be heard, right?
01:23:21.000Now, what Taiwan was working on is: the tech that, particularly in the West, is breaking democracy, could that same tech be employed differently to actually make 21st-century democracy more real?
01:23:34.000So the same AI that can mess up the information landscape for everyone, could we use that type of AI that understands language to be able to see what does everyone think and feel and actually be able to parse that into something we can understand?
01:23:45.000So there's an online environment that says here's the distribution of people's values.
01:23:48.000Here's the various values people care about.
01:23:53.000And then is there a place where we can actually craft propositions together?
01:23:58.000So there's a way to utilize these same tools to make democracy more realized, to make collective intelligence more realized.
01:24:08.000But right now, as we were saying, autocracies are working on employing these tools.
01:24:13.000Corporations are working on it, both of which are top-down.
01:24:22.000Who would be incentivized to use that other than the people?
01:24:26.000And it's pretty clear that the people don't have control over Facebook, don't have control over Twitter, certainly don't have real control over the government.
01:24:33.000You have control over elected officials who, it's almost universally agreed, will lie to you to get into office and then not do what they said they were going to do, which is the standard operational procedure.
01:24:44.000So what's the incentive and how would these get implemented?
01:24:48.000So again at a small scale, 1776, your representative couldn't lie all that much because everybody – they lived in the same town, right?
01:24:56.000And you could all go see what was going on.
01:24:58.000And so can we recreate things like as you were mentioning, people would freak out if they could actually see how government spending worked.
01:25:11.000Could we have places where there's direct democracy and people can actually engage in the formation of what a good proposition is, not just voting yes or no on a proposition? But can I stop you there?
01:25:25.000How would you have that transparency and who would be incentivized to allow this transparency?
01:25:31.000If the transparency has not existed up until now, why would they ever allow some sort of blockchain-type deep understanding of where everything's going?
01:25:43.000I don't think they are incentivized, which is actually why this show is interesting.
01:25:47.000Because if we're really talking about a government of, for, and by the people, where the consent of the governed is where the power of government comes from, ultimately... And the founding fathers said a lot of things about how the government will decay at a certain point,
01:26:03.000particularly when people stop taking the responsibility to actively engage.
01:26:07.000And so if tech has incentives to produce things that are catastrophically problematic for the world, we need to regulate that somehow.
01:26:19.000And the issues are too complex for individuals to understand.
01:26:22.000But how do you make sure the institutions are trustworthy?
01:26:26.000We have to create new 21st century institutions, but they have to arise of, for, and by the people, which means there's a cultural enlightenment that has to happen, right?
01:26:35.000People actually taking responsibility to say: we want institutions we can trust, and we want to engage in processes of recreating those.
01:26:44.000How do you get people to be enthusiastic about some sort of a radical change like that other than some sort of catastrophic event like a 9-11?
01:26:53.000Well, this is why we're talking about all the impending catastrophic events, is to say we don't want to wait until after they happen.
01:26:59.000It would be, but it seems like that's the only way people really change the way they think about things.
01:27:05.000It's something, some almost like cultural near-death experience has to take place.
01:27:09.000Well, it's like the problem of humanity is paleolithic emotions, medieval institutions, and godlike tech.
01:27:15.000One of the paleolithic emotions is it can't be real until, oh shit, it actually happened.
01:27:19.000And so, like, but the test is, we are the one species who has the capacity to know this about ourselves, to know our Paleolithic emotions are limited in that way, and say, we're going to take the action, the leap of faith that we know we need to do.
01:27:33.000Like, we're the only species that can do that.
01:27:34.000If a lion was in this situation, or a gazelle, they can't, like, understand their own mind and realize they have the one marshmallow mind or the, you know, short-term bias or recency bias.
01:27:43.000They're trapped inside of their meat suit.
01:27:45.000This is a beautiful idealistic notion.
01:27:48.000However, in real-world application, most people are just fucking lazy and they're not going to look into this and they're not going to follow through.
01:27:57.000And this is why most people that really study tech-mediated catastrophic risk are not very optimistic.
01:28:04.000And they think things like we have to chip human brains to be able to interface with the inevitable AIs or we have to have an AI overlord that runs everything because we're too irrational and nasty.
01:28:14.000And the question is like there have always – there's always been a distribution of how rational people are and how kind of benevolent they are.
01:28:24.000And – We have never with that distribution been very good stewards of our power.
01:28:30.000We've always used our power for war and for environmental destruction and for kind of class subjugation.
01:28:35.000But with the amount of power we have now, those issues become truly globally catastrophic.
01:28:39.000And this is the thing, and this is what almost every ancient prophecy kind of speaks to.
01:28:45.000As you get so much power that you can't keep being bad stewards of it, either the experiment self-terminates or you are forced to step up into being adequate stewards of it.
01:28:56.000So the question is what would the wisdom to steward the power of exponential tech – what would the minimum required level be?
01:29:04.000And that's like the experiment right now.
01:29:18.000So we can look at some cultures that have certain traits quite different than other cultures as a result of the conditioning of those cultures more than as a result of the genetics.
01:29:28.000We can see that if you look at Jains… What are the Jainists?
01:29:32.000The Jains are a religion that highly emphasizes nonviolence, even more than the Buddhists.
01:30:05.000You can see that Jews have historically had a level of education that is higher than the embedding society around them, as a result of investing in that.
01:30:16.000And so we're like, can cultures value certain things and invest in developing those in people?
01:30:23.000It doesn't mean that everyone suddenly has the wisdom of gods to match the power of gods, but can we create a gradient?
01:30:29.000This is where there used to be this concept of building.
01:31:10.000Well, and with social media, they do see it.
01:31:12.000I think there's a unique moment, and the reason I thought this would be an interesting conversation with the three of us, is that social media has become the test case.
01:31:20.000We can now, I mean, it took, unfortunately for some people, seeing the receipts, which is what Frances has provided, for things that we all predicted back, you know, eight years ago. But now people understand that that is a consequence of unregulated exponential technologies that are steering people at scale,
01:31:36.000making things go viral at scale and dangerous at scale.
01:31:39.000So that's a case where we can now see that thing.
01:31:41.000Can we leverage the understanding of that to realize what bigger thing needs to happen?
01:31:46.000Before we get to the incentive, just imagine as a thought experiment for a minute that...
01:31:51.000Facebook changed what it was optimizing for because Facebook is this 3 billion person AI optimized behavior machine, right?
01:31:59.000Like that's a huge – it's not like normal companies and it's important to understand that.
01:32:03.000And it's optimizing for engagement which usually ends up looking like outrage, desire, addiction, all those types of things.
01:32:09.000But let's say that we changed that. Could we assess for whether people are being exposed to different ideas than the ones they're used to?
01:32:17.000Are they actually uptaking those ideas?
01:32:19.000Are people expressing ideas that have more nuance and complexity?
01:32:22.000And you were actually upregulating for those things.
01:32:26.000There's a lot of actually quite constructive content on the internet.
01:32:30.000And imagine that you could actually personalize development and education.
01:32:34.000This is what you started to say when Tristan was describing what China is doing, where the kids are seeing museums and science experiments and patriotism.
01:32:40.000You're like, yeah, that actually kind of makes sense.
01:32:43.000It makes sense but it only makes sense when you have an autocratic government that has complete control of the corporations and their motivations.
01:32:50.000Like, if the corporation's motivations were specifically designed to rake in the most amount of profit, like Facebook's is, you'd never be able to trick them into doing that.
01:33:11.000And we can see how the government took major corporations that had such an... Yes.
01:33:34.000That they can't be harming our citizens and harming our democracy.
01:33:38.000We actually have to put some regulation not on who gets to speak but what gets radically upregulated.
01:33:44.000But the problem is the way they would do it is the same way they do like the Build Back Better bill where it's 40,000 pages and no one can read the whole thing and inside of it there's a bunch of shit about how they can spy on your bank account.
01:33:56.000And lock you down if you spend more than $600 and you have to go to a committee to decide whether or not you get your money back and make everybody scared and paranoid.
01:34:05.000I mean, this is the kind of behavior that our government engages in on a regular basis.
01:34:09.000This is not just a big ask for us to get people to be motivated to make this radical change, but it's a big ask to the government.
01:34:16.000It's like, hey, you fucks have to start being honest now.
01:35:21.000Ask that of the government and ask that of corporations.
01:35:25.000At least human beings have a personal understanding of the consequences of what they're doing, and they don't have this diffusion of responsibility that both government and corporations have.
01:35:36.000The thing about the diffusion of responsibility is one person in a corporation doesn't feel evil when the corporation dumps pollutants into some South American river.
01:35:46.000But that is happening and it is a part of it.
01:35:50.000But when an individual takes responsibility for their own actions and if we can somehow or another Coach or explain or educate individuals about their consumption and what kind of impact their consumption has on their psychology,
01:36:09.000on their future, on the way they view and interface with the world.
01:36:14.000The reason why these algorithms are effective is because they play to a part of human nature that we don't necessarily have control over.
01:36:37.000The problem with the algorithm, except for what you were talking about before with the QAnon thing, that's fucked.
01:36:43.000The problem with the algorithm on YouTube is it accentuates the things that people are actually interested in.
01:36:49.000But when Facebook, those fucks, when they do something like that where someone just invites people into a group and you can mass invite, I'm assuming through some sort of a program, right?
01:36:59.000They're not doing it individually, one by one.
01:37:02.000So if some QAnon group mass invites a million people and then it's all of a sudden distributing disinformation to that million people, then you've got a problem with the company.
01:37:11.000There's a problem with the way that company distributes information because you're not allowing people to make the decisions that they could make to clean up their algorithm, to clean up what they get influenced by, to clean up what their newsfeed looks like.
01:37:32.000You're allowing someone to radicalize, like intentionally radicalize, people with disinformation, whether they're spreading it willingly or unbeknownst to them.
01:37:44.000Yeah, and we don't want the Nestle Coca-Cola vending machine in the preschools because do the kids actually have the ability to win the two marshmallow experiment in the presence of that much advertising?
01:38:16.000His partner, Itzik, makes two clicks on TikTok and he knows exactly what he wants, right?
01:38:21.000When you have that degree of asymmetry and it's designed with that much power on one side of the table, I mean, a system is inhumane if the power is that asymmetric, right?
01:38:29.000So if it's influencing me more than I understand my own brain, like a magician, we're not going to be able to get out of that.
01:38:36.000Because if it's playing to my confirmation bias and I don't know that I have confirmation bias, I'm just run by confirmation bias, that's a form in which I'm essentially a foot soldier in someone else's culture war.
01:38:46.000If it's playing to my social validation, I don't know that it's playing to my social validation, I don't even know I have a thing called social validation, that that's an exploitable part of me.
01:38:55.000So, I mean, you're right, by the way, as a part of an ecosystem of solutions, we do need a cultural, I mean, Daniel calls it a cultural enlightenment, but you can just simply say we need mass education about how technology influences us that matches the scale of that influence.
01:39:09.000Everyone who uses social media deserves to know how it works.
01:39:11.000And in the Carnegie Endowment, they did a sort of meta-analysis for the problem of misinformation.
01:39:16.000If you look at, like, 100 organizations surveying how do we deal with this problem, like, I think, like, 98% of them said, like, the number two result was at least do digital literacy education for everyone, right?
01:39:27.000Everyone should understand more about how this stuff works.
01:39:29.000So, to your point about we should be educating everyone to be better stewards of their own attention, their own information consumption.
01:39:36.000But when you have bad actors that are manipulating this stuff at scales and at levels that people don't understand, and we're about to have GPT-3 printing basically full research papers that justify everything you've ever believed, that's not going to be an adequate solution.
01:39:52.000So we have to change something at the systemic level.
01:39:55.000The question is just how do we get that to happen?
01:40:05.000Like, we have these ideas of what needs to happen, but this is like, what we need to do is get everybody to stop eating processed food and exercise regularly and only drink water.
01:40:17.000Well, you can get like 10 people to do that.
01:40:20.000You can get highly motivated people to do that.
01:40:23.000You can get really intelligent, really conscientious people that are considering the impact that their choices have on their future.
01:40:56.000That might be the only comfort that they have, is arguing about global warming with people online.
01:41:02.000Or, you know, arguing about Second Amendment rights.
01:41:06.000Like, we've got to take into consideration the motivation for people to engage in these acts in the first place.
01:41:13.000And a lot of it is just they're very, very unhappy with their existence.
01:41:17.000So that's why they get trapped doing this.
01:41:19.000When you talk to someone who is like, hey, I realize that I've got to get off of social media, I don't do anything anymore, I wake up in the morning, I have fresh squeezed vegetable juice, and then I go on a nice long hike, and those are rare fucking humans.
01:41:33.000But the idea that we're going to change the course of the vast majority of our civilization and have most people behave in that manner is very unrealistic.
01:41:44.000So this is why... now add increasing technological automation and radical technological unemployment to that.
01:41:52.000Meaning automating more of our jobs, etc.
01:41:56.000Does that make people less happy or more happy?
01:41:58.000Well, for the most part, it makes a radically unemployable underclass, a huge radically unemployable underclass, where at least in feudalism, you still needed the people to do labor.
01:42:07.000Now you won't need the people to do labor.
01:42:10.000So then this is why there are a number of people in the upper wealth class who believe in universal basic income because it's at least the cheapest way to deal with those people.
01:42:18.000Now you add the metaverse to that and this is the entry into the matrix world, right?
01:42:23.000Now this is where we have to get to because that's what I'm really worried about.
01:42:28.000What I'm really worried about is the solution will be to disengage with the actual material world, and the solution would be to find yourself almost completely immersed, and do whatever you can to just feed yourself and pay for the metaverse.
01:44:35.000They're going to lure you into this and you're not going to give a shit about your regular life anymore because it's going to be so much more exciting.
01:44:40.000And next thing you know, they're going to say, listen, it'd be much more involving if you put a gasket in the back of your head and they could just connect you straight to a pipe.
01:44:56.000The competition for attention and the attention economy was always about the metaverse.
01:45:01.000We've been living in the metaverse for a long time because it's about how do you capture and own and control people's personal reality.
01:45:08.000That's what Instagram, Facebook, TikTok, YouTube, the whole thing, that's what these things are.
01:45:13.000One of the things that this makes me think about, that's subtle actually, I know you've had Jonathan Haidt on the show talking about teenage mental health problems.
01:45:20.000When you look at when self-harm and depression starts ticking up in the graph for the kids, 13 to 17-year-olds, there's a subtle point in a specific year period where that ticks up.
01:45:51.000I would say that it's when you virtualize people's experience so fully and that virtual world doesn't care about protecting and nurturing the real world.
01:46:00.000When you virtualize people's relationships in a way that makes them not care about nurturing their offline relationships.
01:46:23.000That's very similar to a virtual reality that's not protecting the social fabric that it depends on.
01:46:33.000It depends on that thing for it being higher, if you want to say anything to that.
01:46:38.000So why did Facebook buy Instagram and WhatsApp and do the various things they did? Because a monopoly of attention is the play.
01:46:47.000And a monopoly of attention is a really big deal to be able to get.
01:46:50.000But as soon as new devices come out, you're going to get attention in different ways.
01:46:53.000So AR and VR as new platforms, obviously, you've got to lead the way in having the monopoly of attention there and increasingly hypernormal stimuli.
01:47:02.000And the cell phone took us up to something like 50% screen time from say 25% screen time on the laptop.
01:47:08.000The AR can take us up to like approaching 100% screen time or engagement time.
01:47:14.000And then persistent tracking of reality across all those domains.
01:47:17.000So we can see why this is super problematic and pernicious.
01:47:27.000That was just speaking to how the metaverse is a natural extension of what they've already been doing.
01:47:49.000If I have virtual relationships online, but they're actually debasing the integrity of my...
01:47:56.000in-person relationships, so when we're talking, we're actually looking at our phones.
01:48:00.000We would say from the Center for Humane Technology kind of perspective, what is humane tech?
01:48:04.000One of the definitions would have to be what he was mentioning earlier, that tech plus democracy makes a better democracy.
01:48:11.000Similarly, if you want to think about what does humane tech mean, tech plus any of the foundations of what it means to live a meaningful human life and a sustainable civilization, tech has to make those things better.
01:48:21.000So tech plus families has to actually increase the integrity of families.
01:48:24.000Otherwise, it's fundamentally not humane.
01:48:26.000It's misaligned with what is foundational to being human.
01:48:29.000Tech plus democracy has to make better democracy.
01:48:32.000Tech plus individual human mental well-being.
01:48:41.000But there are ways of actually, first of all, aligning and choosing your business models to be in alignment with that thing.
01:48:48.000So, I mean, not to give Apple too much praise, but when it says, hey, you know, we're going to...
01:48:56.000And they're choosing to go into health because they could just say, hey, we're going to build our own maximize engagement machine metaverse thing.
01:49:02.000I'm sure they're working on one, but they're choosing business models.
01:49:05.000Their business model isn't maximizing attention.
01:49:07.000That's why when you use FaceTime, it doesn't have like, here's comments, here's notifications, here's the hearts and likes and thumbs up floating across the screen as you're using FaceTime because you're the customer, not the product.
01:49:17.000Well, Apple's a fantastic example of what is possible when a company does have the most superior product in the market, right?
01:49:28.000It's kind of widely acknowledged that when it comes to the phones, when it comes to the operating system that exists in the phones, and when it comes to the operating system that exists on the computers, and then the fact that Apple controls all of the hardware.
01:49:41.000So the problem that Windows has is you've got Lenovo and Dell and Razer, and there's all these different companies making hardware.
01:49:49.000They're all making different hardware, and then you have the operating system that engages with that hardware, but there's all these different drivers, you got different video cards, you have different...
01:49:57.000There's so many different things that it's very difficult for them to make this one perfect experience.
01:50:05.000What we're going to do is we're going to control all the hardware.
01:50:08.000So they make the best laptop they can possibly make, they make the best phone they could possibly make, and they've done such a good job with it that they have this massive, loyal fan base.
01:50:18.000And then through that, they decided, you know what?
01:50:21.000We're going to give you the option to not have advertisers track you and apps track you everywhere.
01:51:42.000They could simply do that and say, we're going to have something that's available that we don't have any kind of control over what your feed is.
01:51:49.000And we were just talking about this last night.
01:51:54.000One of the things we talk about is that there's always a cynical take when someone takes an action, and there's an optimistic good faith take.
01:52:00.000The cynical take on Frances is that she's a whistleblower who's a secret operative, da-da-da-da-da.
01:52:04.000The cynical take on the government wanting to regulate social media is it's just because they want to take control, or if the media is ever criticizing social media, it's just because the media is upset that they don't have a monopoly on truth anymore.
01:52:14.000There's a partial truth in each of these things.
01:52:16.000If Apple takes a strong move against these social media companies, like the privacy thing that you just mentioned, they're now protecting people's privacy.
01:52:32.000So the cynical person says, oh, they're just doing that so they can get more money for them.
01:52:37.000How do they make an extra billion per year?
01:52:39.000Because somehow the extra advertising goes through their network or something like that because you're not using cross-app tracking through the other companies.
01:52:45.000Somehow people start spending more money on their system.
01:52:51.000If they wanted to prove that they're actually a good faith actor, this is your idea last night, they could take the billion dollars or even just a large chunk of it that's not legal fees and say, we're going to spend that billion dollars on solving the rest of the social dilemma.
01:53:04.000We're going to fund nonprofits that are doing digital literacy education.
01:53:07.000We're going to put $500 million into nonprofits that are doing digital literacy education and other sort of humane tech.
01:53:14.000We're also going to put another $500 million into R&D to do more features that help address the social dilemma and actually move our whole product ecosystem further in the direction of protecting society and not manipulating society.
01:53:28.000If a company that's as massive as Apple, that has so much capital, I mean, they are literally one of the most wealthy corporations that's ever existed.
01:54:11.000Biden could invite Audrey Tang to come to the United States and actually say, we're going to build a civic tech ecosystem.
01:54:16.000The decentralized web community that's building these Ethereum-based, like, new Web3 things could actually say, we're going to take this as the central design imperative.
01:54:23.000We're going to do digital democracy that helps us do the bowling alley and get that thin tightrope that we've got to walk.
01:54:29.000These are the kinds of things that could happen, but we would need there to be a public zeitgeist that this has to happen.
01:54:35.000And I know it sounds utopian, but it's dystopian if we don't do that.
01:54:39.000It's not an easy problem, but one thing we can show is that people are happier and more successful if they follow this path than the path of wanton destruction.
01:54:48.000Because we know that about alcoholics and gambling addicts, right?
01:54:52.000If you have an uncle that is an alcoholic and you see him, you're like, wow, I don't want to be like that guy.
01:55:26.000I've told that to friends, and occasionally they dip their toes back in the water, and then they go, fuck, why did I do that?
01:55:32.000And they'll do an episode, and maybe they don't like something that they said, and then they go read the comments, and I'm like, my god, man, get out of there.
01:55:39.000And I don't engage on Twitter, I don't engage in the comments of Instagram, or I don't even look at Facebook.
01:55:46.000And because of that, what I take in is my choice.
01:55:50.000Like, I look at things that I'm interested in.
01:55:53.000And most of my social media, it's not really social media consumption, but most of it is YouTube.
01:55:59.000And most of it is like educational stuff or complete distractions.
01:56:44.000People with right versus left views interacted less at dinner.
01:56:46.000And we're about to head into Thanksgiving.
01:56:48.000And I actually would say that Facebook and Twitter, their business model has been ruining Thanksgiving dinner because their business model is personalizing confirmation bias for everyone so that when you show up...
01:56:59.000So in the same way that that's an epitome of the problem, that's your personal version of the social dilemma, we could also say, what would be the first step for each person listening to this that we can do during Thanksgiving dinner that's putting our phones at the door and actually trying to have a conversation about the mind warp that's taking place?
01:57:16.000It's hard because when people get together and they haven't seen each other for a while, they want to argue about things that they feel the other person's wrong about.
01:57:24.000Because they've got so much of their time invested in these echo chambers.
01:57:30.000But you just mentioned something that was so interesting, which was if people started to understand that the echo chamber was affecting them and affecting the integrity of their family.
01:58:22.000Now, when we think about the social media issue to a degree, we can take the solution that you propose and just say maybe the individual can just remove themselves from it.
01:58:30.000We would argue that this is actually...
01:58:32.000It's impossible population-wide currently because there are companies that just can't succeed if they don't market on there compared to their competitors.
01:58:40.000I'm not saying remove yourself from it.
01:59:18.000One thing I wanted to share, we interviewed Dan Vallone from an organization called More In Common on our podcast, and he does this work on what he calls the perception gap.
01:59:27.000What they found in their work is the more time someone spends on social media, the more likely they are to actually misperceive what the other tribes believe.
01:59:37.000So first of all, we get hit by a double whammy because you're talking about participation on social media.
01:59:42.000And you could sit there looking at stuff but not participating.
01:59:45.000Well, it turns out the people who participate the most, the extreme voices, participate more often than the moderate voices.
01:59:52.000And when they participate, they share more extreme views, so their stuff goes more viral.
01:59:57.000So we're looking at this weird funhouse mirror when we think, like, oh, we're getting a sample of what everybody believes.
02:00:01.000We're not getting a sample of what everybody believes.
02:00:03.000We're getting a sample of what the most extreme people believe.
02:00:06.000So if you actually ask, in their research, like: Democrats, what percentage of Republicans would you estimate believe racism is still a problem in the U.S.? I think they estimate like 40% or something like that, and the answer is closer to like 65% or 70%.
02:00:22.000So we are misperceiving because we're seeing through the stereotypes and the straw men and the bad faith examples of everyone.
02:00:29.000So part of this mind warp is we have to actually, again, understand that we're seeing a very specific view.
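A small simulation of that funhouse mirror, under invented parameters: if extreme voices post more often and get boosted more by engagement ranking, the feed's apparent opinion mix drifts far from the population's actual mix. The participation and boost numbers are illustrative assumptions, not measured values.

```python
# Simulate how a feed oversamples extreme voices.

population = ["moderate"] * 900 + ["extreme"] * 100  # actual mix: 10% extreme

def posts_made(person):
    return 10 if person == "extreme" else 1   # extremes participate more

def boost(person):
    return 5 if person == "extreme" else 1    # and their posts get ranked up more

feed = []
for person in population:
    feed += [person] * (posts_made(person) * boost(person))

actual = population.count("extreme") / len(population)
perceived = feed.count("extreme") / len(feed)
print(f"actual extremes in population: {actual:.0%}")   # 10%
print(f"extremes as seen in the feed:  {perceived:.0%}") # ~85%
```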
02:00:39.000If, like, 50% of people on Facebook stopped participating per what you just said earlier, the problem is that the small remaining group, the most extreme voices there, they would be identified by the algorithm and they would just maximally upregulate them.
02:00:51.000So we just have to realize what game we're in, what unfair fight we're in, so that we can unplug ourselves from the matrix.
02:00:57.000You called me Morpheus last time I think I was here.
02:01:00.000Well, there's also a problem with tribal identity.
02:01:03.000And it's fucking silly that we only have two groups in this country.
02:01:08.000And because of the fact that we really have broken it down to two political groups, we are so polarized.
02:01:16.000We don't have this broad spectrum of choices that we can, you know, well, I like a little bit of this, I like a little bit of that, and to be in the center is to be a fence-sitter and to inspire the ire of both sides.
02:01:35.000But most people don't think that because when they look on social media, they just see people at the extremes and they're like, am I going crazy?
02:01:43.000Your mammalian instincts that things are upside down, that's not wrong.
02:01:47.000But it's not because of some master global conspiracy; it's because social media is just showing us the craziest of the craziest voices on all sides.
02:01:52.000I just realized how much you look like Terrence McKenna.
02:02:27.000The reason I'm bringing it up is because an individualistic-only answer doesn't work when other individuals and small and large groups have the capacity to affect everything so significantly.
02:02:53.000And so when you think about the founders of this country, they didn't just remove themselves from believing whatever the dominant thought of the British Empire was at the time.
02:03:01.000They removed themselves from that and then said, we actually need to build a more perfect union and they invested themselves radically to do so.
02:03:07.000And it wasn't a huge percentage of the population but it was working to build something that could apply to a much larger percentage of the population.
02:03:14.000So we need some sort of a radical solution in terms of the way we interface with each other, the way we do business, the way we govern, the way we do everything.
02:03:25.000And so let's say you have people who start pulling themselves off social media and saying, I actually want to engage with other people where I really seek to understand their ideas.
02:03:32.000Before I just jump and criticize, I want to make sure I get their values and what it's like to be them.
02:03:36.000And so they first, they remove themselves from the toxicity.
02:03:40.000Second, they work to actually start making sense of the world better and being in better relationship with each other.
02:03:44.000Next, they say, I want to make a platform that facilitates this for other people.
02:03:49.000And then I want to come on Joe's podcast and talk about the platform and get a lot of people on there so we start to actually get the beginning of a new attractor, a new possibility.
02:04:13.000I wanted to tell you, in the Taiwan example, the way that that happened is actually a bunch of activists stormed the parliament, except they didn't try to break the glass and the windows and break through everything.
02:04:24.000They brought in all these Ethernet cables and they set up a Wi-Fi network and they had a bunch of hackers build this alternative civic engagement platform where people could debate ideas right there using technology.
02:04:35.000So they did storm the parliament, but they didn't storm it to hurt people.
02:04:41.000They did it to create the better form of government.
02:04:43.000But to debate ideas in a system where, when unlikely consensus is found, that's what gets upregulated.
02:04:49.000So they were designing it so that the better angels of our nature are appealed to rather than the lower angels of our nature.
02:05:25.000I don't exactly know what the reality is, but what people are insinuating is that there were federal agents that were involved in instigating the violence.
02:05:39.000Instigating the entering into the Capitol and that there's this one guy in specific that they've got him isolated on video.
02:05:46.000They've shown him over and over again.
02:05:56.000All these other guys who got into the Capitol, I mean, so many of them are facing like these massive federal charges and four years plus in jail.
02:06:05.000This one guy is like, we have to go in there.
02:06:25.000What's the best way to break up a peaceful protest?
02:06:27.000You bring in agent provocateurs to turn it into a non-peaceful, a violent protest, smash windows, light things on fire, then you can send in the troops and you can clean up the mess and then you don't have any protest anymore.
02:06:40.000This was the World Trade Organization in, what was it, in Seattle in 99 or whatever it was?
02:07:06.000Because it's a curious case of this one particular individual who's like yelling in these various groups that we have to get in there, and like he did it pre-January 6th, he did it during the January 6th thing, and then these guys face no legal charges whatsoever.
02:07:23.000And people are like, well, what the fuck is going on here?
02:07:26.000Because when you see some kind of organized debacle like that, and then you see people insisting that we have to take this further and we have to go inside, and then if you find out that those people are actually federal agents that are doing that,
02:07:42.000you're like, well, what is happening here?
02:08:08.000Is it that somebody in government actually initiated him doing it as an agent provocateur to shut down the protest, or was he someone who happened to be in government who had himself been radicalized and, acting on his own because of that radicalization, did the thing?
02:08:22.000Or is he an agent provocateur but he's doing so independently just because he's a fucking psycho?
02:08:38.000I'm looking at it like this, like, what is this video?
02:08:41.000I'm watching this guy, like this one big, beefy-looking federal agent guy, telling them they gotta go inside, and I think he was wearing a MAGA hat.
02:08:49.000And, you know, he's like a guy in his 50s, and he's like, I'll tell you what we gotta do.
02:10:46.000One of the things that was so cool about C-SPAN was the idea of being able to actually see what was happening inside of proceedings.
02:10:54.000And we know that the idea of a modern liberal democracy is that we want to leave markets to do most of the innovation and provisioning of resources because they do a good job.
02:11:03.000But we still want rule of law because there are places where markets will have a financial incentive for things that really harm everybody like complete destruction of environments or organ trades or whatever it is.
02:11:12.000And so rule of law is intended to be a way that, if you have a government that is of, for, and by the people, and it's given a monopoly on violence, it can check the predatory aspects of markets, where the basis of the law, because of voting, is the collective values of the people.
02:11:28.000But the state only has integrity and can check the markets if the people check the state.
02:11:35.000Again, at a much smaller scale, it was easier to have transparency and being able to see what was happening.
02:11:55.000What terrifies me is the solution of this is an autocratic government that controls all aspects of society so none of this ever happens.
02:13:06.000There's a lot of videos of this guy, which is really fascinating because I think these methods that they've used forever are kind of subverted by social media because you have 100,000 different cameras pointed at this guy.
02:13:20.000When someone starts screaming loudly, people start filming it, and then you get a collection of these, and you can go, oh, what is happening here?
02:13:28.000Like, I don't think they've realized that people would be so cynical that they would go over all these various videos and find this one guy who's not being prosecuted or arrested.
02:13:38.000He's not being prosecuted or arrested.
02:13:45.000I mean, if you had a guess, if you had like 50 bucks, what are you going to put your chips on, red or black?
02:13:53.000I might put my chips on the result of stochastic terrorism.
02:13:57.000If I was China, I would have wanted to infiltrate the Facebook group that guys like him were in and just radicalize as much as possible so that some of them were motivated to do it earnestly.
02:14:06.000So it was like some patsy, but I don't even know who it is.
02:14:09.000Oh, for sure there's some of that going on there.
02:14:11.000There's a lot of stuff going on with January 6th.
02:14:14.000And it's a lot of sad humans who don't have a lot going on in their life.
02:14:19.000Did you see the, what is it, Into the Storm?
02:14:23.000Is that what it was, the HBO documentary on QAnon?
02:14:56.000Distraction life is that most people don't feel like they live a meaningful existence.
02:15:01.000So when something like this comes up and you get radicalized, whether it's by China or Russia or that guy and he's saying, you know, that guy's just basically incendiary, right?
02:15:10.000He's just throwing gasoline on the fire.
02:15:12.000You're saying, is there something out there that you can connect to that's bigger than you?
02:15:58.000And these fucking people get completely locked into it.
02:16:01.000And at the end of this documentary on HBO, which is really excellent, I can't recommend it enough, you see a lot of them are realizing, like, this is all bullshit.
02:16:11.000And they're like, what have I done with my life?
02:16:13.000There's a Reddit channel called QAnon Casualties, which is like people, especially who struggle with family members, who've fallen down different rabbit holes, and I guess that's one of them.
02:16:21.000And as people come out of it, just what happens?
02:16:23.000I have a friend who just reached out about that, about his own wife.
02:16:29.000Well, I mean, I think what you're pointing to, our friend Jamie Wheal, who's here in Austin, we had him on our podcast to talk about this.
02:16:35.000When we think about social media, a lot of times people think about it as an information problem, misinformation, disinformation.
02:16:40.000It's actually about meaning and identity, which is what you're pointing to.
02:16:44.000People are getting meaning and purpose from a thing.
02:16:46.000And therefore, it's not a matter of like, well, let's just like tell people the true thing or the fact check thing.
02:16:51.000There's a sense of meaning, purpose, narrative, what I'm participating in that's bigger than myself that people are seeking.
02:16:59.000And part of that, which is exacerbated by social media, because it's mass alienation and loneliness.
02:17:04.000And those are exactly the kinds of people that can be pulled in various directions, which includes also some of the decentralized ways that they can use those tools to cause havoc.
02:17:14.000Something I was thinking is, in the founding of this country, it was...
02:17:18.000It was understood that both high-quality education and a fourth estate, right?
02:17:22.000A kind of free and open press were considered prerequisite institutions for democracy to work.
02:17:27.000You had that... You know what a fourth estate is?
02:17:30.000Some kind of—but at that time—so both education and newspaper were the result of a printing press where you didn't just have a nobility class who had access to books when books were really hard to get, but we could print a newspaper so everybody could know what was going on.
02:17:44.000We could print textbooks so everyone could get educated.
02:17:46.000If you could have – at least that was the idea, right?
02:17:48.000If we have a population where everyone can make sense of the world, like they've learned how to make sense of the world.
02:17:54.000They've got history and civics and science and like that and they know what's going on currently, then they can all go to the town hall and participate in government.
02:18:02.000So it was acknowledged that without something like a fourth estate, a shared way to make sense of the world together, democracy doesn't work.
02:18:10.000Facebook in particular is not just a destruction of the fourth estate.
02:18:14.000It's like an anti-fourth estate rather than share something where everybody gets the same information to then be able to go debate.
02:18:20.000Right now, two different people will have Facebook feeds that have almost nothing in common and are polarized, right, identifying your fellow countrymen as your most significant enemy, where everything they think is wrong and a conspiracy and a lie or something like that.
02:18:40.000So one of the things I was going to say that's interesting is that as we started to scale more, one of the things that newspaper and then with TV and broadcast became able to do was scaled propaganda, give the same message to everybody.
02:18:53.000And there was this whole big debate in World War I and then going into World War II that democracy requires propaganda because people are too dumb to make sense of the world adequately.
02:19:02.000So we have to propagandize them so they aren't fighting the war effort while we're in war.
02:19:07.000One of the things that is interesting, just from a potential, and you'll say, yeah, but how do we get there because how do you incentivize the Zuckerbergs or whatever?
02:19:14.000And the enactment is a real tricky thing.
02:19:19.000You could use the tools of social media, which is the ability to personalize a huge amount of content to the individual, to make real democracy possible, where you don't need to give everyone propaganda because they're dumb.
02:19:32.000You can actually help people understand the issue progressively better in a personalized way.
02:19:41.000And you can imagine that, like, real democracy could actually be possible at scale if you could personalize the kinds of education and civic virtue that would be necessary for people to engage in a democracy.
02:19:53.000Let me add on to that, because this example you just showed me, right, with this guy, I had never seen that video.
02:19:58.000Imagine a Thanksgiving dinner happening a few weeks from now where one set of people had been all exposed to this guy, and this is like the central way that they see January 6th, is through the lens of that guy.
02:20:09.000If you're in one of the other filter bubbles, all you see is just the violent, crazy, whatever.
02:20:15.000You are not even operating on a shared reality.
02:20:17.000So when you talk about January 6th, normally if we have a shared printing press or we have a shared fourth estate, we've at least been exposed to some of the same things.
02:20:40.000I was built evolutionarily to assume that your brain is constructing reality from some of the shared stuff that I'm seeing with my eyeballs.
02:20:45.000So all my biases are to assume other people are talking about the same reality.
02:20:49.000And there's a little bit of a magic trick optical illusion because we both saw, quote unquote, January 6th, but we were exposed to completely different media sets.
02:20:57.000So now when we get in a conversation, it completely breaks down, not because we're actually disagreeing about the same thing,
02:21:03.000but because we don't even get to that layer of detail.
02:21:06.000And one of the things in a humane technology world, I think I mentioned to you, in the More in Common research they found that the more you use social media, the more likely it is that you are not able to predict what someone else's views are on a topic.
02:21:19.000You think all Republicans are racist or something like that if you're on the Democrat side.
02:21:22.000Or if you're on the Republican side, you believe that all Democrats are LGBTQ, when only 6% of Democrats are LGBTQ. So we are far off in terms of our estimations of other people's beliefs.
02:21:32.000And in the current world, the more you use social media, the worse your estimations are.
02:21:36.000In a humane future, the more you use social media, the better our shared understanding and my understanding of your understanding would be.
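To make that perception-gap point concrete, here is a minimal sketch of how such a gap could be quantified, in the spirit of the More in Common research referenced above. The scoring rule and every number below are hypothetical illustrations, not the study's actual data or methodology.

```python
# A minimal sketch of the "perception gap" discussed above: the distance between
# what one group *estimates* the other believes and what that group *actually*
# reports believing. All numbers are hypothetical, not More in Common's data.

def perception_gap(estimated: dict, actual: dict) -> float:
    """Mean absolute error between estimated and actual belief shares (0-100)."""
    issues = estimated.keys() & actual.keys()
    return sum(abs(estimated[i] - actual[i]) for i in issues) / len(issues)

# Hypothetical example: one side's estimates of the other side's views versus
# what that side actually reports (percentages agreeing with a statement).
estimated_by_out_group = {"identify as LGBTQ": 38, "support open borders": 45}
actual_in_group        = {"identify as LGBTQ": 6,  "support open borders": 20}

print(perception_gap(estimated_by_out_group, actual_in_group))  # 28.5
```

In the "humane" design described here, a platform would track whether this number shrinks or grows with usage.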
02:21:44.000And so you can imagine there's some sense maker out there who's showing both sides of these different filter bubbles and helping us bridge build.
02:21:51.000So we're actually even able to have a shared conversation.
02:21:53.000Those are the kinds of people that, as Daniel was just talking about, would get kind of upregulated to be at the center of our shared, undivided attention.
02:22:00.000Let's say I wanted to ask, how do I increase trust in our institutions that are processing things too complex for an individual to figure out on their own, like the reality of climate change or COVID? Well, let's say that, C-SPAN-like, I had debates happen inside those institutions, where people who had real expertise but conflicting views had a long-form facilitated debate, not the type of debate that is just oriented towards rhetoric and gotchas to try to win, but one that is earnestly oriented towards finding out what's true.
02:22:33.000And there's a facilitated process and the people agree to it.
02:22:36.000One of the things they agree to is what would I need to change my mind about this?
02:22:39.000If the answer is nothing, then you don't even engage in the debate.
02:22:42.000If we can't even say what would change our mind, we're not really responsible participants of a democracy because we're not really open to it.
02:22:48.000And each of the debaters has to read each other's content first and agree to a facilitation process that's long form and we start with what do we all agree on?
02:23:00.000That means that now, both around the values that are relevant and the facts of the matter, when we go to what we disagree on, we know what our shared basis to derive a solution looks like.
02:23:10.000Then we try to formalize our disagreement.
02:23:12.000I believe X... I believe not X. And we say, what would it take to solve this?
02:23:16.000Do we need to do a new piece of science?
02:23:17.000Do we disagree because of an intuitive hunch or a different value?
02:23:36.000Could people who had different views but were earnest, and wanted to know what was true more than to hold their own view, be able to engage in a process that could bring us to what is shared knowledge?
02:25:08.000But if that's what we assume is true, that that is the full story of who we are when we look in the mirror, then this story is over and we should just go do something else.
02:25:34.000So I think we have to address it at the root level before we address it even at a social media level.
02:25:39.000That's why you had Johann Hari on, saying the opposite of addiction is not sobriety, it's connection.
02:25:43.000People need meaning, purpose, connection.
02:25:46.000And you can imagine a world where social media is like, hey, here's some drum circles or dance events, obviously post-COVID or whatever, but just...
02:25:55.000When you look at a screen, it's basically allocating decisions of where time blocks are going to land in your calendar.
02:26:00.000Most of those time blocks are like, spend another five seconds doing this with your phone.
02:26:04.000But imagine social media becomes a GPS router for life choices, where it's actually directing you to the kinds of things that create more meaning.
02:26:10.000Now, of course, the deeper things are inequality, and meaning not existing in a lot of the work that people do.
02:26:22.000When people get really angry, you accuse them of being incels.
02:26:24.000But we can imagine a world that facilitates ways for people to, you know, go to dance events together, where they meet other people in a more, like, facilitated environment, as opposed to you're going to sit there at home and, like, let's just get you swiping, and Tinder's profiting from your attention like a casino, so you match and then you never message someone.
02:26:43.000And then we also have the emergence of the metaverse where people are just going to be more incentivized to go into that because it's going to be very exciting.
02:26:49.000Which is why a humane future is the online world has to care about actually like regenerating the connective tissues of the offline world.
02:26:56.000If it doesn't do that, it's not going to work.
02:26:58.000Apple could be in a position to do that.
02:27:00.000You take it back to something similar to people exercising and not eating too much sugar, because those are...
02:27:05.000The too much sugar is a hypernormal stimulus, right?
02:27:08.000Take the sugar, fat, and salt from evolutionary foods, which are the parts that create the dopamine hit, and just make fast food out of them.
02:27:14.000And in the same way, what fast food is to real food is just the hypernormal stimuli version.
02:27:38.000On the question of what is a good measure of the health of a society, one metric that I like (no one is applying this metric, it's just a thought experiment) is the inverse of the addiction in the society as a whole.
02:27:50.000A healthy society produces less addiction, meaning more sovereign individuals because addiction creates a spike in pleasure and then an erosion of their baseline of pleasure.
02:27:59.000And of their baseline of health and fulfillment in general.
02:28:02.000One of the reasons we're so susceptible to hypernormal stimuli is what you're saying is because we live in environments that are hyponormal, like not enough of the type of stimuli that we really need, which is mostly human connection, creativity, and meaning.
02:28:15.000And so at the basis of it is: how do we actually increase those? That's the only way that we become unsusceptible to the supply-side marketing that appeals to...
02:28:27.000And it's interesting to think about if Apple were to take the, you know, small percentage of people who opt into tracking their usage statistics, and they could actually measure for a given country, hey, this is the percentage of people that are addicted based on usage patterns.
02:28:39.000Again, it's privacy-respecting and everything, and reporting that back to society so there's a better feedback loop between...
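As a sketch of the thought experiment only: turning opt-in usage data into the inverse-addiction health signal described above. The six-hours-a-day threshold, the sample data, and the aggregation are all invented for illustration; this is not a real Apple metric or a clinical definition of addiction.

```python
# A back-of-the-envelope sketch: estimate what share of opted-in users show
# addiction-like usage, then report the inverse as a crude societal-health
# signal. The threshold rule and the data are entirely hypothetical.

from statistics import mean

def looks_addicted(daily_hours: list[float]) -> bool:
    # Hypothetical rule: averaging 6+ hours/day of compulsive-style usage.
    return mean(daily_hours) >= 6.0

def health_index(users: list[list[float]]) -> float:
    """Inverse-addiction score in [0, 1]: 1.0 = no one flagged, 0.0 = everyone."""
    prevalence = sum(looks_addicted(u) for u in users) / len(users)
    return 1.0 - prevalence

# Three opted-in users' hours per day over a week (hypothetical telemetry).
sample = [
    [7, 8, 6, 7, 9, 8, 7],   # flagged
    [1, 2, 1, 0, 2, 1, 1],   # not flagged
    [3, 4, 2, 5, 3, 4, 3],   # not flagged
]
print(health_index(sample))  # ~0.67
```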
02:28:50.000I mean, again, Apple's in this really unique position where their business model is not addicting people, polarizing people.
02:28:55.000You know, they could actually make their whole system about how do we have deeper choices in the real world.
02:29:01.000Well, there is a movement in society currently to try to get people to recognize, through radically induced introspective thought brought on by psychedelics, what the problems of our society are. Not necessarily the problems of these choices existing,
02:29:20.000but the problems you're talking about, like indulging primarily in these choices, whether it's porn or fast food or gambling or alcohol or whatever. There are certain psychedelic compounds that allow you to see yourself in an incredibly, ruthlessly introspective way that'll allow you to make radical changes.
02:29:44.000And there's a lot of great work being done right now with MAPS, where Rick Doblin's organization has worked to try to introduce these compounds to specifically help soldiers deal with PTSD. It's a big one.
02:30:00.000And I think through that and through their advocacy and the understanding that this stuff is very effective, whether it's through MDMA or whether it's through psilocybin, through some of the research they're doing with that, that there's a way to get a view outside of the pattern,
02:30:18.000this deeply cut groove that you're stuck in.
02:30:23.000And I think if we're dealing with anything that is a possible potential real solution for radically re-engaging thought, for changing the way we interface with each other and with society in general, I think that's it.
02:30:39.000And I think the fact that that is illegal currently is one of the big problems, one of the big barriers between us changing the way our culture operates and what we find to be important.
02:30:56.000Yeah, I mean, you remember so many of the, like, founding writings of the country said we need freedom of religion, but we actually need a religious people.
02:31:04.000And what they were saying is, like, we don't care if it's Confucianism or whatever, but you need a people that have some transcendent values and morals that bind them to more than just their own self-interest.
02:32:37.000And I think, in many ways, it's one of the things that Apple does.
02:32:40.000Apple is talking about this world where they're creating less impact from advertising by not having you be tracked across all apps, and allowing you to choose whether or not apps track you.
02:32:56.000That's a bold move in this world where everybody is trying to accentuate the influence that apps have and the amount of engagement they have, and to use advertiser money and generate more of it through that.
02:33:15.000Look, Android is just a big data-scooping machine, right?
02:33:20.000I mean, they're tracking everything and anything.
02:33:22.000And it's one of the things they said about TikTok, when software engineers first looked at it, they're like, Jesus Christ, this is tracking everything.
02:33:30.000And it's one of the most invasive of the applications.
02:33:34.000Why TikTok is not considered a major immediate national security threat, I still don't understand.
02:33:39.000I mean, if Russia in the Cold War was running the media programming for the United States for all of its youth, like, that's insane.
02:33:48.000There's actually a specific strategy the CCP uses, called borrowing mouths to speak.
02:33:53.000So you can imagine, when any Western voice in the U.S. speaks positively of the CCP, you can just add a little inflation.
02:34:00.000They just get a little bit more boost, because you're more trusting of someone who's a Western voice than of someone who's from, say, the CCP or China.
02:34:08.000And so that's one of the invisible ways you can steer culture.
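A minimal sketch of the invisible-steering mechanism being described: a small multiplicative boost for favored voices inside an otherwise ordinary engagement ranker. This is purely illustrative of the general technique; it is not a documented TikTok or CCP algorithm, and every name and number is hypothetical.

```python
# A minimal sketch of invisible steering: a tiny multiplicative boost applied to
# favored voices inside an ordinary engagement ranker. Entirely hypothetical.

def rank_feed(posts: list[dict], favored_authors: set[str], boost: float = 1.1) -> list[dict]:
    def score(post: dict) -> float:
        s = post["engagement"]
        if post["author"] in favored_authors:
            s *= boost  # a ~10% thumb on the scale, invisible to the viewer
        return s
    return sorted(posts, key=score, reverse=True)

posts = [
    {"author": "voice_a", "engagement": 100},
    {"author": "voice_b", "engagement": 95},  # favored: 95 * 1.1 = 104.5
]
print([p["author"] for p in rank_feed(posts, favored_authors={"voice_b"})])
# ['voice_b', 'voice_a']
```

At feed scale, a thumb on the scale this small compounds into large differences in aggregate exposure without any single ranking decision looking anomalous.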
02:34:12.000But going back to the Apple point, we all sound like we're promoting Apple in this podcast, and I just wanted to say this.
02:34:17.000Well, we're kind of promoting good faith companies that are moving in the right direction.
02:34:22.000Yeah, and you had John Mackey on Whole Foods.
02:34:23.000We went to Whole Foods last night and talked about how that's creating an environment for health and trying to at least couple better.
02:34:30.000It's just coupling better towards: we can make money by helping things be healthier.
02:34:34.000Apple could say, we're going to couple our business model, put on the Johnson & Johnson, whatever you think of Johnson & Johnson, but you can...
02:34:39.000We're going to orient our business towards long-term health of people.
02:34:44.000We're going to change the app stores to put down in the shelf space all the toxic social media stuff, if not take it off completely, and put back on what are the things that help people connect with each other and connect with each other in person.
02:34:55.000Part of that is it's actually kind of hard to host.
02:34:58.000There's certain people in a community who are kind of the event hosters.
02:35:01.000They're the people that bring people together.
02:35:03.000And right now, I mean, they're good at it, but imagine that was just like a hundred times easier.
02:35:07.000I don't have a product or anything here in mind thinking about this, but there are people who work on how do we make it easier to bring people together in the physical world?
02:35:15.000And if we made that a lot easier than it is today, so that it was happening more often, so that when you thought about what you wanted to do, instead of I could open up this app or that app, I saw what was happening in my own community, in my physical community.
02:35:45.000Again, this is part of a longer-term trend and transition of how you get out of this.
02:35:49.000But I do think that we have to make the choices that are fulfilling as easy to make as the choices that are not fulfilling but have the hypernormal stimuli instant hit.
02:35:57.000I was thinking about something, Joe, when you were asking, like, what are the solutions?
02:36:01.000And jumping quickly to why some proposed solutions don't work, which is true.
02:36:06.000It's like you think about what are the nutrients the body needs?
02:36:09.000You can die just from a vitamin C deficiency even if you have all the B vitamins, vitamin D, etc.
02:36:15.000And so it's like the body doesn't need a nutrient.
02:36:17.000It needs minimum levels of lots of nutrients.
02:36:50.000And all of the kind of single solutions might do something but end up failing.
02:36:55.000And so we have to, and this is again something that's very hard to do when attention spans are getting shorter and shorter, look at a whole ecosystem of solutions that collectively can start to address it, even though none of them individually can.
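The vitamin analogy has a crisp structure worth spelling out: viability is governed by the scarcest requirement, not the sum, which is why single solutions fail. A minimal sketch of that limiting-factor logic, with hypothetical categories and scores:

```python
# A minimal sketch of the limiting-factor logic in the vitamin analogy: overall
# health tracks the *weakest* requirement, not the total. The categories and
# scores are hypothetical illustrations, not a real assessment.

def limiting_factor_health(levels: dict[str, float]) -> float:
    """Overall viability is the minimum across requirements (each in [0, 1])."""
    return min(levels.values())

solution_portfolio = {
    "platform design reform": 0.7,
    "education / sensemaking": 0.6,
    "regulation":              0.1,  # the deficient "vitamin C" of the portfolio
}
print(limiting_factor_health(solution_portfolio))  # 0.1: one gap drags it all down
```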
02:37:19.000We have to stop not just our forward momentum, it's not even forward, but the general direction that we're going in.
02:37:26.000Well, I think with things like The Social Dilemma, which was seen by 150 million people, and Frances Haugen's stuff coming out, and people having a negative reaction to the metaverse...
02:37:36.000I don't know that many people who saw it and were like, yeah, let's totally do that.
02:37:38.000Obviously, they have asymmetric marketing power.
02:37:40.000They're going to put billions of dollars into funding this thing.
02:37:46.000Because that commercial where the tiger is talking to the buffalo and then all the kids are dancing, I don't know what the fuck is happening.
02:37:52.000I mean, I don't know what's happening in that example, but it's a race to control the whole experience.
02:37:58.000I mean, the reason that Facebook is doing the metaverse, Zuckerberg doesn't like the fact that Apple has controlled his destiny by controlling the operating system inside of which Facebook has to sit.
02:38:08.000And then all the various ways that whether they make advertising like they did recently, the privacy tracking stuff, it makes him not have control over his destiny.
02:38:16.000And so if you own the whole platform, bottom to top, it's a classic vertical integration.
02:38:20.000If I own the entire stack, I can control the entire thing.
02:38:27.000And it's going to become a land grab between these companies for who can sort of own the next metaverse platform.
02:38:33.000It's a fascinating thing to observe when you're watching someone who has ungodly amounts of wealth clearly, ambitiously pursuing more in a very transparent way.
02:38:51.000What's interesting, to psychoanalyze him a bit, is that he controls about 55% of the voting shares of Facebook.
02:39:34.000No, I think it's, he wants to be seen as an innovator.
02:39:37.000And if the world said the way you can be an innovator is not by...
02:39:42.000building more stuff that basically hollows out the physical world so we can make this unsustainable virtual world that collapses society.
02:39:50.000You can be an innovator by actually fixing and helping to support the planet that we're on, the actual world that we're living in, the social fabric that needs to be strengthened.
02:40:17.000About the gap between what his incentives are and what the world needs for basically sustaining it.
02:40:23.000Well, also imagine if you've created something, or, whether or not he created it is a different debate, you're the controller of something that's so massively influential on a global scale, and maybe he thinks that at least he's not evil.
02:40:38.000Like, he may be trying to make money, and he may be trying to come off as an innovator, but he's not an evil person.
02:40:47.000I don't get an evil sense off of Mark Zuckerberg.
02:40:52.000He's odd in the way he communicates, but maybe that's like a social awkwardness in dealing with his own public image being broadcast to the world and comes off clunky.
02:41:01.000People come off clunky when they're concerned with how people view them.
02:41:11.000Imagine if he said, I'm gonna back out now, like, you know, Jeff Bezos is leaving Amazon and he's gonna, like, hand it over to another CEO. Imagine him handing over Facebook to some other person and watching them fuck it up, or watching them take this insanely powerful thing and actually make it more evil, or make it more destructive but more profitable.
02:41:38.000I mean, if they just went full capitalist and some really ruthless CEO got a hold of Facebook and they said, listen, our bottom line is, like, we're trying to increase the amount of money that our shareholders get off of this, and what we're going to do is we're going to make these choices.
02:41:53.000And these choices might not be that popular with analysts and with people that are, you know, sort of trying to examine culture and the impact that social media has on it, but for us, it's going to be a windfall.
02:42:08.000We were speaking with a friend who is in a senior position at Google working on AI, and has come to the conclusion that a lot of people in senior positions in AI have come to: that something like artificial general intelligence is inevitable, and inevitable near-term.
02:43:39.000There's no way for it to not be dystopic.
02:43:53.000Actually, the only answer is to jack our brains in so that the meat suit is somewhat useful to the AGI. So now we're in a race to do that.
02:44:03.000When people understand the catastrophic risks and they don't see any good possibility out, then oftentimes they will actually accelerate some version of a catastrophe as the only reasonable solution.
02:44:17.000It's so important to actually define the design criteria right and have people committed to find solutions even though they're really hard.
02:44:23.000And why I think something like this is interesting is truly a belief that a lot more people focused on what we need to be trying to solve is actually useful.
02:44:32.000We think there's a lot of super smart people at a lot of universities and in institutions and… So let's start with the right design criteria:
02:44:53.000If you're adding tech that affects society, it has to actually be increasing the quality of democracy.
02:44:59.000It has to be increasing the integrity of markets.
02:45:01.000It has to be increasing the quality of families and mental health.
02:45:04.000You look at what are the foundational things.
02:45:05.000If it's not doing that, it failed the design criteria.
02:45:08.000Similarly, take the idea that we have these dystopias and these catastrophes: the catastrophes come from not being able to create order in the presence of the success of tech.
02:45:19.000The dystopias come from top-down order.
02:45:21.000So what that means is rather than have imposed order or no order, we need emergent order, which is what democracies and markets are supposed to do.
02:45:28.000But they haven't kept up; they have to be upregulated, a new, more perfect union that's upregulated to the level of tech we have, because the tech has advanced so far.
02:45:37.000So how do we bring about emergent order of or by the people that can direct the tech to not be catastrophic but isn't dystopic?
02:45:46.000I just want a lot more people thinking about that.
02:45:49.000I want a lot more smart people at MIT and Stanford and the State Department and wherever, and in Ethereum, working on those issues, proposing things, finding out what's wrong with them, so that the collective intelligence of the world is centrally focused on: how do we make it through the metacrisis?
02:46:06.000How do we make it through the fact that we are emerging into the power of gods without the ability to steward that well?
02:46:13.000What would it take to steward it well?
02:46:57.000And I think that we have a very short memory when it comes to things that are impactful and really sort of...
02:47:06.000jog your view of reality. You know, it's so easy to slide right back into it, and I think there has to be an effort where we remind people, we remind each other,
02:47:22.000we remind ourselves, whether it's a hashtag, whether it's some sort of an ethic, a movement, an understanding: we're moving in the wrong direction.
02:47:31.000And we need to establish that as a real clear parameter, like we've got a problem here.
02:47:48.000Yes, there's a lot of power on the other side of the table, right?
02:47:51.000They've got trillions of dollars of market power.
02:47:53.000The question is, are we the people, the culture, going to be able to identify what we don't want and then steer ourselves in the direction of what we do?
02:47:59.000But are we operating in an echo chamber where we're talking to a lot of people that are aware of it?
02:48:03.000So when you say people are aware of it, like what percentage are we talking about?
02:48:08.000Most people who watch The Social Dilemma walked away with something like tech is a problem.
02:48:13.000It's kind of generally scary and it seems to be bad for teenagers and families.
02:48:18.000What they didn't get is that it's fundamentally incompatible with democracy, because it polarizes the population, polarizes the representative class, creates gridlock, and makes democracy less effective relative to other forms of government.
02:48:46.000It almost seems ungraspable, you know?
02:48:52.000It just seems like you can nail down all these...
02:48:57.000problem issues, but then when it comes to real-world application, I'm like, what the fuck do you do?
02:49:03.000Well, this show is going to air by bouncing off of satellites that are in outer space to be able to go to people's phones and computers using the most unbelievably advanced technology.
02:49:19.000It's actually very hard for people to grasp the whole scope of the technological complexity.
02:49:24.000When you have that much technological complexity and that much technological power, we also have to be able to work on complex social systems that can make us able to wield it.
02:49:33.000And we just haven't had the incentive and motive to do that.
02:49:37.000But hopefully, recognizing where it goes if we don't is incentive for enough people to start working on innovation more.
02:49:43.000But this technology, this fascinating and super complex technology is disconnected from human nature, from these thousands and thousands of years of human reward systems that are baked into our DNA. That's part of the problem.
02:49:57.000Only because we have a whole system, trillion dollar market cap system, dependent on hacking and mining from those human vulnerabilities.
02:50:07.000And imagine we instead reflect back in the mirror not the worst angels of our nature but the better angels of our nature.
02:50:12.000We see examples of people doing the hard thing over the easy thing.
02:50:15.000We see examples of people hosting events for each other and being better to each other rather than being nasty to each other.
02:50:20.000We're just not reflecting the right things back in the mirror.
02:50:23.000We do reward when people do those things, right?
02:50:26.000Occasionally, but the social media algorithms don't reward them by and large, right?
02:50:30.000They take a couple examples where the positive thing happens, but mostly we see the most engaging, outrageous, controversial thing.
02:50:35.000And so we have to reflect back something else in the mirror.
02:50:37.000I think it's like, to bring it back to Apple, if you remember the 1984 ad for the Macintosh, the famous one where there was a woman running down the...
02:50:45.000Like, there's the booming Big Brother on the screen, and the woman's running down, wearing a Macintosh t-shirt, and she takes this hammer and throws it at the screen, and it blows up.
02:50:56.000And it says, on January 24, 1984, Apple will introduce Macintosh, and you will see why 1984 won't be like 1984. Yeah.
02:51:06.000Was it 1984 that Apple came up with that computer?
02:52:22.000We have to not let the Orwell-Huxley two gutters thing happen.
02:52:27.000We have to throw a hammer down the middle and create a future where technology is actually humane and cares about protecting the things that matter to us.
02:52:36.000One thing that gives me hope is that these kind of conversations are very popular.
02:52:40.000You know, like The Social Dilemma is very popular.
02:52:43.000Last one we did got like 9 million views or something like that.
02:52:45.000Yeah, this one will probably be similar.
02:52:47.000It's like people are interested in this conversation because they know it's a real issue, at least the kind of people that are tuned into this podcast.
02:52:55.000And I think it's going to be like little baby steps in the right direction.
02:53:03.000And, you know, what I said about psychedelics is it's one of those...
02:53:10.000People that don't have any psychedelic experiences, they don't realize the dramatic transformative impact that those things can have on cultures.
02:53:26.000And during this time of this lifespan that we've experienced, we've seen so much change.
02:53:33.000And so much almost unstoppable momentum in the general direction, and it doesn't seem good.
02:53:39.000But recognizing it, discussing it, and having documentaries like The Social Dilemma, having folks like you guys come on and discuss, like, what is really going on?
02:53:49.000And we didn't even really get into a lot of the real technological dilemmas that we have.
02:53:54.000You know, we basically glossed over the idea of drones and the idea of CRISPR and many of these other problems. Just watching that text-to-code thing, I'm going, oh my god, the barrier to entry has been eliminated.
02:54:15.000But hopefully through conversations like this and you putting attention on it, I mean, you are part of the sense-making world.
02:54:21.000You are helping people make sense of the world.
02:54:24.000And when you put your attention on it, I mean, I'm grateful for you creating this opportunity to talk about these things because, you know, they're heavy conversations and they're hard to look at.
02:54:33.000Well, and it's actually important that you have these long-form podcasts, right, that are two-plus hours as opposed to...
02:54:40.000five-second clips or tweets. That matters when we talk about how tech has to enhance the fundamentally important things.
02:54:47.000So we saw how tech, specifically social media tech with the polarization algorithms, messed up the fourth estate; it's also messing up education.
02:54:54.000It doesn't matter what you learn if you can't remember anything and you have no control of your attention.
02:54:59.000And so one of the things is that tech has to actually be increasing people's attention.
02:55:24.000You just see one cynical perspective and go, I'm right, and that's the only thing I'm going to think.
02:55:28.000And so imagine that, like, instead of the short clickbait thing, because otherwise I'll bounce, if I actually read the longer thing, and if my post had more nuance, that actually got upregulated.
02:55:37.000So it created an incentive to take in other perspectives, to try to parse them and synthesize them.
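To illustrate the incentive shift being described, here is a minimal sketch of a ranker that blends engagement with a nuance signal so synthesis gets upregulated. How a platform would actually score nuance is the hard open problem; here it is simply assumed to exist as a field on each post.

```python
# A minimal sketch of the incentive shift: rank on a blend of engagement and a
# "nuance" signal rather than engagement alone. The nuance field is a
# hypothetical stand-in for whatever bridging signal a platform might compute.

def humane_score(post: dict, nuance_weight: float = 0.5) -> float:
    # Both signals assumed normalized to [0, 1].
    return (1 - nuance_weight) * post["engagement"] + nuance_weight * post["nuance"]

posts = [
    {"title": "outrage clip",           "engagement": 0.9, "nuance": 0.1},
    {"title": "steelman of both sides", "engagement": 0.5, "nuance": 0.9},
]
for p in sorted(posts, key=humane_score, reverse=True):
    print(p["title"])
# "steelman of both sides" (0.70) now ranks above "outrage clip" (0.50)
```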
02:56:12.000If you think about just what it does, it's ranked by what's most engaging.
02:56:15.000So it's like every dramatic event that happened with anyone anywhere, like a little drive-by moment, oh, you just cut me off on the freeway and I'm upset for a second.
02:56:23.000Wherever that happens, it just collects it all into this one efficient feed.
02:56:28.000And then people are responding as if it's all happening at the same time.
02:56:31.000It's already this weird chronologically distorted reality because it's pulling from all these different moments and making it seem as if it's all one moment.
02:57:42.000I had a period where I intentionally went and curated my Facebook algorithm where I followed all of the groups that look at police violence, cop block and those ones.
02:57:52.000And so my feed just became filled with cops killing black guys and escalating violence in ways that didn't look justifiable.
02:58:02.000Now, of course, those videos also didn't show what happened beforehand to possibly justify it or not.
02:58:07.000So like they were selected for a reason.
02:58:10.000But even where they were egregiously wrong, those cases might be a very small statistical percentage of all police interactions.
02:58:18.000But even though I knew I was curating my feed for this purpose, it emotionally affected me intensely, just watching that many in a row.
02:58:25.000But by the time I've watched 12, it feels like this is everything that's happening.
02:58:31.000And then I got rid of those and I curated ones that were like pro-police, thin blue line kind of ones.
02:58:38.000And you saw people aggressing against cops and you saw what they have to deal with.
02:58:42.000And I was like, man, these guys are heroes.
02:58:44.000And again, it only took like 12 videos.
02:58:46.000And even though I was knowingly doing it to myself… It was that emotionally compelling, because evolutionarily we are used to seeing a world that is representative of the actual world.
02:58:56.000But when there's so much data that 0.01% of it is more information than I can possibly take in, and it can be totally statistically unrepresentative but it still affects what I feel the world is, you can see how… earnest people can get completely polarized.
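The arithmetic behind that point is worth making explicit. The 0.01% figure is from the conversation; the interaction count is invented purely for illustration:

```python
# Rough arithmetic behind the point above, with loudly hypothetical numbers:
# even a vanishing fraction of a huge event stream is far more footage than any
# person can watch, so a feed of nothing but that fraction feels like "everything."

interactions_per_year = 50_000_000   # hypothetical count of police interactions
egregious_fraction    = 0.0001       # 0.01%, the figure used in the conversation

egregious_events = interactions_per_year * egregious_fraction
print(egregious_events)                              # 5000.0 clips of feed material
print(f"{egregious_fraction:.2%} of interactions")   # 0.01% of interactions
```

Twelve videos in a row is all it took to shift the feeling of "what's happening," and twelve is a rounding error against even that tiny fraction.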
02:59:16.000And the fact is, you were consciously curating it and it was still having this effect on you, but you at least can objectively express it to other people.
02:59:23.000And, you know, hopefully that gets into some people's brains and they see how dangerous this stuff is.
02:59:31.000And this is also why these troll farms exist.
02:59:35.000Because they can really influence the way our culture moves and behaves and the way it thinks about itself.
02:59:43.000Gentlemen, thank you very much for being here.
03:00:01.000Let's give it a couple of months and hope things don't turn to nuclear war.
03:00:06.000I'll tell you why it feels inspiring to me, and thank you for having us here, is...
03:00:10.000There's a lot of people who are focused on systemic injustice or climate change or economics or AI issues. But how do all these issues fit together? And how do we actually deal with the fact that we've created so much technological power, and we've had such a huge impact on our environment through the whole industrial use of technology, that the world's increasingly fragile?