The Megyn Kelly Show - January 20, 2022


Tristan Harris on Secrets of Social Media Algorithms, Tech Manipulation and Addiction, and How Our Feeds Divide Us | Ep. 244


Episode Stats

Length

1 hour and 41 minutes

Words per Minute

195.8088

Word Count

19,784

Sentence Count

1,171

Misogynist Sentences

9

Hate Speech Sentences

27


Summary

Joe Biden held his second formal news conference in a year, and it didn t go well. In fact, it was one of the worst press conferences he has held in years, and a new piece by National Review's Charles C.W. Cook explains why.


Transcript

00:00:00.520 Welcome to The Megyn Kelly Show, your home for open, honest, and provocative conversations.
00:00:11.960 Hey everyone, I'm Megyn Kelly. Welcome to The Megyn Kelly Show.
00:00:15.600 In just a few minutes, we're going to be taking a deep dive into the dangers of big tech with Tristan Harris.
00:00:21.780 Do you remember this guy? He was in the great, great film The Social Dilemma.
00:00:25.380 And he is going to be speaking to us about what it's like on the inside at Google and elsewhere
00:00:32.000 and how we are intentionally being manipulated, even worse now than when they made that movie a few years ago.
00:00:38.340 But we begin today with President Biden, the 46th president of the United States,
00:00:43.060 giving just his second formal news conference in a year, taking questions for just under two hours yesterday.
00:00:50.360 Today, already, his administration is now trying to do some major cleanup after, in that press conference,
00:00:57.160 he not only almost started a ground war in Europe by seeming to greenlight a Russian incursion, minor incursion, in Ukraine,
00:01:08.000 but went on from there, suggesting that the 2022 midterms would probably be illegitimate
00:01:14.500 unless his entire voting rights bill were passed.
00:01:18.260 Charles C.W. Cook is a senior writer for National Review.
00:01:21.580 He has a new piece out today entitled Biden's Year of Failure.
00:01:26.520 Charles, great to have you here.
00:01:27.740 So he claimed last night that he did not overpromise about what he could do as president,
00:01:32.720 that he actually outperformed what was expected of him in his first year as president,
00:01:37.460 saying, quote, no one has ever accomplished more in their first term.
00:01:42.420 Your thoughts on whether that's true?
00:01:45.380 Well, it's not true.
00:01:47.100 None of that's true.
00:01:48.480 He certainly overpromised because he vowed that he could change things that are largely beyond his control.
00:01:56.760 I think on the assumption, especially with coronavirus, that everything would improve once he became president,
00:02:03.940 which it hasn't, he has not overperformed.
00:02:08.060 He has fallen foul to delusions of grandeur that have destroyed his agenda.
00:02:17.080 And as for his supposed productive first year,
00:02:22.220 we can find recent examples that demonstrate that that isn't true.
00:02:27.540 But if you go back through history, especially, you will see that it isn't true.
00:02:32.260 He wanted to be FDR for a reason, and he hasn't been.
00:02:38.320 The words, Mark Thiessen of AEI, and he's a Fox News contributor, used to come to my show every night when I did the Kelly file.
00:02:45.980 He had a piece out before the presser saying, it's a problem for a president when his rhetoric,
00:02:53.860 when his words don't match the experience the American people are having.
00:02:56.920 And he had worked for President George W. Bush and recalled back in 2006 when George W. Bush faced that very problem.
00:03:04.660 He was trying to put a rosy spin on the Iraq war, and the American people knew it wasn't true, and the messaging fell flat,
00:03:11.160 he pointed out, until they decided on the surge in 2007 and things started to turn around.
00:03:17.040 That's exactly what happened last night, right?
00:03:19.160 He was talking about the country in a way that did not seem to reflect the reality on the ground,
00:03:26.440 whether it has to do with our economics, COVID, even Afghanistan.
00:03:33.560 No apologies. No apologies.
00:03:35.320 Meanwhile, two-thirds of the American public disapproves of how he handled Afghanistan, a huge portion of them strongly.
00:03:41.980 So his words didn't match people's experiences or the facts.
00:03:46.200 Yeah, and he squared that circle by saying he doesn't believe the polls,
00:03:50.860 but he doesn't have to believe the polls, given that we have results in New Jersey and Virginia,
00:03:56.740 real results, the product of real people voting, that should show him something important.
00:04:03.360 I thought more broadly, though, the reason that it failed yesterday,
00:04:07.060 beyond the usual Joe Biden shortcomings,
00:04:11.060 is that he promised a reset but didn't change anything.
00:04:14.260 Well, the classic example that is given in political circles of a pivot is Bill Clinton,
00:04:20.940 who was forced to pivot after the 1994 midterms by a Republican wave,
00:04:27.080 what was called the Republican Revolution.
00:04:29.740 But Bill Clinton actually changed course.
00:04:32.340 He dropped the agenda that he'd been trying to pass in 1993 and 1994.
00:04:38.080 He worked with the Republicans on areas of agreement.
00:04:42.400 But what did Biden do yesterday?
00:04:44.340 He said that he hadn't messed up in Afghanistan, which he did.
00:04:49.160 He said that the inflation problem would be helped by spending more money,
00:05:04.140 which nobody believes.
00:05:06.960 He then, oddly enough, channeled President Trump in some important ways by casting doubt on America's election system.
00:05:16.820 And, you know, you looked at his talk of his agenda, of his Build Back Better bill.
00:05:22.720 Well, the furthest he would go is to say it would probably have to be broken up.
00:05:26.100 Well, where's the reset?
00:05:28.320 Where's the change?
00:05:29.360 Where is the pivot?
00:05:31.000 Yeah, there isn't one.
00:05:32.160 His message seemed to me more like what the American people really need is to see more of me.
00:05:37.760 I need to get out there on the road and just do a better job of explaining how amazing I've been.
00:05:43.320 And it's like, well, I mean, again, going back to Thiessen, his column yesterday read as follows, in part.
00:05:50.260 Sorry, but you can't boast about your COVID strategy when 55 percent disapprove it.
00:05:55.000 You can't brag about your economic performance when 60 percent say it's been dismal.
00:05:58.920 You can't crow about your foreign policy when 55 percent believe you're doing a terrible job as commander in chief.
00:06:03.540 Can't talk about how you've united the country when a 49 percent plurality say you've done more to divide us.
00:06:08.660 And you can't say you've had a great year in office when 63 percent say we're on the wrong track.
00:06:14.240 And that same number, about two thirds of the American public, Charlie, according to I think it was the latest Suffolk poll, say he shouldn't run for a second term.
00:06:22.640 So, I mean, can you really say it's just a comms problem?
00:06:26.440 What he really needs to do is just be more persuasive about what he's done.
00:06:30.080 No, it's not a comms problem.
00:06:32.120 We've talked about this on your show before.
00:06:34.280 The problem is that Joe Biden's behavior as president has been at odds with how Joe Biden ran for president and why the American public hired him to be president.
00:06:47.160 Joe Biden ran as almost the anti-Twitter candidate.
00:06:51.020 He ignored Twitter.
00:06:52.000 The moment that he entered office, he's been driven by Twitter.
00:06:55.880 Joe Biden ran promising honor and moderation and competence and experience.
00:07:01.260 But he allowed himself in the early days to be talked into the idea that he would be transformational.
00:07:07.180 Nobody asked for that.
00:07:08.500 Joe Biden expected until January 5th of last year to preside over divided government with a Republican Senate, a narrow Democratic House and, of course, a Democrat himself in the White House.
00:07:22.880 And the moment he got to 50 seats, not 65, not 75, but just 50 seats, he brought out every single agenda item that the Democratic Party has wanted to institute for the last 10 years.
00:07:37.260 This is a profound mistake because the American public, whether it should have or not, wanted a caretaker president.
00:07:44.240 They wanted a president who wasn't Donald Trump and who would return the country to normality after both Trump and COVID and the economic fallout.
00:07:53.420 Biden totally misinterpreted his mandate and he's still suffering from that.
00:07:58.500 And again, there was no sign yesterday that he is going to change that.
00:08:04.640 And until he changes that, he's going to get the same results.
00:08:08.320 I know I was talking to the BBC today, Charlie, back in your your old home country, though you're an American now.
00:08:14.500 And I was struck by the questioning because it was very focused on how Trump and the tweets and, you know, Joe Biden has, you know, things have gotten calmer.
00:08:23.820 And isn't that a better thing for America?
00:08:25.460 America, and they really look at him as not radical, as moderate.
00:08:31.740 And, you know, no president could really bring the country together.
00:08:34.980 It's so divided.
00:08:35.900 And they're very focused on Trump's tweets and so on.
00:08:38.620 And I realized that they were ridiculous.
00:08:40.400 They were terrible and they were absurd.
00:08:42.100 And OK, I get all that.
00:08:43.680 Nobody would dispute that.
00:08:45.640 But they completely miss Joe Biden's radicalism and his divisiveness.
00:08:51.340 And, you know, even lately, his rhetoric's getting even worse.
00:08:55.180 But he still wants us to believe he's the unity president.
00:08:59.840 Yeah.
00:09:00.680 And I will put Donald Trump in a class of his own.
00:09:04.860 He is unique in every way.
00:09:07.040 But oddly enough, the last week, Joe Biden has been more like Donald Trump than he would like to admit.
00:09:13.600 On policy yesterday, he cast doubt on the legitimacy of elections twice.
00:09:19.680 He was weak on the question of a Russian invasion of Ukraine, also twice.
00:09:28.660 And he snapped at a reporter, Philip Wegman, who asked him about comments that he had made about his voting rights agenda, so-called.
00:09:41.000 Which, in turn, was the product of a speech he gave last week that was frankly outrageous, that was so outrageous that it was condemned by the likes of Mitt Romney, who tend not to raise their voice, tend not to indulge in hyperbole.
00:10:00.140 And also acknowledged by his own party, Dick Durbin said that perhaps Biden had gone too far.
00:10:07.940 He did go too far.
00:10:09.120 What he did was to divide the country up into two groups of people.
00:10:13.220 Good people, who agree with Joe Biden, and bad people, whom he likened to insurrectionists and segregationists and really some of the worst people from the darkest periods in American history.
00:10:25.740 And that's not what Joe Biden said he was going to do as president.
00:10:29.760 And Mitch McConnell on the Senate floor compared Biden's comments to his inaugural address unfavorably.
00:10:37.220 Yeah, we actually have that because worse, Joe Biden tried to deny that he had made that comparison between people who oppose his voting rights bill and the George Wallace's of the world.
00:10:50.600 And it's on camera. We all it just happened. It's not like it happened two years ago. We just heard him say that earlier this week.
00:10:57.020 So here is a soundbite showing his denial and then what he said earlier this week.
00:11:02.620 You called folks who would oppose those voting bills as being Bull Connor or George Wallace, but you said that they would be sort of in the same camp.
00:11:13.340 No, I didn't say that. Look what I said. Go back and read what I said and tell me if you think I called anyone who voted on the side of the position taken by Bull Connor, that they were Bull Connor.
00:11:31.340 And that is an interesting reading of English. I assume you got into journalism because you like to write.
00:11:37.340 So I ask every elected official in America. How do you want to be remembered?
00:11:44.340 At consequential moments in history, they present a choice.
00:11:49.380 Do you want to be the side of the side of Dr. King or George Wallace?
00:11:54.780 Do you want to be in the side of John Lewis or Bull Connor?
00:11:59.660 Do you want to be the side of Abraham Lincoln or Jefferson Davis?
00:12:03.280 The indignation at having been called out for something he is on camera doing.
00:12:10.820 And the incoherence. His answer was totally incoherent. He contradicted himself.
00:12:16.660 He rambled. He became angry.
00:12:19.720 And all Philip Wegman had done, very politely, it must be said, is ask him a question that accurately characterized his previous remark.
00:12:30.080 Look, we know why one would invoke Jefferson Davis or Bull Connor in a political speech.
00:12:38.760 We know why people invoke Hitler in political speeches or slavery in political speeches.
00:12:46.620 Once you've done it, you can't backpedal and say, well, I didn't mean it literally.
00:12:50.560 It was clearly heard by a good number of Americans, and not all of them Republicans, certainly not all of them involved in politics, as a manichean exercise in bullying, an attempt to cast bills that really aren't responding to much as the future of the country.
00:13:15.220 And I think Biden should own that. If you want to engage in that sort of language, own it afterwards.
00:13:21.680 But he's trying to have it both ways. And again, where's the reset?
00:13:24.780 Mm hmm. He also tried to blame the Republicans for not getting more accomplished.
00:13:30.100 His you know, on the one hand, he accomplished more than anybody ever in the first year of his presidency.
00:13:34.200 But on the other hand, the reason he hasn't accomplished more is those Republicans who he could never have anticipated would be this way, would be this determined to block his agenda.
00:13:45.840 Meanwhile, I mean, for my first thought on that, Charlie, was that's rich coming from a guy who heads up a party that called themselves the resistance during Trump's presidency.
00:13:53.480 They weren't working with Trump on anything. But secondly, the Republicans have worked with him on a couple of key items.
00:14:00.140 You know, that's why he got his one point nine trillion dollar covid relief plan through.
00:14:05.520 It's why he got his infrastructure bill through because they worked with him.
00:14:09.500 And a lot of Republicans voters didn't like the fact that the Republican lawmakers did that, but they did it.
00:14:14.560 And his most recent defeats were caused not by Republicans, but by Democrats.
00:14:19.400 Yeah. So, as you know, I am a staunch defender of the separation of powers, and I don't like the way that when we have a president of a different party than the Congress, that the press describes the Congress as obstructionist.
00:14:37.840 I didn't like it when Trump was president. The Democrats were obstructionist. I don't like it.
00:14:43.000 Now we have a Democratic president. The Republicans are obstructionist. Congress is in charge of legislation.
00:14:47.800 There's nothing written in stone that Joe Biden should get any of his agenda.
00:14:53.200 And so this framing, which we hear, especially from Democrats, I find irritating as a general rule.
00:14:59.840 But as you point out, it was not just irritating, it was churlish because on November 15, which is two months ago, Joe Biden signed and heralded a bipartisan infrastructure bill that got 69 votes in the Senate and that was endorsed and shepherded through by Mitch McConnell.
00:15:21.400 So to suggest that Mitch McConnell is not likely to do anything that would make Joe Biden look good is not only to falsify the record, but to hide under a blanket the most recent victory that Joe Biden himself trumpeted from the White House.
00:15:40.660 Meanwhile, there is a bipartisan group of senators in the Senate who are working on a reform to the Electoral Count Act, the very instrument that was used by President Trump to try to steal the election in 2020.
00:15:55.700 So that wasn't just an annoying framing that puffed up the role of the president in our system.
00:16:03.160 It was factually wrong and it was ungrateful to boot.
00:16:05.860 Hmm. You point out in your piece today at National Review, Joe Biden did inherit some challenges.
00:16:13.000 No question. It wasn't as bad as when, you know, Trump was dealing with covid, because at that point it was brand new.
00:16:18.700 We were trying to figure it out. We didn't know what it was. And Joe Biden also inherited vaccines.
00:16:22.600 But it's not like we didn't have a covid problem when he took over and inflation was already starting to rear its ugly head.
00:16:29.940 So there were some headwinds against him, though it must be said he was he was in a pretty good position at that point versus Trump when it came to the vaccines.
00:16:38.660 But your point is what? That he inherited those challenges. And yet the American people, what, they don't understand that they're holding it against him, that he didn't overcome those challenges faster.
00:16:51.400 What explains the dismal polling between 365 days ago and now?
00:16:57.080 I think it's that Joe Biden made explicit promises that haven't been kept now as a libertarian type.
00:17:04.240 I wouldn't if I were running for president, not that I'm allowed to, I wouldn't make promises of the sort that Biden did, because I don't believe that the president is the king.
00:17:16.220 I don't believe that he's a pope. I don't think that he has some sort of spiritual control over the country and its economy and infectious diseases and so forth.
00:17:26.980 But Joe Biden ran as if he did believe that he said on television that he was going to shut down the virus.
00:17:35.780 Now, whether or not he could do that, and I don't blame him for the persistence of covid any more than I blame Donald Trump for its arrival.
00:17:43.460 He hasn't done that. He promised that he was going to restore the economy on a broad basis and that the middle class would be better off under him than it was under Trump.
00:17:55.800 Again, I think the president has a limited ability to do that.
00:18:00.000 But Biden made that promise and it hasn't happened.
00:18:03.700 And when you do that, you create a hostage to fortune for which you have only yourself to blame.
00:18:09.420 So, yes, there were many challenges when Joe Biden came in.
00:18:13.720 And yes, he had it more difficult than did, say, you know, John F. Kennedy in 1960, although, of course, he had his own challenges.
00:18:23.300 But people judge you based on what you promise you will do.
00:18:27.640 And Joe Biden, contra his argument yesterday, has overpriced and he is underdelivered.
00:18:34.920 What do you make of? Yes, he was angry in response to that one question.
00:18:39.600 He was meandering at times.
00:18:41.060 Last night on Hannity, they put together a mashup, which is actually quite helpful just to see some of the moments strung together.
00:18:47.820 So we've repurposed it here.
00:18:50.080 Take a look.
00:18:50.660 We passed a lot of things that people don't even understand what's all that's in it, understandable.
00:18:59.380 One of the things that I remember saying, and I'll end this, I think it's extremely realistic to say to people, because let me back up.
00:19:08.140 So whether or not we can actually get election.
00:19:10.340 And by the way, I haven't given up.
00:19:12.080 We haven't finished the vote yet on what's going on, on the, on voting rights and the John Lewis bill and others.
00:19:21.360 The, Alison Harris, please.
00:19:25.640 Very few schools are closing.
00:19:29.020 Over 95 percent are still open.
00:19:32.080 Has, is becoming much more informed on the, the motives of some of the political players and some of the, and the political parties.
00:19:58.900 One, one more question, uh, Mr. President, um, by the way, it's a quarter of guys.
00:20:03.940 So I'm going to do this.
00:20:05.260 Just let's, if you may ask me easy questions, I'll give you quick answers.
00:20:10.240 Charlie, I felt watching him at times the way I feel watching a hurt gymnast on the beam or doing the horse, you know, the pommel.
00:20:18.720 Like it, it's terrifying.
00:20:21.360 You don't, you are not at all certain he's going to be able to land it.
00:20:24.340 And, you know, he's the president, so you're kind of rooting for him, but it's very uncertain.
00:20:29.620 And my main thought in watching that was, what on earth are the Democrats going to do in 2024?
00:20:35.560 Yeah.
00:20:36.500 Well, look, there is a, a broad, uh, prohibition on the, the remote diagnosis of political figures.
00:20:45.340 And I think that's, that's a good thing, but one doesn't need to get into any sort of medical claims, uh, in order to evaluate the man as the speaker of English.
00:20:57.280 And at times yesterday, it was not clear to me as a native speaker of English, what on earth he meant.
00:21:06.320 Um, he, he is decreasingly able to express himself and communicate coherently and in a timely fashion.
00:21:14.920 And I can't imagine that that's going to get better over the next few years.
00:21:19.820 Um, we, we have the oldest president we've ever had that does matter, especially in the modern era.
00:21:26.980 If you look at people who go into that job, uh, and then you look at what they look like when they leave that job, they age far faster than anyone would want to.
00:21:39.840 If Barack Obama did, George W. Bush did, goodness knows how it's going to age Joe Biden.
00:21:45.920 Um, it, it's a stressful role.
00:21:48.220 So by the time that he would run again, um, he's going to be like that plus another two and a half years of stress.
00:21:56.240 And he's going to be, uh, 82, I believe.
00:22:01.460 This is a real predicament for that party.
00:22:05.020 Not least because the vice presidential candidate that they have chosen is even less popular and oddly enough, uh, often even less able to express herself in the English language as well, at least not without sounding as if she's late for a book report.
00:22:20.180 So, uh, they have created a straitjacket for themselves that is going to be really difficult to resolve because they've really only got three choices, haven't they?
00:22:29.460 One is that you stick with Biden.
00:22:30.960 The other issue trying to replace Biden with Kamala Harris.
00:22:33.700 And the third one is you open it up.
00:22:35.780 You have a primary, but that primary would be conducted while Joe Biden and Kamala Harris are serving their term out.
00:22:43.780 Um, and that would be brutal, I think, for the, uh, the Democratic Party, just as it was in, um, 1979, 80, when Ted Kennedy challenged Jimmy Carter, and as it was in 91, too, when Pat Buchanan, uh, challenged George H.W. Bush, both of which, it should be noted, ended up with those presidents losing.
00:23:04.920 Mm-hmm.
00:23:05.840 And he did say last night that if he runs again, which he has said he will do, Kamala Harris will be his running mate, though there was a long pause that, that's given other people pause in deciding whether they believe that, since she seems even less likely to win than he does.
00:23:22.780 Um, assuming they really are prepared to run an 82-year-old, uh, to run a second term.
00:23:28.300 And, and, and could even he do it, given the, the fall in his poll numbers and the way things are going.
00:23:32.980 It's a long ways away.
00:23:33.820 Well, we'll, we'll find out.
00:23:35.680 Charles, it's always a pleasure.
00:23:36.940 Thank you so much.
00:23:37.700 And I encourage you to talk to your friend, Rich Lowry, to whom I sent a text today about a very funny exchange I heard on your podcast, The Editors.
00:23:45.040 You've got a, you've got a couple.
00:23:46.120 I love that one.
00:23:47.000 And I love Mad Dogs and Englishmen, too.
00:23:49.220 Um, but there is a very funny moment between Charles Cook and Jim Garrity, I think it was last Friday's show, that my husband and I have been laughing about for a week.
00:23:59.020 And I'll just leave it there as a tease.
00:24:00.980 Don't forget to stay tuned now, because up next, Tristan Harris.
00:24:05.540 Tristan is from the huge Netflix hit, The Social Dilemma.
00:24:09.400 He was on the inside of Google for years and has ever since been demanding more ethics from big tech.
00:24:17.520 An insider's view on what they're doing to us then, now, and in the future.
00:24:22.620 Don't miss that.
00:24:30.980 The very popular Netflix documentary, The Social Dilemma, pulled back the curtain on the tech industry and the ways we all can become addicted to our phones, our social media, and just instant gratification.
00:24:43.280 A very prominent player in that documentary is Tristan Harris.
00:24:47.060 He has been called the closest thing Silicon Valley has to a conscience, a former design ethicist at Google who has since gone on a mission to raise awareness against the everyday devices, about and against, that we have become addicted to.
00:25:04.400 He joins me today to discuss it all.
00:25:06.380 Tristan, thank you so much for being here.
00:25:08.580 Real pleasure to be here with you, Megan.
00:25:10.580 Fascinating stuff.
00:25:12.180 So I was just looking at your background, just to set it up.
00:25:15.540 You're from the San Francisco Bay Area, as I understand, raised by a single mom, and very young when you started practicing magic, which would become relevant to what you're doing today.
00:25:27.000 Tell us how.
00:25:27.860 Yeah.
00:25:28.800 I love talking about being a kid and studying magic.
00:25:32.680 Actually, my mom used to take me to a little magic shop in San Francisco growing up, and I was just fascinated that,
00:25:38.760 independent of the age or education or PhD level of the person you're doing magic with, that magic is about understanding the vulnerabilities that are universal to all human minds, right?
00:25:51.800 And even sometimes if you know how the trick works, the psychology is so powerful that it still works anyway.
00:25:58.660 And that really plays into how technology is designed, because when I was later at Stanford, and actually I was classmates with the founders of Instagram and many of the people who joined the early ranks of Facebook and Twitter and a lot of these companies, so I really know the culture and the people intimately.
00:26:13.100 And we, many of us studied at a lab called the Stanford Persuasive Technology Lab, which is part of a whole space and discipline of persuasive technology.
00:26:23.720 How do you design technology to persuade people's attitudes, beliefs, and behaviors?
00:26:29.620 When I say that, I don't mean political persuasion.
00:26:31.700 I mean things like, can I persuade someone to fill out a form?
00:26:35.120 Can I persuade someone to tag their friend in a photo on Facebook?
00:26:38.780 Can I persuade someone to add a filter to their, you know, to their photo on Instagram?
00:26:44.500 And persuasive technology is a whole discipline that is at the root of changing, I think, how we see our relationship to technology, which is it's not just this mirror that's, you know, people often think, oh, there's all these problems with social media and polarization and addiction, but we're just holding up a mirror to society.
00:27:02.260 Those are your addictive people.
00:27:03.520 Those are your extreme, you know, folks, and that's how people behave.
00:27:08.780 But I think what that picture misses is that technology is actively persuading us and eliciting certain things from us, and those are design choices made by technology companies.
00:27:18.820 So when I was later at Google, I became a design ethicist.
00:27:22.400 They actually acquired a small company.
00:27:24.000 I used to be a tech entrepreneur.
00:27:25.140 They acquired that company.
00:27:26.180 And I became interested in how do you ethically shape when you know more about their mind than they might know about their own, and you're designing persuasive technology.
00:27:36.460 What does it mean to be ethically persuasive?
00:27:38.660 And I became very interested in that.
00:27:40.200 I tried to change Google from within for a few years, and I just saw the incentives that were fundamental to this industry about capturing human attention.
00:27:48.160 That they, you know, how much have you paid for, you know, your Facebook account or your Twitter account in the last year?
00:27:54.720 Nothing.
00:27:55.240 But how are they worth a trillion dollars?
00:27:57.200 It's because they mine what?
00:27:58.820 Well, they mine our attention.
00:27:59.800 People think it's just their data, but they actually make money the more time you spend because you have to look at the ads.
00:28:03.800 And the more time you spend, the higher their stock price, but there's only so much attention, so it becomes this race to the bottom of the brainstem.
00:28:11.460 Who can go lower in the brainstem to elicit responses persuasively from you and get, you know, outcomes from you?
00:28:18.280 So I think that's really the situation we find ourselves, and I think that lens of persuasive technology and magic are critical to understanding what's really going on with how technology is influencing us as opposed to we're actively using it.
00:28:31.080 Oh, it's fascinating because watching The Social Dilemma, my biggest takeaway was you are being manipulated, right?
00:28:39.720 I mean, that's really the message of it.
00:28:41.760 It's not totally it's not your fault, though it's not entirely your fault if you have an addiction to your phone or your social media.
00:28:49.060 There is culpability and intentionality on the side of big tech.
00:28:54.420 They are trying to addict you.
00:28:56.800 And so you're there, a normal human with all the vulnerabilities of a normal human thinking, oh, what a fun device.
00:29:03.420 I can talk to my friends and I can take pictures and I can look at my calendar.
00:29:07.060 Oh, and there's this thing that lets me connect with people or follow a news feed.
00:29:10.240 And before you know it, huge portions of your life are devoted to this little device by design.
00:29:18.360 Yeah.
00:29:18.540 And I think just to make that really concrete, because a lot of people might hear that as kind of an extreme statement.
00:29:23.020 Oh, they're manipulating us.
00:29:24.040 Well, what do you mean it sounds like a conspiracy theory or exaggerated?
00:29:27.560 Let's make it very concrete.
00:29:28.520 I mean, people are really at home and they're looking at their kids and the kids are sucked into their phones and they think that if they're addicted, that's that's their responsibility.
00:29:37.220 Right.
00:29:37.340 But let's just make that example concrete.
00:29:38.640 So you're you have a couple of daughters.
00:29:40.540 Is that I have three kids, 12, 10 and a boy, girl, boy.
00:29:44.220 Ah, OK, got it.
00:29:46.520 Well, so like, let's say, you know, one of your kids watches The Social Dilemma and says, wow, I really don't want to get sucked into that anymore.
00:29:55.480 I want to, you know, use this less.
00:29:57.000 And so they stop using, let's say, Instagram.
00:29:59.040 Well, as we depicted in The Social Dilemma, the AI kind of wakes up and it notices that one of the users goes dormant.
00:30:07.860 There's actually a name for this.
00:30:08.820 There's a feature called user resurrection or comeback emails.
00:30:13.140 So like a digital drug lord, it notices that you stopped using.
00:30:16.780 And instead, if it was a neutral product and we were responsible for our own addictive behavior, then they wouldn't actively say, hey, user four, five, six, seven, eight, eight, two, five, seven, three.
00:30:27.360 They stopped using the product.
00:30:29.080 We're going to find out what were the things that used to keep them here and keep them coming back.
00:30:33.140 And it just calculates with their A, with their big artificial intelligence supercomputer.
00:30:37.820 These are the ex-boyfriend photos that that had that person coming back.
00:30:41.340 So we're going to show the ex-boyfriend photos.
00:30:43.460 And it works to draw us back in.
00:30:45.660 And, you know, notice if you stop using Facebook, if I stop using one of these systems, they get more aggressive.
00:30:50.760 They actually start, you know, doing more text messages, more notifications, more emails.
00:30:55.000 It's like a digital drug lord.
00:30:56.440 And that's the part where we can be very clear at assigning responsibility at the manipulative aspect.
00:31:01.920 Right.
00:31:02.120 It's like you try to quit alcohol because you've become addicted to it.
00:31:06.700 And yet somehow the people at Seagram's find a way to keep a bottle in your pocket, to uncap it, to have it spill a little on the table in front of you.
00:31:15.380 I mean, it's like, of course, it makes it even harder for anybody who's got an addiction to get away from it.
00:31:20.360 And worse than that is the entanglement.
00:31:23.080 So actually, one of the things that I know we'll talk about later, Frances Haugen, who was the Facebook whistleblower and the Facebook file, she leaked, you know, thousands of documents of Facebook's own internal research.
00:31:33.920 And one of the things that in Facebook's case, but really when we talk about Facebook or Instagram, you can apply it to all of them.
00:31:38.700 You know, Twitter, TikTok, it's very similar across the entire social media industry, is that they actually know that kids get entangled.
00:31:48.300 So, for example, Megan, you know, you and I probably use what texting is our primary way of talking to your friend, right?
00:31:54.220 I'm assuming you open up your iPhone and you fire off the text.
00:31:57.620 What parents don't realize is that for kids, a lot of kids, either in TikTok or Instagram, that's their primary messaging medium.
00:32:04.460 That's where they message their friends.
00:32:05.820 It's not just like the feed, it's also where you kind of message your friends.
00:32:10.040 So if you say, hey, I don't want to get sucked into that addictive feed, they have bundled and entangled those two things together.
00:32:16.420 And they don't want to separate them because, so to counter your example about alcohol, alcohol wasn't baked into a fundamental need of the way that you communicate, right?
00:32:25.120 But imagine that the only place you could communicate is the place where they can put that alcohol and pour you a glass.
00:32:30.920 And they always pour you a glass every single time you want to open your mouth and say something to someone else.
00:32:34.940 So and the companies know that parents are bad at giving their own kids advice about this because they know that parents will say things like, oh, you know, honey, just stop using it.
00:32:44.120 As if it's a matter, it's like telling you, Megan, or me, don't don't text your friends like when they entangle us.
00:32:50.220 That's really the where the abuse comes from.
00:32:52.500 Well, it is a big problem, whether you're addicted or not, because you do.
00:32:56.160 I mean, I would love to step away from my iPhone more, but I suffer from the same problem.
00:33:00.100 I mean, every that's how everyone communicates.
00:33:02.360 That's where my news is.
00:33:04.040 That's how I text my team.
00:33:05.620 My team texts me.
00:33:06.520 So you'd have to you have to reinvent society, you know, to go back to the way I grew up.
00:33:12.240 Right.
00:33:12.480 Well, the iPhone, the cell phone didn't even exist really until the early 1990s.
00:33:16.720 I remember seeing somebody walk down the street with it in Chicago in 1995.
00:33:19.860 She was having a conversation on the on the sidewalk and being like, what a moron who needs to have a conversation while they're walking from A to B.
00:33:25.800 That was in my lifetime.
00:33:27.240 Right.
00:33:27.620 That was 1995.
00:33:29.020 But we're so far, all of us away from that now.
00:33:32.580 How can one exist without this device?
00:33:36.340 Well, you know, so I run an organization called the Center for Humane Technology that's been trying to ask and answer these these questions and at least point to a direction, which is really clear.
00:33:46.720 This is not about vilifying all of technology or creating a moral panic and saying everything is going off the rails and we should stop using our iPhones or stop using technology overall.
00:33:57.760 I love technology.
00:33:59.140 I grew up on it.
00:34:00.880 I think it can be an empowering tool.
00:34:02.780 In fact, my co-founder, Aza Raskin, his father, Jeff Raskin, actually invented the Macintosh project at Apple.
00:34:08.940 Back in those days, the idea of a computer is it's a bicycle for your mind.
00:34:13.220 In the same way that a bicycle uses more of us in getting even more leverage out of the kind of distance that we can travel, technology can be a bicycle for our creative powers, for our communication powers, for our science powers.
00:34:26.360 But that's not what the business model of these social media, I think these social media companies are going to look back in history and see them as a parasite, that their goal is to suck as much attention out of society as possible and suck it into these engagement and arrangement machines that polarize us, that sort of want us to not be able to have a conversation over Thanksgiving dinner because they want to personalize these news feeds to each other so that we each get different information from each other.
00:34:54.280 Even when we try to have a conversation, we can't do that.
00:34:58.020 That is that the key difference here is the business model.
00:35:00.600 Notice if you do a FaceTime call to your son or your daughter, Apple doesn't make money the more time you use FaceTime.
00:35:08.320 So when you stop using FaceTime, it doesn't aggressively message you.
00:35:11.280 It doesn't put hearts and likes and comments floating all over the screen to keep you jazzed up and endangled.
00:35:17.340 It doesn't do the beautification filters to plump up your lips or your eyes or your cheeks, which the TikToks and the Instagrams do.
00:35:23.200 In fact, TikTok was found recently to, without even asking users to do a 5% kind of beautification filter, even if you didn't turn it on actively, because the apps that give you the, it's like the mirror, mirror on the wall.
00:35:36.880 Who's the prettiest of them all?
00:35:38.160 The one who reflects back the most positive self-image is the one you're going to get addicted to.
00:35:42.680 And so TikTok actually invisibly was doing that and plumping, you know, kids, you know, lips and, you know, eyebrows and all of that.
00:35:49.520 And it's as these really serious consequences that we saw in Francis Haugen's Facebook files, including the fact that you have kids like, you know, teenage girls who will say, I'm worried I'll lose my boyfriend if I don't have the beautification filter on because they've become accustomed to seeing me with that filter.
00:36:05.880 And it creates an anchor of who we are, where the virtual us, they will only like us if we look different than who we actually are.
00:36:14.120 And that's the perversion that comes from this business model, which again is separate from email or FaceTime or text messaging.
00:36:20.320 Those things are fine because their business model is not maximizing attention.
00:36:23.120 Wow. This is so chilling when you think about the creation now of the so-called metaverse.
00:36:30.020 They're basically in the process of creating a new, more in-depth, more time-consuming universe online, which I don't totally understand, but they're trying to suck even more time from us as a world online.
00:36:43.180 They want an alternate universe online that's even more involved and time-staking than it is today.
00:36:48.840 We'll get into that much, much more when we squeeze in a quick break and more with Tristan right after it.
00:36:54.480 Wow.
00:36:55.320 Don't forget, folks, programming note, you can find The Megyn Kelly Show live on Sirius XM Triumph Channel 111 every weekday at noon east and the full video show and clips by subscribing to our YouTube channel, youtube.com slash Megyn Kelly.
00:37:09.320 Don't get addicted, but enjoy.
00:37:11.940 It can be done in moderation.
00:37:12.840 If you prefer an audio podcast, subscribe and download on Apple, Spotify, Pandora, Stitcher, or wherever you get your podcasts for free.
00:37:20.960 And there you will find our full archives with more than 240 shows.
00:37:31.020 So, Tristan, you say in the film that it's almost like these tech companies create an avatar voodoo doll of us.
00:37:39.820 I'm going to set it up with a clip of you going there and then get you to expand on it.
00:37:44.480 This is Soundbite 7.
00:37:45.940 On the other side of the screen, it's almost as if they have this avatar voodoo doll, like model of us.
00:37:52.940 All of the things we've ever done, all the clicks we've ever made, all the videos we've watched, all the likes, that all gets brought back into building a more and more accurate model.
00:38:01.460 The model, once you have it, you can predict the kinds of things that person does, where you're going to go.
00:38:07.480 I can predict what kind of videos will keep you watching.
00:38:10.100 I can predict what kinds of emotions tend to trigger you.
00:38:13.380 At a lot of these technology companies, there's three main goals.
00:38:16.440 There's the engagement goal to drive up your usage to keep you scrolling.
00:38:20.180 There's the growth goal to keep you coming back and inviting as many friends and getting them to invite more friends.
00:38:27.120 And then there's the advertising goal to make sure that as all that's happening, we're making as much money as possible from advertising.
00:38:34.760 Each of these goals are powered by algorithms whose job is to figure out what to show you to keep those numbers going up.
00:38:41.440 Hmm. It's chilling.
00:38:44.140 And I love little Pete Campbell from Mad Men in the background is the guy who's on the computers.
00:38:49.600 But it's so it's not, in fact, like the algorithm is effectively the three people in that room.
00:38:54.660 It's not actual humans standing there, right?
00:38:56.820 It's like they figured out algorithms that can figure everything out in an instant.
00:39:00.720 Well, you see Google and Facebook figured out how to clone Pete Campbell, the advertising guy, and just sit him inside of the Google.
00:39:06.360 No, I'm kidding. I'm just kidding.
00:39:08.920 No, I think people looked at this metaphor.
00:39:10.840 So in the film, The Social Dilemma, which I really recommend everyone watches, it was the second most popular documentary, I think, in Netflix history, won two Emmy Awards.
00:39:18.520 And it really just lays this out in a way that I think everybody on all political sides can kind of understand as well.
00:39:24.560 And what we talk about in the film, as you said, Megan, is that, you know, behind the screen, you know, there's you.
00:39:31.280 There's this piece of glass.
00:39:32.280 And when you scroll up with your finger, right, there's there's going to be another rectangle that comes up next.
00:39:37.200 Do you think that that rectangle that comes up next is just the next thing that one of your friends posted?
00:39:43.400 No.
00:39:43.600 What they do is they fork it off to that supercomputer, which is that Pete Campbell character and that character, which is, like you said, you know, character embodiment.
00:39:53.940 It's not actually like that.
00:39:54.780 It's just a computer and it's calculating a number.
00:39:56.780 And it looks at every possible thing it could show you next, like within the space of things it could show you.
00:40:01.500 It could show you something that will outrage you politically.
00:40:03.800 It'll show you something that'll your ex-boyfriend or your ex-girlfriend, because that's what you clicked on last time.
00:40:09.640 It can show you a live video because Facebook wants to, like, dial up that live video.
00:40:14.460 It tries to calculate which thing would be most likely to keep you scrolling, because obviously it doesn't want to show you the thing that will stop you from scrolling.
00:40:22.240 And it's a supercomputer pointed at your brain to figure out how to basically light up your nervous system.
00:40:27.680 And the voodoo doll idea, one of the reasons we use that metaphor is that if I talk about, hey, Megan, you know, they have your data.
00:40:34.520 They have your data.
00:40:35.540 And where does that hurt you?
00:40:37.680 If you think about it just as a person, like, there you are, you hear that phrase, they have my data.
00:40:41.040 It doesn't feel like, what's the problem with that?
00:40:43.820 But if I say, look, that data is being used to assemble a model of you, a more and more accurate model that can be used to predict things about you.
00:40:50.820 And it gets more accurate the more information they have.
00:40:53.420 But it's like a voodoo doll.
00:40:54.420 So all the clicks you've ever made, that puts a little hair on the voodoo doll.
00:40:57.060 So it's a little bit more accurate when I prick and try to figure out what would activate the voodoo doll.
00:41:01.040 If all the likes, all the watch time and all the videos you've ever made, that also makes the voodoo doll more accurate.
00:41:06.600 Adds little shirts and pants in the voodoo doll.
00:41:08.720 But then what the point is that as that data gets more and more accurate over time, and it looks at 100 other people who saw those same political, you know, enragement videos that you've seen.
00:41:18.220 And it says, well, for people just like you, this is the thing that tends to keep them scrolling, watching, clicking, commenting.
00:41:24.900 Because all of that activity is engagement.
00:41:27.960 It's attention.
00:41:28.580 It's the thing that's sort of the parasite that, you know, makes these companies worth trillions of dollars.
00:41:33.780 And that's essentially the system that we're in.
00:41:35.920 But the problem is that it leads to basically all of these negative externalities that dumped onto the balance sheet of society.
00:41:43.000 We have shortening of attention spans.
00:41:45.020 We have more political polarization because affirmation is more profitable than information.
00:41:49.660 So giving us more confirmation bias of our existing tribal beliefs and why the other side is so bad.
00:41:54.380 Obviously, this trend existed in other kinds of media.
00:41:57.940 But now you have a supercomputer that's like literally, you know, figuring out this is the next fault line in society.
00:42:03.060 And these keywords emerge.
00:42:04.380 And whether it's mRNA or masks or vaccines or, you know, no matter what it is, it finds the one that works on buckets of users just like you.
00:42:12.460 And it knows that you're going to click before you know you're going to click.
00:42:15.400 And I think some people hear that.
00:42:16.760 And they think that sounds like a conspiracy theory.
00:42:18.340 Like technology knows us better than we know ourselves.
00:42:21.400 But Yuval Harari, the author of Sapiens, is a friend of mine.
00:42:25.700 He's gay.
00:42:26.360 And he jokes that, you know, his partner, Itzik, when he uses TikTok, it only took Itzik, you know, one or two clicks for TikTok to figure out exactly which rabbit hole to send his partner Itzik down.
00:42:36.280 And that's the thing about all of us is it knows exactly what works.
00:42:39.600 But the problem is what works on us isn't the same thing as what's good for society.
00:42:44.640 Or for us even or for us.
00:42:47.340 And that's why I mean, honestly, Twitter came out with a thing.
00:42:49.880 I don't know if they do it every year or whatever, but they just popped up in the feed.
00:42:55.340 This is how many conservative sites you follow.
00:42:58.880 This is how many liberal sites you follow.
00:43:00.920 And it just sort of volunteered, you know, your information.
00:43:04.000 And on mine, I was very pleased that I had a 51-49% ratio on my income, incoming, you know, news and people I follow.
00:43:11.600 And that's important.
00:43:12.880 So it just makes me a little less easy to manipulate in the information game because you're definitely getting you're getting propaganda from both sides.
00:43:20.100 But at least I mean, it's propaganda, but at least you're getting it from both sides.
00:43:23.500 You're a little less easy to manipulate.
00:43:25.600 Right.
00:43:25.740 So that's one step that definitely I mean, that's I actually have not seen that specific feature from Twitter.
00:43:30.240 It's obviously better for each of us to maintain more broad information diets.
00:43:34.580 But the second problem, Megan, is that the business model is we think of it like a parallel system of incentives to capitalism.
00:43:41.640 Instead of getting paid in money, you get paid in more likes, more views, more attention, more comments.
00:43:47.320 And when you say something that basically outgroups the other side and say, here's yet another example about why the other side is awful.
00:43:53.560 We'll pay you more likes, more followers, because that was better for generating engagement for the machine.
00:43:59.860 Now, no one at Twitter or Facebook has a big, long mustache and they're twirling it saying, gosh, how can you create the next civil war and and, you know, drive this up as much as possible.
00:44:09.440 But that's the inadvertent side effect of a machine that's values blind.
00:44:13.620 All it knows is what increases people's likes, followers and get them to invite more people.
00:44:18.740 And the problem is that those things tend to be conflict.
00:44:21.340 So even if you have a broad diet and you're looking at information from both sides, quote unquote information, what it really is, is basically people, you know, shitposting on the other side and building on the boogeyman.
00:44:32.160 So whatever your boogeyman is for you, like, oh, they're doing, you know, this next in my hometown.
00:44:37.960 Now you can sort of carry that to the worst next conclusion.
00:44:40.740 You can find evidence for every stereotype.
00:44:43.040 And in fact, one of the groups that we interviewed, we have a podcast called called Urine Divided Attention.
00:44:47.700 We interviewed Dan Vallone, who runs More in Common.
00:44:51.460 And what it really shows is that we completely see the other side in stereotypes.
00:44:56.580 If you ask Democrats to estimate what percent of Republicans make more than $250,000 a year, they think more than a third of Republicans make more than $250,000.
00:45:06.580 I think the answer is more like 2%.
00:45:08.220 If you ask Republicans, what percent of Democrats are LGBTQ, you know, and they'll estimate more than a third of Democrats are LGBTQ.
00:45:16.500 The actual answer is 6%.
00:45:17.920 If you ask Democrats what, you know, to estimate what percent of Republicans do they believe, still believe racism is a problem in the United States, they think less than 25% of Republicans would believe that racism is still a problem.
00:45:30.420 The actual answer is something like 70%.
00:45:32.400 And so we're seeing ourselves with stereotypes.
00:45:35.280 And the second thing they found is the more you use social media, the worse you are at predicting what the other side believes, not the better.
00:45:42.460 Because the extreme voices on social media participate more often than the silent sort of, you know, calm, moderate majority, right?
00:45:51.500 Like the calm, moderate people, they don't actually say that much.
00:45:54.900 So that's really the problem that we're dealing with when we look at our, you know, our polarization ecosystem.
00:45:59.180 Wow. This is reminding me that when we closed out the year, we went to Christmas break.
00:46:03.980 The last piece I did was on Democrats.
00:46:07.920 And, you know, I have a lot of Republican listeners.
00:46:09.740 I have some Democrats, too, mostly people in the center.
00:46:12.440 But it was a reminder that, you know, the people who are trying to get everybody canceled and so on, they don't represent all of the left.
00:46:19.180 And that it's not, quote, the left that is the enemy of reason.
00:46:21.960 It's like activists who are pushing agendas.
00:46:24.240 Because, yes, we can fight on that.
00:46:25.820 But remember, your neighbor, who's a Democrat, is not your enemy if you're a Republican.
00:46:30.160 And it's not necessarily against the things that you're against as well.
00:46:33.880 All right. Let me pause it there.
00:46:34.880 I'll squeeze in another ad and we'll come back.
00:46:36.380 My God, there's so much more I want to talk to you about.
00:46:39.160 Tristan Harris is here and we are lucky to have him.
00:46:41.260 Former Google design ethicist.
00:46:42.800 We've got to talk about that podcast.
00:46:44.560 Crazy interesting stuff.
00:46:45.980 Don't go away.
00:46:46.520 Let's spend a minute on your podcast, Your Undivided Attention.
00:46:57.260 This is the description that my team gave to me and I'm dying to know more.
00:47:01.860 Okay. Interviews experts in invisible aspects of human nature from casino designers to hypnotists,
00:47:10.640 ex-CIA propaganda experts, tech whistleblowers, researchers on cults, and on.
00:47:17.800 This is what's...
00:47:18.920 So I guarantee you there are people out there listening to this right now that are saying,
00:47:22.760 not me. I'm too smart.
00:47:24.720 I understand what a manipulation looks and feels like.
00:47:27.260 Yeah. Well, yeah. With our podcast called Your Undivided Attention, what we're really
00:47:32.360 trying to do is just inoculate people from this manipulation.
00:47:36.780 And one of the best ways to do that is for people just to understand the truth behind
00:47:40.840 the people behind that piece of glass screen.
00:47:43.580 So we had Natasha Dow Shull who studied casino design.
00:47:46.780 She wrote a 700 page book on how casinos are designed.
00:47:49.740 So for example, the classic example is your phone.
00:47:52.200 It's like a slot machine.
00:47:53.020 Every time you scroll your phone or you pull down to refresh to see, did I get some new email?
00:47:57.680 Just like a rat seeking a pellet, you're playing that slot machine to see if I got something new.
00:48:02.700 So we interview casino experts, people who study attention spans, what's happening to the inner
00:48:06.640 workings of our attention, effects on children, hypnotists, all these sort of invisible access.
00:48:12.500 We had a Rene Diresta who actually was one of the two teams given access to the data sets on what
00:48:18.440 Russia and China are doing in sort of social media, which by the way, is really one of the
00:48:22.560 biggest concerns that I actually have about this.
00:48:26.280 That's more subtle is that I think social media is any of these platforms specifically.
00:48:31.880 So TikTok, Facebook, Twitter, et cetera.
00:48:34.240 It's it breaks our democracy.
00:48:36.560 It's actually incompatible with our democracy in a more fundamental way.
00:48:39.340 And in the competition with China and these digital authoritarian societies, I don't want to go
00:48:44.820 that direction. But I worry that we put we put this brain implant in our democracy called social
00:48:49.760 media and it rewires our collective psyche so that each neuron maximally influences every other
00:48:56.580 neuron, right? Because it wants each of us to reach as many people as possible. That's what keeps each
00:49:00.400 of us coming back. We're addicted to influencing so many other people. But if you think about what
00:49:05.160 would happen in a brain if I took each neuron and maximally fired every other neuron, you'd get kind
00:49:10.260 of like a social seizure attack, right? And when I look at our country right now and I look at how
00:49:15.960 it's just this cacophony of anger and confirmation bias and we're so right and we just have to escalate
00:49:22.580 that conflict, it's like we're foaming at the mouth having a seizure as a country while China is
00:49:29.940 actually employing the full suite of all these technologies to make a stronger authoritarian society.
00:49:34.440 We can notice that democracies are not employing the full suite of all these new technologies to make
00:49:39.540 a stronger democracy. Instead, we've allowed the business model of maximizing attention for
00:49:45.220 enragement and making us angry at each other to sort of actually collapse our capacity as a democracy
00:49:51.020 to agree on anything, to recognize that we have much more in common with our fellow countrymen and
00:49:55.600 women, and that there's actually real challenges we have to face. Meanwhile, China is gerrymandering
00:50:00.440 Africa, getting access to supply chains, doing foreign policy. I should also talk about what they're
00:50:05.660 doing with regard to their tech platforms. I was actually meeting- Their tech moves lately,
00:50:09.480 you tell me, Tristan, when I read them in the news, I'm like, well, that's very China, right? To sort of
00:50:12.780 the big hand of government now controls it. But I was also like, yay, China, for the first time in my life.
00:50:17.160 I was like, you know what? Maybe we should consider the Chinese way.
00:50:22.300 Yeah. Well, so I was meeting with a senator who's deep in the foreign policy world, and he was meeting
00:50:27.820 with his counterpart in the EU who said, you know, who does China consider to be the largest threat
00:50:34.860 to its national security? Who's its biggest geopolitical rival? And of course, you would say
00:50:39.240 the United States, right? You would think that's the answer. They said, no, they consider their own
00:50:45.320 technology companies to be the biggest rival to the CCP, the Chinese Communist Party's power.
00:50:50.640 Now, why is that? Because the technology that runs their society is really the new source of power,
00:50:57.240 right? It's controlling what kids are feeling, thinking, and believing. It controls their identity,
00:51:01.440 their educational development. It controls loans that get made, Jack Ma, Alibaba. So they're going
00:51:06.980 after their billionaires. They're doing all these things, but they're really realizing that technology
00:51:11.040 is the power structure. It's the brain implant that is guiding their society. Now, I'm not trying to
00:51:16.980 idealize it now, but here's a couple of things that you were mentioning that they're doing to
00:51:20.500 deal with the problems of the social dilemma. So let me give you a couple examples. One of the
00:51:25.280 things they do is on TikTok, their version of it called Doyin, when you're scrolling TikTok,
00:51:30.780 if you're under the age of 14, you can only use it until 10 p.m. at night, and then it's closing
00:51:36.040 hours. It opens again at six in the morning. They actually limit you to 40 minutes a day.
00:51:41.260 And when you scroll, instead of showing you videos of the best influencers, they show you science
00:51:45.420 experiments, museum exhibits, patriotism videos, because they realize that TikTok is conditioning
00:51:50.920 kids' behavior. And now I'm not saying that we should be doing Pledge of Allegiance videos
00:51:56.160 to the United States on our version of that. But what we have to also see is that China is controlling
00:52:02.200 their number one adversary's children's TV programming education. I mean, imagine in the Cold War,
00:52:08.300 the Soviet Union controlled Saturday morning cartoons for its number one geopolitical adversary.
00:52:12.700 You know, I actually talk to people in our defense and national security apparatus quite a bit
00:52:17.280 these days. My concern is that our generals and our heads of the Department of Defense know
00:52:21.800 everything about hypersonic missiles and drones and the latest tech, you know, physical advances in
00:52:28.100 warfare. But how much do they know about TikTok and how their own children are being influenced on
00:52:33.200 TikTok? And I'll give you a concrete example. A TikTok insider told me this. He says, the thing that
00:52:38.100 people don't realize is that TikTok is an alternate incentive system to capitalism. Instead of paying
00:52:43.760 you in money, I can pay you in likes, followers, and attention. I can give you a sense of boost of
00:52:49.560 all those things. So now let's say, and China is known to do this. They have a national security
00:52:53.980 strategy called borrowing mouths to speak. So I want to borrow those Western voices who say positive
00:52:59.720 things. Whatever anyone in the West says something positive about China and the Uyghurs are not a human
00:53:04.040 rights problem and it's all fine. China can just say, we're going to dial up those people. So they
00:53:08.460 get paid in more likes, more followers, and more views. Then other people on TikTok look at that and
00:53:14.980 say, well, why are those TikTok influencers so successful? And they start replicating their
00:53:19.000 behavior. So you're creating an alternative system of influence on top of your number one
00:53:25.260 geopolitical adversary. And you're being able to adjust those dials anytime you want. And you don't
00:53:30.940 even have to get them to trust the Chinese Communist Party's voices. You can take Western
00:53:36.160 voices who happen to be pro-China for whatever reason, and just make them the ones that are heard
00:53:40.180 the most, right? And my colleague, Rene DiResta, calls this amplifaganda. It's not propaganda.
00:53:46.480 It's amplification propaganda. I'm taking your voices, but the ones that I want to hear. And
00:53:51.340 similarly, we know what Russia did, you know, and not just in our elections, but ongoingly,
00:53:55.220 is they take the most divisive voices, especially the ones that focus on race, on guns, on immigration,
00:54:00.120 these topics, and the ones who want to do civil war and secession movements and things like this.
00:54:05.220 And they amplify those voices because they want to amplify propaganda, amplifaganda,
00:54:09.400 the ones that are most divisive. There is a World War III information war that Marshall McLuhan
00:54:14.560 predicted in 1968 when he said, World War III is a global information war that will make no
00:54:20.500 distinction between civilian and military combatants, because now we are in that war, but we don't really
00:54:26.140 see it or feel it that way. And I've heard you talk about in the past, the difference between we have
00:54:29.800 these huge oceans on both sides that make us a global superpower. We have this physical,
00:54:34.800 you know, kinetic, asymmetric position, you know, compared to our adversaries. But those
00:54:39.320 huge oceans and borders go away in the digital world. You know, we have Patriot missiles to shoot
00:54:44.300 down a missile that, you know, a plane that comes in from Russia or China physically. But if they try to
00:54:48.860 fly an information bomb into our country, they're met with a white glove algorithm from Facebook or Twitter
00:54:54.180 or TikTok that says, yes, exactly which, you know, minority group would you like to target?
00:54:58.900 And a recent MIT tech review article said that actually at the top Facebook pages, 15 pages that
00:55:06.880 are for Christian Americans, all 15 of those Christian American pages are actually run by
00:55:12.680 Macedonian troll farms. Of the top 15 African American pages on Facebook, these are basically bots,
00:55:18.960 right? Of the top 15 African American pages, two thirds of those African American pages reaching
00:55:24.180 something like 80 million Americans a month are run by Macedonian troll farms. So we have to realize
00:55:29.860 that, again, we're not even really living in a real reality. The metaverse is a virtual reality. But
00:55:34.820 even within that virtual reality, it's a virtual representation of our fellow citizens. They're not
00:55:39.880 even our fellow citizens.
00:55:41.180 That so that's and I just want to pause and underscore to the audience at home. This is something
00:55:45.180 different. This is something Russia did do. Okay, they did do this. This is not Russia gate stuff.
00:55:49.420 This is not like weird. This is totally different. Russia did do this. They used bots to amplify
00:55:54.540 disinformation. This is it. Does it come as news to anybody really that they're trying to
00:55:58.780 so you don't have to believe that this influenced the election to just know that what they're trying
00:56:02.760 to do is drive up division ongoingly. Right. And that's that is part of a, you know, a deep warfare
00:56:07.360 strategy. Right. Because we're falling over incoherently, constantly disagreeing with each other
00:56:11.960 and then forced to see the more extreme perspectives our society. Well, these countries are not doing
00:56:17.160 that. They're not faced with that problem. And just to pick up on the other point you were making
00:56:21.100 about how they limit the children's access to tick tock and so on in China, because they came out with
00:56:25.500 a couple of sweeping reforms within the past few months along those lines, trying to stop the
00:56:29.800 children from spending all their time. And they're limiting the some of the time on the apps to just
00:56:33.660 the weekends. Yes. And they limit to 40 minutes a day and on and only on the weekends, Saturday and
00:56:40.780 Sunday. And tick tock, like you're saying, they also do only only 40 minutes a day. And like I said,
00:56:46.140 they have opening hours and closing hours. So at 10 p.m., it just shuts off. And the reason for
00:56:50.960 that, by the way, I thought my thought in reading about that was, OK, so great. We've unleashed these
00:56:55.440 unhealthy bombs on our children in their country, in our country, across the globe. But China has
00:57:00.560 actually stepped in to try to stop that bomb from doing too much damage on its own children,
00:57:04.420 whatever its motivations. They do not want a bunch of, you know, missing the frontal lobe children
00:57:09.560 to grow up addicted to technology, just when you need to play their game, needing to play. They want
00:57:13.800 their kids to be smart and to be the next generation's leaders and so on. Meanwhile, we left our kids
00:57:17.780 twisting on the vine. There's there's no attempt over here at all, as far as I can see, by big tech
00:57:25.040 to protect our children in any way. In fact, the more addicted, the better.
00:57:31.100 Exactly. Exactly. And, you know, this this perhaps I think is one of also the major issues that in our
00:57:36.380 country we can actually agree on. Right. I mean, who wants our children systematically
00:57:40.680 warped and deranged with comeback emails that like a digital drug lord, when you stop using,
00:57:45.400 I figure out how to more aggressively get you to come back. And, you know, Francis Haugen,
00:57:49.340 the Facebook whistleblower, you know, people point to her credibility. It's not her credibility
00:57:53.340 that matters. She was just leaking Facebook's own research where it actually had their own
00:57:58.500 documents. And they found that 13 percent for among teens who reported suicidal thoughts,
00:58:03.300 13 percent of British users and six percent of American users trace their desire to kill
00:58:08.320 themselves on Instagram. And they said that we make body image issues worse for one in three
00:58:13.920 teenage girls. I know personally some Instagram insiders who actually left the company after seeing
00:58:19.200 that research because they couldn't justify staying there, knowing that that's the case.
00:58:24.140 And but this is all obvious because the whole business model is designed around this kind of
00:58:29.020 predation on our kids. But again, I think what we need to do, Megan, is we instead of focusing on,
00:58:33.480 you know, just these light reforms, like how do we make social media slightly more privacy protecting
00:58:38.100 or 10 percent less toxic by removing the anorexia thing? I worry that this is a competition of two
00:58:44.180 systems. We have democracy and we have authoritarianism and authoritarianism. That model,
00:58:51.040 they're using the full suite of technologies to make a kind of super authoritarian, stronger sense
00:58:56.800 making environment. They have many problems. I don't I don't admire it. I don't want it to be
00:59:00.300 the future. But meanwhile, we can notice that our democracy is not employing all these technologies
00:59:05.920 to say, how would we make democracy even better? How do we do even more consensus based decision
00:59:10.240 making? How do we invite people? There's actually a model of this in Taiwan, where when when instead
00:59:15.660 of posting on social media, when you hate something about, say, the tax system or potholes or
00:59:19.640 masks, when you post about something that you say, I want to fix in our democracy, instead of that
00:59:25.080 just turning into a long comment thread, but then get shared more virally in the more clever,
00:59:29.360 stubborn thing you can say, the more attention you get in their system. When you say, I want to fix the
00:59:35.020 tax, the tax system has a problem. You get invited into a zoom call, a stakeholder group that actually
00:59:40.260 talks about how you would improve it. And you actually get other citizens and you're actually
00:59:44.020 designing the improvement to that system. And then that's taken to the digital minister
00:59:47.920 to actually implement. Well, we could have a whole basis of technology that's about strengthening our
00:59:53.040 democracy. And that's my concern about what we need to do. We don't need 5% less toxic social media.
00:59:57.760 We need to reinvigorate the values of the Declaration of Independence for a 21st century
01:00:03.080 age. So we're not antagonistic technology to technology. We're using it to make a stronger
01:00:07.280 democracy. My gosh, it just makes me think I don't really like to crack down on alleged hate speech
01:00:12.340 or what have you. And I don't like big tech censorship. And I never thought Mark Zuckerberg
01:00:17.380 should have been pulled into that. He was originally like, it's not my job to police the internet and
01:00:21.580 conversations having it. And I was like, right on. That's the American way. But then he did submit and so
01:00:26.580 on. But we're focused on the wrong stuff. That's that is not the problem of big tech. I mean,
01:00:31.700 it's irritating, but it's not the problem. Their sins are so much more nefarious and ingrained and
01:00:38.360 deep and part of the business model than all that stuff, which is a noise distraction.
01:00:43.900 Exactly. And in fact, Facebook, after Frances Haugen came out, we actually now know from Wall Street
01:00:48.820 Journal reporting, they were consciously trying to frame. So she released all this research about how much
01:00:54.140 it's dividing and polarizing us and hurting kids. And then Facebook actually used their PR department
01:00:58.980 to sow stories, saying that this was all about censorship, that what Frances wants is censorship.
01:01:04.760 Whenever they talk about censorship, they do that because they know it just creates more division
01:01:08.900 because the conversation about free speech or censorship will never resolve. It's the same
01:01:12.840 800 page, just like law textbook conversation. Everyone brings up the same examples and it never
01:01:17.780 yields any results. It's not about freedom of speech. We all want that. In fact, we should have
01:01:22.400 less censorship on that. What we need is we have to be careful about reach. We've decoupled power and
01:01:29.200 reach from responsibility. Typically in the past, the greater the broadcasting capacity you would have,
01:01:34.560 the more responsibility you would have because you're reaching a large number of people.
01:01:38.980 Now we have a single TikTok influencer. There's actually an example in China. In one day,
01:01:44.800 because you're reaching a billion people, you can actually create a billion dollars of sales.
01:01:49.700 There's actually an article in MIT Tech Review, I believe it is, that a single individual in China
01:01:54.560 in one day generated a billion dollars in sales. Because when you say something and you reach a
01:01:58.740 billion people, this could be a 15-year-old or a 16-year-old.
01:02:01.860 Or a Kardashian.
01:02:03.180 Or a Kardashian, right? But instead of the Kardashians where it was only a few people in
01:02:06.860 the past, in the 20th century, we had like a few big celebrities that could do it. We're moving to a
01:02:11.260 world where each tech company wants each of us to be a Kardashian. They want each of your kids
01:02:16.160 to be the influencer. They want that to be the model for what being human and being a kid is
01:02:20.860 about. And when they do that, notice that has an effect on the other kids. The other kids say,
01:02:24.780 well, they're way more popular and successful and getting the attention than I want that attention.
01:02:29.340 And they're transforming the cultural basis for what our kids even want. And again, you zoom out and
01:02:34.880 you say, you know, China's playing chess and we're allowing these business models to collapse our
01:02:40.740 ability to think and act well in the 21st century. Because we've got a lot of problems that we have to
01:02:44.580 figure out. I know that, you know, Frances, she came on your podcast, right? The whistleblower
01:02:50.320 from Facebook. Yes. Yes. And I really recommend people check that out because I think, yeah,
01:02:55.360 there's a long history. Go on. Yeah. I mean, she's well worth listening to it. She, I guess,
01:02:59.780 said that working at Facebook was one of the most important jobs in the world. She hopes that her
01:03:03.400 papers don't discourage people from working there. Because the tech changes that we're talking about
01:03:10.080 have to happen from within. It can't be, I don't know that, can they be legislated? Can,
01:03:15.120 or does there just have to be a genuine will by the tech companies to have more Tristan Harris's and
01:03:19.800 Frances Haugen's working there? You know, you're bringing up such an excellent question, which is
01:03:25.540 how are we going to make these companies accountable? We can look to the market so you
01:03:29.700 could compete with them, right? You could start a new social network, but the problem is they have a
01:03:33.980 monopoly on the network effect. The reason we don't see other social networks succeeding is because
01:03:38.340 they have already owns the means by which people reach all of their other friends in a network.
01:03:42.760 And so networks are really hard to compete against. So we can't use the market mechanism.
01:03:46.740 You also have to go lower in the race to the bottom of the brainstem. You have to make it
01:03:49.540 more addictive, more engaging, more polarizing. So if you use market mechanisms, it's A, it's hard.
01:03:55.060 And B, you end up creating usually something worse. Like TikTok is competing with Facebook,
01:03:58.320 but it's creating something that has all these even more negative effects. The other way is with,
01:04:03.440 let's say, culture. We could do a user boycott. We could do, let's not use it,
01:04:08.160 but notice that that doesn't really work because they've owned the mechanism by which we can
01:04:11.860 communicate with other people. So boycotts don't really work. Also, Facebook has a project called
01:04:17.160 Project Amplify where they can actually just turn up the dial, just like China can turn up the dials
01:04:21.620 on positive voices about China. Facebook has a project called Project Amplify where they can turn
01:04:26.680 up the dials on things that people, when people like something about Facebook, like it helps them find
01:04:31.480 their lost horse, like a horse owner loses their horse. They found it because they found someone else on
01:04:35.260 Facebook. Whenever that happens, they can dial up positive stories about Facebook. So Facebook can
01:04:39.920 control what people are feeling and thinking about Facebook. So the culture mechanism, boycotts,
01:04:45.720 public sentiment, they control that. Then we go to states and regulation, which is what you're now
01:04:50.180 talking about. We could use that mechanism, but then we know that Facebook can actually divide the
01:04:54.480 population about any kind of proposed regulation. And as you know, privacy section 230 reform, it's not
01:05:00.800 enough to deal with it. And then lastly, as advertisers, we could take all those advertisers
01:05:05.260 that have the powerful pressure that they're the financing of these whole systems. And we could try
01:05:09.120 to say, we're going to do an advertiser boycott. But the reason that doesn't work, Megan, is that
01:05:13.880 there's not, unlike other industries where there's like an 80-20 rule, there's a small number of
01:05:17.720 advertisers that make up the massive billion dollars of your revenue. Because Facebook makes money from
01:05:22.860 these millions and millions of small businesses and everything all around the world, you couldn't get enough
01:05:27.600 advertisers to pull out. And also, where would they go? They also have to use these platforms to reach
01:05:32.460 their customers. So this is the kind of quagmire situation that we're in. And the reason I bring
01:05:37.600 up national security is I think that national security is the way we need to see this. That much
01:05:42.360 like an EMP attack is, like an electromagnetic pulse attack, where it's like I blow up a bomb above
01:05:48.520 your city, but it kind of fries the electricity grid. So you get to keep your buildings, you get to
01:05:53.240 keep your people, you don't hurt anyone, but all of your infrastructure is fried. I think that these
01:05:58.040 social media companies are like a cultural EMP attack. They don't, you know, China and Russia
01:06:03.280 haven't, you know, killed anyone. They're not spilling any blood in the streets. They're not
01:06:07.480 bombing any infrastructure. But everything got fried. Our trust broke down. Our, you know, we don't
01:06:13.400 have trust in it. We can't do consensus. So it's a cultural EMP for democracy. I think we need a deeper,
01:06:19.160 more, you know, fundamental response that, and frankly, we don't have anyone in the political
01:06:23.480 stage who's proposing this right now. And this is why I'm really excited that we're talking. I think
01:06:27.200 that every one of your listeners and all of us need to see this as the challenge of our time.
01:06:31.800 Yeah. More and more, they're trying to introduce legislation here in the United States to try to
01:06:36.360 crack down. But to your point, does that, does that actually do it? Is that the right way in? Or do we
01:06:42.060 need the next generation of Tristan Harris's to be thinking about getting hired at these companies and
01:06:47.200 changing them from within and demanding more responsible tech and the stopping of the
01:06:53.000 exploitation of human vulnerabilities, as you said, when you were at Google?
01:07:02.320 So two years after the Social Dilemma was released, are the social media companies slowing
01:07:08.080 down? Did they dial back? Were they shamed out of some of their bad behavior?
01:07:11.240 There's a reason Facebook changed its name to Meta. When Mark Zuckerberg did that, I think
01:07:18.440 everybody was like, well, what the hell is that? Why is he changing? Everybody knows Facebook. What
01:07:21.540 the hell is Meta? Why is he doing that? Well, Meta ties into this thing called the Metaverse,
01:07:26.260 which he's invested in. He likes it. He wants it to become a thing. And he is not the only one.
01:07:31.040 There was some massive deal today where it was Microsoft, I think, bought the company that created,
01:07:36.920 oh, gosh, the Grand Theft Auto. Yeah. Grand Theft Auto, right? $70 billion deal. And there was
01:07:44.720 there was some reporting that that's because they want to be well prepared to enter the Metaverse
01:07:49.580 through gaming. So what the hell is the Metaverse and why do we need this?
01:07:55.900 Yeah, well, you know, as you eloquently put, Megan, the main reason that I think they announced
01:08:02.040 that obviously it had been in the works for a while, but they wanted to distract attention
01:08:05.160 from Francis Haugen's whistleblower leaks about what how toxic the company really is for democracy,
01:08:11.560 for kids. And they did this, you know, I think just a few weeks after those releases happened.
01:08:18.060 They also need to excite investors. They need to say Facebook isn't just this product that,
01:08:22.200 you know, with the blue bar at the top and the infinite feed and the thing that's just
01:08:25.320 zombifying everybody that, you know, it's this exciting vision of where, you know, we're going to
01:08:30.440 be immersed in these virtual realities. A lot of people ask about the Metaverse. I,
01:08:34.860 I think that they've, they've already been in a race to create the Metaverse for a long time,
01:08:40.340 because you can think of it as we, their job, when you're harvesting attention, you make more,
01:08:45.440 their stock price goes up, the more time people spend, the more time people spend, the more it
01:08:51.100 becomes your reality. Like your reality actually is the technology feed. Right. And in fact, Megan,
01:08:56.400 actually, just to go back to the point on teen mental health, when you look at the stats on what
01:09:01.820 are called high depressive symptoms, like self-cutting, self-harm, teen suicide, you know,
01:09:06.120 when those numbers actually start to tick up in the graph, they tick up in 2009, 2010, it's like
01:09:11.000 kind of flat line. It's kind of flat. And then it goes up like an elbow and it goes up 2009, 2010.
01:09:15.660 What changed? We actually had social media before 2009, 2010, what happened, what changed in those
01:09:21.660 years is it went to mobile. It became something that your kids are now 24 seven immersed in. In other
01:09:27.740 words, when it gets full on to the brain helmet of like, I'm living in this reality, that's when it
01:09:33.580 becomes more toxic because it actually is the fundamental way that you say, what's going on
01:09:38.480 in the world? Are things peaceful? Do I like my neighbor? Or is it just this like, you know,
01:09:42.240 infinite polarization, close to civil war thing, that feeling we, we, we've forgotten how to escape it
01:09:47.320 because we're so immersed in these technologies. So they've been racing to create and own this virtual
01:09:53.400 reality creating environment for a long time. The metaverse is just this extreme, more extreme
01:09:58.220 version of that at this virtual world. And we're going to see these other companies, you know,
01:10:01.980 compete with that. But if we haven't been able to get, and you know, as Francis Haugen said,
01:10:05.840 if we haven't been able to stick the landing on how our existing products could be good for people,
01:10:11.280 good for democracy, good for society. And instead of fixing the actual problems we have,
01:10:15.820 they're just jumping to this next thing. Yeah. That's, that's really going to take it next level.
01:10:20.240 Because what I heard on the daily, they did a podcast on it just today. And they were talking
01:10:25.240 about how you, you in the metaverse, your avatar, you know, you can make your avatar be as beautiful
01:10:30.560 or not as you want. You, you will actually be in the position where you could potentially be paying
01:10:35.700 Nike for a pair of sneakers to wear in the metaverse. So you buy, you buy actual sneakers you wear in
01:10:42.980 your, when you're in your real body walking around the real earth, but then you have to pay these
01:10:46.640 companies for your snazzy outfit in the metaverse where you're jetting off to Paris for a lunch.
01:10:51.640 Meanwhile, you're really just sitting in your damn basement, not actually interacting with a real
01:10:55.060 live human being face to face. That's right. And you know, I think that, um, you know, there's this
01:11:01.540 principle when you're developing systems that depend on another system. If you build a virtual reality,
01:11:06.180 that's not caring for the real reality underneath, you're just sort of building a disassociation
01:11:12.700 layer. That's going to not care for the underlying substrate. Like if our global economic system
01:11:17.780 isn't also making sure it's protecting the environment, clean air, et cetera, then we're,
01:11:22.720 it's not really a humane system. If you're creating a virtual reality that depends upon people's
01:11:27.300 embodied experiences, depends upon people being healthy, depends on people having real relationships.
01:11:31.580 It depends on having a real democracy underneath the virtual democracy you create. If you're creating
01:11:35.780 a system that's not caring for intending to the things that enable that system in the first place,
01:11:40.940 that's, that's not a humane system. And I think that's my worry, right? We're virtualizing
01:11:45.980 everything. And, you know, this is going to get a little bit spookier and spookier, Megan,
01:11:49.540 because there's these new technologies that are coming, um, where I can synthetically make
01:11:54.040 videos of people, of faces, of audio, and have people say, feel, or think anything. And I can also
01:11:59.380 do it with text. There's a technology called GPT-3 where I can actually just say, write me,
01:12:05.580 create me a, um, you know, a news article written in the, in the voice of Megan Kelly or Tristan Harris.
01:12:10.940 And it will create a, you know, an essay about why technology is bad written in my voice.
01:12:15.680 That sounds pretty close to things that I would say. And in the future, if we're worried about
01:12:19.820 misinformation now in the future, I can say, um, and I'm sorry, I'm not trying to just spook your
01:12:23.620 audience. I just, I think it's important for us to be inoculating ourselves for where these trends
01:12:26.900 are going that, um, you know, I can say in the future, write me an article about why the vaccine
01:12:32.280 is not safe using real charts, real graphs, real evidence, and write me a 200 page paper.
01:12:38.580 That'll take statisticians, you know, weeks to decode for months. And it can just flood the
01:12:44.040 internet with posting things about whether the vaccine is safe or not safe, no matter what it is
01:12:47.620 you believe, I can just flood the internet with things that'll take forever to sort of adjudicate.
01:12:51.980 And that's the world that we don't realize that the humane world fundamentally depends on recognizing
01:12:56.600 the limits and vulnerabilities of human nature. We have finite time. We have confirmation bias.
01:13:01.840 We are more likely to believe our tribe versus another tribe. We have these fundamental truths
01:13:06.640 about human nature and to be humane means to be designing for that. And that's, that's really
01:13:11.660 the thing where this choice point as a species, we're building technology that's, that's, that's
01:13:15.900 changing things faster than we're able to keep up with or understand those changes. We've already
01:13:19.300 gone through that in the whole conversation we've had. This is about understanding ourselves so we can
01:13:23.840 understand how to limit that technology to fit with how we really work and what would make us healthy
01:13:27.980 and strong. We've seen more governments try to crack down on it. I understand Italy's given it
01:13:33.900 a shot. Some more of our European counterparts are, are trying, um, their best to, to come down on these
01:13:41.040 tech firms when they get a little bit too big or a little bit too aggressive or a little bit too
01:13:44.940 programmy. You know, you're, you're not allowed to do the targeted ad stuff or legislation that would
01:13:50.080 make us have to consent, you know, make Facebook have to make us or Twitter or whoever have to
01:13:55.480 consent to them tracking us and our preferences and so on. We could opt in or opt out. Is, is that,
01:14:01.380 I mean, I imagine you would say at least that's a, that's a part of it, but what else needs to be
01:14:06.160 done? Yeah. You know, it's, it's so tricky. Um, I think we need a, um, you know, a Geneva
01:14:12.220 convention for the arms race. That is who can go lower into the human brainstem and human
01:14:16.700 vulnerabilities to get attention out of us. You know, like for example, just to make it clear,
01:14:20.760 you know, a, a Geneva convention on no more beautification filters, right? Cause right now
01:14:25.640 it's an arms race. If, if TikTok does the 5% without even you asking, just like beautifies
01:14:30.180 your kids' faces and that causes them to use it more. If Instagram doesn't match them at the 6%
01:14:36.020 beautification filter mark to outcompete them, it becomes an escalatory dynamic. If, if one of the
01:14:41.540 companies does these comeback emails, so it shows you your ex-boyfriend or your ex-girlfriend and the
01:14:45.760 other, and, and, and they're successful at getting people to come back. If the other ones don't do
01:14:49.900 that, they're going to get, you know, etched out. So I really, what we have are classic problems in
01:14:55.300 economics, right? We have arms races and tragedy of the commons. If I don't do it, the other guy will.
01:14:59.940 And this is where we need regulation to protect that. This is not about speech. It's not about
01:15:03.960 saying, do we like that guy's speech or not? Should we de-platform them or not? It's about making sure
01:15:08.820 that we're designing the attention commons to serve our society, to serve mental health, to serve, uh,
01:15:15.760 democracy, to enable us to be a better society again, in competition with real geopolitical actors
01:15:20.840 who are building a different kind of society in a different system. So yes, you know, it's great
01:15:25.140 when you have certain things like maybe banning micro-targeted advertising, there's some proposals
01:15:29.180 like the Honest Ads Act, things like that. Francis Haugen has proposed, you know, this is also an
01:15:34.100 international problem, publishing for every market, for every country, the top links that are getting
01:15:38.900 the most engagement and traffic above a million views so that every country with their own
01:15:43.520 investigative journalists can look at how is this rewarding the most engaging stuff? Because
01:15:47.700 you have especially these vulnerable countries where they're, you know, Philippines is going
01:15:51.160 completely off to full-on, you know, authoritarian kind of crazy town because it rewards the most
01:15:57.200 crazy stuff and there's no content monitor. There's very few content moderators for these
01:16:00.520 languages. Facebook wants to invest, like think about the U.S. as we're getting the best
01:16:04.140 of these experiences because they're, they're, they have so much pressure in the U.S. that they're
01:16:08.500 putting the most resources into protecting election integrity or, you know, whatever they
01:16:13.680 can. But if I'm Russia or China, I'll just throw my disinformation into Central America
01:16:17.940 and Haiti and say, hey guys, there's like a free border opening. You can just run across
01:16:22.040 the border so I can weaponize the rest of the countries that have less investment from Facebook
01:16:26.580 and less protection and start steering people in Spanish, right? So you get a sense of the global
01:16:32.440 nature of this problem and, and that yes, you know, we have certain efforts that we want to
01:16:36.400 celebrate when governments do take small actions if they're the right ones. But fortunately, I don't,
01:16:40.820 unfortunately, I don't think these are the comprehensive things that we need. We really
01:16:43.480 need a 21st century democracy protection program. And, and it's not the things like censorship or
01:16:48.840 free speech. It's really about how do we incentivize the right kinds of technology to get built.
01:16:53.460 So what about my, my listeners, my viewers now, like what, what can they do, right? What in their
01:16:58.240 own lives? Cause I, I know I read that you are quote, slightly obsessive about what equals time well
01:17:04.800 spent in your life. Right. And I love that. And I, and I saw that you did the tango. I'm like,
01:17:09.360 that's amazing, right? You're you as somebody who's lived this firsthand, understand the importance
01:17:13.720 of prioritizing life away from the device that I've gleaned, but walk us through practical advice
01:17:20.200 for the individual humans and what they can and cannot do. Yeah. You know, um, uh,
01:17:26.260 there's a bunch of things. I mean, the easiest thing you can hear this conversation, people say things
01:17:31.560 like, Oh, just delete your social media apps off your phone. You can still go to the website if you
01:17:35.240 really had to, but you could delete the social media apps off your phone. Now notice that when
01:17:41.060 I say that a person's receiving that information, they might take it in, but are you really about
01:17:45.040 to do it? No, think to yourself right now in this moment, am I really going to lose something?
01:17:49.540 If I actually just delete the app, the app itself off my phone, I can hold it down. It starts to
01:17:55.640 wait a hole. You can hit that delete button. You can actually do it. There's nothing they can do to
01:18:00.380 stop you from doing that. You can turn off notifications in general, turning off all
01:18:04.420 notifications. There's really very few things that are life worthy notifications that are time
01:18:09.080 well spent that genuinely notify you to most people don't change their notification settings.
01:18:13.400 I forgot the stat, but it's like almost no one goes into their settings and tries to tweak all
01:18:17.380 these things, right? It's a ton of work. This is also where Apple can do so much more, right? Apple is
01:18:22.060 kind of like a central bank or kind of a governor oversight body for the attention economy, and they can set
01:18:27.120 better defaults. So it's less noisy, less toxic. But there are things like this, you know, there's
01:18:32.260 an organization called Wait Until Eighth. That's about waiting for kids not to get a smartphone
01:18:36.480 until I think 13 years old. Recognize as much as, you know, obviously your kids, if they're using
01:18:41.360 social media already and all their friends on it, I just want to acknowledge and be compassionate to
01:18:45.800 how difficult that must be as a parent, because you can't tell your kid to not participate where their
01:18:50.740 friends are. Can you organize a group of parents or a school to say, can we get the kids from
01:18:55.980 chatting with each other, at least on Instagram, which is the text medium for them? Can we get
01:19:00.200 them to move to something like a big iMessage thread or a WhatsApp thread, or ideally not
01:19:05.060 WhatsApp, maybe Signal or something that's more, that doesn't have that incentive, right? Doing
01:19:09.340 FaceTime calls between kids, as opposed to sending beautified photos back and forth. I mean,
01:19:15.220 there's so many issues, Megan, we didn't cover cyber bullying, the way that, you know, nude photos
01:19:18.620 get shared between kids and the pressure that that puts. There's so many other aspects. But what we
01:19:22.900 really need is a great film, by the way, called Childhood 2.0 that I recommend as well. That's
01:19:27.040 really more about how kids are facing these dynamics. And we can all just advocate for a
01:19:33.120 better world, right? Recognizing that these systems are not built for us. We are the product,
01:19:36.940 not the customer. So long as their business model is selling, you know, atomized blocks of human
01:19:42.360 attention, just like trees are worth more as two by fours and lumber. So long as that's true,
01:19:47.160 we're going to cut down trees and turn them into two by fours. Well, we are worth more as dead
01:19:51.140 slabs of human behavior when we're addicted, outraged, polarized, narcissistic, anxious,
01:19:55.840 and not sleeping because that's profitable for these companies. Even the CEO of Netflix said,
01:20:01.300 our biggest competitor is sleep. You know, keeping you up till two in the morning,
01:20:04.400 three in the morning is more profitable than your kids going home at night. So just realizing
01:20:07.900 the asymmetry of power between what the technology companies are doing and what, you know, what we're
01:20:13.760 capable of and just honoring that. And then saying, how do we bring and restore power and agency
01:20:19.500 to ourselves? I have to say, I would be dishonest for me not to mention the cable news model right
01:20:25.920 now. Um, it's not dissimilar. I mean, one of the reasons I left cable news is because I was just
01:20:31.340 tired of being part of the outrage machine. It was every day, all day. Uh, it's not spoken out loud,
01:20:39.080 but it's obvious that they're looking to press people's buttons and make them upset and they're
01:20:43.220 experts at it. They know exactly how to do it. You know, I'm one of the reasons I left Fox and went to
01:20:48.780 NBC, a place where I didn't ultimately belong, um, was I was just desperate to get away from it.
01:20:54.060 I was very attracted to the idea of not doing politics every day and doing something that
01:20:57.900 felt better and was more Oprah esque and just like, you know, uplifted people as opposed to
01:21:03.680 outraged people. Um, but I'm at heart, a news woman and I had to get back to news, but this,
01:21:09.300 this is one of the reasons I like my current job is yes, there's, there's room for outrage. And
01:21:13.020 there are some things that really piss people off. And I understand that, but you, that can't be your,
01:21:17.260 your only diet in life. If you want to be a, a well-rounded, occasionally joyful human,
01:21:22.480 you have to have other meals. Exactly. Exactly. And I think just realizing like,
01:21:28.300 you know, look, I, I like all of us, you know, we have to look at the news and I do
01:21:32.740 dip into Twitter to try to figure out what's going on in the world. But I think just recognizing it
01:21:37.160 makes us feel awful because you're just seeing the things that are meant to enrage you. And I think
01:21:42.120 we just have to realize how, um, strange that kind of stimulus we've never before had a
01:21:47.220 supercomputer assemble everything that would maximally make you angry and then deliver this
01:21:52.060 like sort of, you know, lab generated political red meat to just plop on your plate every single
01:21:57.240 day. It just realizing this is not healthy for us. And we can choose to a very, really limit that.
01:22:03.340 And, and most, I mean, there's ways people can set up lists on Twitter. I mean, in general,
01:22:06.760 try not to use it, try to focus on what are long form news that you trust, who are people that are
01:22:10.780 doing a good job of steel manning the other position. Like, why did that, can I understand
01:22:14.880 and even say why the other side values what they value? There's a great quote by, um, John Perry
01:22:19.920 Barlow that never assumed that someone else's motives aren't as noble to them as yours are to
01:22:25.780 you. And I think we really need to honor each other. Like we are fellow countrymen and women
01:22:29.680 in a democracy. I really worry about civil war. I really think that this is the key to it. I think
01:22:34.460 we've got to all become conscious and, and step up. I also think that you, there are certain brokers
01:22:38.760 of anger who wear it on their sleeves, who you can tell they're angry and they want you to be angry
01:22:44.600 too. You know, I used to joke that that was the motto of New Yorkers. Welcome to New York. We're
01:22:48.900 angry and we want you to be angry too. Um, but you can make a smart choice in terms of, you know,
01:22:53.460 delivery and you can tell when someone's upset constantly and you tune into them because you
01:23:00.160 want to be upset, like make a different choice. You don't actually have to go that hardcore to stay up
01:23:04.860 with news and information. Cause this is the field in which you're, you know, you get attacked often
01:23:08.920 with the anger bombs. I guess I, my imagination tells me, cause I don't go on Tik TOK that that that's
01:23:14.240 more of a manipulation in a different way. They're not trying to make our kids angry. They're just trying
01:23:18.560 to make them addicts. That's right. It's really just about addicts and make them, make them
01:23:22.840 influencers. And yet their idea there is to make you addicted, not just to using Tik TOK, but addicted
01:23:29.020 to getting attention from other people to turn you into like an attention vampire that's never
01:23:34.660 satisfied and always wants attention from other people because that kind of human being, that kind
01:23:38.900 of kid is a way more profitable kind of kid than a kid who's self-satisfied, who's sovereign in their
01:23:44.120 identity, who's taking responsibility for themselves and who's developing healthy relationships
01:23:48.000 outside the screen. I mean, that's the basic thing, right? It's like life is better when we're
01:23:52.260 having dinner with each other and going hiking with each other and going on, you know, camping trips or
01:23:56.780 whatever it is that we love doing. But none of those things that I just mentioned are profitable
01:24:00.820 to Tik TOK. They don't make money when you go on hiking trips. They don't make money when you
01:24:04.220 start a language project. You don't have, you don't have kids yet, correct? I don't have kids yet. No.
01:24:08.860 Okay. So if you, if you have kids, would you let them do any social media? I mean, would you let
01:24:13.600 them do Tik TOK or Facebook in particular? I personally would really not. Um, and I, I know that we're
01:24:19.860 very far down that land. I think what I would tell people is just notice. And we said this in the
01:24:24.060 social dilemma. Most of the people I know in the tech industry do not let their own kids use social
01:24:29.440 media. That should tell you everything. The CEO of Lunchables Foods, Megan, did not let their own
01:24:34.300 kids eat Lunchable Foods. Um, and that was one of the most successful food product lines in the
01:24:38.740 country. It is frequently the case that when people aren't eating their own dog food, there's a real
01:24:43.200 problem, right? And by the way, that's simple ethical and moral standard. If we lived in a world where
01:24:47.440 the only technology we made was the ones that we happily endorse that our own children use for long
01:24:52.440 periods of time per day, think about how much better the world would be. I grew up in a world
01:24:56.760 where technology was empowering. I grew up learning how to program, how to make things, graphic design,
01:25:01.280 music, making music on technology. Those are things that if we're making them, we would say we would
01:25:05.840 want our kids to use. When you have a whole portion of an industry, that's the dominant one.
01:25:09.500 That's worth a trillion dollars that people don't want their own kids using. That's, I think that's
01:25:14.300 such an easy standard to apply for the world that we would want that would make our society stronger.
01:25:18.780 Sure. You have a sign I read, is it on your laptop that reads, do not open without intention.
01:25:26.800 So what is the intention that you keep in mind that we should keep in mind when flipping open our
01:25:32.200 laptops? You know, I just want to say, first of all, I am like every other human being. I'm a,
01:25:40.240 you know, I'm a meat suit with paleolithic emotions and I'm easily hacked. And, you know,
01:25:46.440 I care about what other people say about me. And if someone says I'm exaggerating, right?
01:25:49.920 Like we're just human. I think that first step of self-compassion. And yes, I do. I do have a
01:25:54.960 little sticker on my laptop that says, do not open without intention. It's a subtle way to try to
01:25:58.920 remind myself, why am I here? What do I want to do here? What's time well spent for me in my life?
01:26:05.020 And we're living at very interesting times. And I think we should each ask ourselves, you know,
01:26:08.720 what is time well spent for us? It's a way of asking what's a life well lived time well spent added up over
01:26:14.140 a lifetime is what is a life well lived? And I think we, we're each capable of asking ourselves
01:26:18.920 that question. And we have to just, you know, honor when we go off the rails, it's fine. Just,
01:26:22.740 just come back. It's just like a mindfulness exercise. You know, you notice that your attention
01:26:26.300 wanders and you just come right back. I think about, I say this to my audience too, but remember that
01:26:31.320 when it comes to tech, the laptop, the iPhone, news consumption, garbage in, garbage out, you know,
01:26:37.200 you're in control of what you expose yourself to in the same way you would protect your child,
01:26:41.420 ideally from what kind of movies he or she is going to watch at age six, you need to protect
01:26:46.400 yourself against sourcing that wants to mislead you, anger, you upset, you manipulate you. You
01:26:53.600 know, you have to be the parent to yourself. Tristan is sticking with us to take your calls. I'm very
01:26:58.980 excited about that. The lines are lighting up. So I'll squeeze in a quick break. We'll come back
01:27:02.980 and start talking to you. So our first caller is Janice from California. Janice, what's your
01:27:13.820 question? Well, I have one of my children that works for Facebook and we've gone from having a
01:27:21.700 relationship where I would speak to her on a regular basis throughout the week till I barely
01:27:26.360 hear from her anymore. And it's an argument because the philosophy that she has taken on is so much
01:27:33.700 different. And she made a comment to me last week that really sent chills up my spine. And that was
01:27:41.220 that the metaverse is real, that she can own a house in the metaverse that she can't own in the actual
01:27:48.280 world. And I find that to be chilling because at the end of the day, the metaverse isn't real.
01:27:55.060 It's virtual. It's virtual. It's the matrix, basically. And the real world is where we all
01:28:00.780 live and should live. And it has taken our children from living outdoors and enjoying life
01:28:06.960 to being sucked into this screen and living inside and having no life and very few physical contacts
01:28:16.420 with anybody. Are we so far gone that we can't fix that, I guess is the question.
01:28:22.700 Oh, my goodness. Thank you for that, Janice. My gosh, I'm sorry you're going through that. Go
01:28:26.020 ahead, Tristan.
01:28:27.300 Yeah. Thank you. Thank you, Janice. Yeah. I also know a lot of people who work at the company,
01:28:32.160 have worked at the company. Here's how I think about it. I think human beings are always tempted
01:28:40.700 by faster, better, more efficient, right? So when Uber makes it so that you can order a taxi and it
01:28:50.280 becomes faster, better, more efficiently, more reliably, why wouldn't you switch from a taxi to
01:28:55.620 an Uber? When Instacart makes it faster, better, cheaper, more efficient to do that, you're going
01:29:02.620 to switch. When your phone presents a 24-set... Think about it right now. Just forget the metaverse
01:29:07.840 even. Just our phone. If I'm sitting with someone and the conversation is starting to get boring in a
01:29:13.180 group conversation, the reality that I'm in isn't as sweet as the reality that I can taste by just
01:29:19.240 quickly pressing, pulling my slot machine and seeing what I'm going to get. I can instantly
01:29:23.080 access a sweeter feeling, a sweeter taste. What you're talking about with the metaverse at the
01:29:26.700 house is this... I think I have the same example, right? I can get a better self-image with my
01:29:32.660 beautification filter in my metaverse than I can by looking in the mirror and seeing that I haven't been
01:29:37.620 very healthy the last... I've been living in the metaverse. I can get a sweeter looking house that
01:29:43.720 I can virtually live in there than I might be able to afford in the real world. So I think what this
01:29:49.320 is presenting to us as a species is a choice to really recognize in ourselves just because
01:29:54.900 it might taste sweeter to go look at my phone and run away from my anxiety and run away from the kind
01:30:02.700 of boring conversation. It doesn't mean I actually can't... I don't have to do that. I can be aware of that
01:30:07.520 feeling and I can say, yes, but what do I actually really want to invest in here? And I can take a
01:30:11.940 breath. I can be present with someone. I can redirect the conversation. I think that that's
01:30:16.160 really the reckoning that we're in the middle of. Because as you said, we're right at this precipice.
01:30:20.020 I think I understand the feeling that your daughter who works at Facebook was mentioning.
01:30:27.100 But we have to make a choice. I think the good news is I think a lot of people feel very skeptical
01:30:31.460 and afraid of the metaverse as a vision for the future. I think that was most people's response to
01:30:36.280 Mark Zuckerberg's announcement in that video. So I think that's the good news. We're really
01:30:39.960 recognizing that.
01:30:41.660 Can I ask you something, Tristan? You mentioned a few times. I just want to circle back the slot
01:30:45.620 machine aspect of the iPhone. When I was training my dog, my friend who's a dog trainer said,
01:30:52.820 don't give the dog a treat every time he sits when you tell him to sit. Only give it sporadically
01:30:57.460 because it's more exciting for the dog if he doesn't know which time he's going to get the treat.
01:31:02.260 And she was saying that that's why the slot machines work so well. It's better. It's more
01:31:06.540 addictive. It works better on controlling behavior if it's up in the air. They know that.
01:31:11.960 Well, it's such an interesting example you mentioned because this is not a story that
01:31:16.080 I've really told publicly, I don't think, before. I know one of the very first designers of the
01:31:21.060 Facebook newsfeed. A lot of people don't remember this. Facebook didn't used to be just an infinitely
01:31:25.220 scrolling newsfeed. That was a design that they later moved to. At the very beginning, it was more like
01:31:29.480 a contact book, an address book. You'd be able to search for, click on friends and browse through
01:31:34.100 profiles and search for a friend. There wasn't an aggregated feed that said, here's everything
01:31:38.360 that changed since you've last been here. What she told me, the thing that transformed the use of
01:31:44.760 Facebook was making that feed infinitely scrolling. So it used to be you have to hit next or things like
01:31:51.060 that. And the second was that on the mouse, remember the mouses in the trackpad, they didn't
01:31:55.640 used to have double finger scrolling or like you could just scroll. It used to be you have to move
01:31:59.760 your mouse to the scroll bar, click on the scroll bar and then drag. People don't really remember
01:32:06.800 the details of this, but it used to work that way. On a current computer, typically you just take two
01:32:11.160 fingers, you put it on the thing and you just go like this, right? With your finger. That makes it so
01:32:15.780 your hand never has to leave its resting position. And then it hit me because I heard the designer of
01:32:20.700 slot machines say your hand never has to leave its resting position. Then they designed the slot
01:32:25.920 machine so that you just go click, click, click to get the next thing. The key thing that made the
01:32:30.900 Facebook feed so addictive and absorbing is that you're, they moved it to, you don't have to move
01:32:36.040 your hand. You just do this. And the same thing on the phone right now, obviously it's one, you know,
01:32:40.540 scroll, scroll, scroll. And that's the thing that makes it like a slot machine is you just,
01:32:44.700 you're just stuck in that one absorbing kind of experience. Um, so anyway, it was just fascinating
01:32:49.120 that she had told me this designer that that was a really key change in the way that Facebook
01:32:53.480 was designed. What are the other things they do in the casinos? I mean, we know that they don't put
01:32:57.220 clocks in them and that there's like, you can't get out. You can never find the exit. Um, but they
01:33:01.760 are masters of manipulators. Yeah. I mean, I think it's about designing for absorption. They want to
01:33:06.500 design for flow. They want the fewest number of interruptions. Actually, one of the things they do,
01:33:10.200 as I understand it from Natasha, um, uh, I think her book is called addiction by design,
01:33:14.600 uh, is when you start losing, like if you start losing money, uh, or you're about to leave,
01:33:20.240 you're about to walk away from the machine. I think like someone will come up to you and give
01:33:23.020 you a coupon. That's like, you know, get the free buffet at the, you know, so you stay in the
01:33:27.480 environment and it's very similar to Facebook, right? Like you're about to leave, you're about
01:33:30.560 to scroll away. And so it starts to email you more aggressively. Here's the ex-boyfriend,
01:33:34.820 here's the ex-girlfriend. And you're just designing for re-engagement, for flow, for hooking people.
01:33:39.820 And there are, by the way, there are books, there's whole conferences called
01:33:42.040 hooked that, that teach people these kinds of techniques. This is not like a fair situation.
01:33:46.760 And nowhere was it written that this is how technology should be, right? Again, in the 80s
01:33:50.980 and the 90s, when I was growing up, technology wasn't designed to just addict and manipulate
01:33:54.960 you. It was a creative tool. It was a bicycle for the mind. It can be that again. Um, and we're
01:34:00.200 trying to, you know, emphasize that vision for technology with, uh, with our work, but
01:34:04.080 obviously it's going to take a while to get there. One other question. Do you, do you limit the time
01:34:08.400 you can spend online when you, cause you, if you have to practice the tango and so on,
01:34:11.400 like, do you say, I haven't seen it in a while because of the pandemic, but, um,
01:34:17.220 but do you say like no more than 30 minutes or half an hour or an hour a day?
01:34:22.100 I try. I mean, I, listen, I, I, you know, one of the things people should know is your willpower,
01:34:26.880 um, you know, your ability to kind of really be aware and take responsibility and resist,
01:34:30.960 you know, the second marshmallow person versus the first marshmallow person. Um, uh, that ability
01:34:36.520 wanes, especially as you, uh, it gets late at night. So, um, you know, one of the things is
01:34:41.920 don't use social media right before bed. Two reasons. One, it'll make you depressed and ruin
01:34:46.320 your dreams. Second, um, your ability to sort of have your brain re-engage and wake up actually
01:34:52.620 diminishes late at night, as opposed to early in the morning, you have more willpower, right?
01:34:56.300 There's many studies on this. And so just to be aware of the battery of our own volition
01:35:01.460 and that free will is, does exist, but it's something we have to protect and our sovereignty,
01:35:06.440 our very ability to make free choices is the thing that's under assault, uh, in, in these
01:35:11.560 technologies and pointing a supercomputer at our brain and keeping us, uh, scrolling like the,
01:35:15.860 the rats in the searching for pellets. Right. You know, it's funny cause I had a woman on, um,
01:35:20.440 the other week who talked about the future and she's somebody who studies, you know, trends and
01:35:25.060 where we're going when it comes to the future. And one of the things she was saying was you're
01:35:28.260 going to, you're going to not walk around with a phone. Your shirt is going to have the abilities
01:35:32.400 that your phone has today. It's going to be able to Google things or tell you directions or even
01:35:38.660 potentially game. You'll have glasses on that, that can do all of that. You're not going to be
01:35:42.580 having to hold a device and it sounded very cool. And, and now I'm thinking it's sounding very toxic,
01:35:49.400 very scary and something again, like we don't want. Yeah. Well, I think that the difference is going
01:35:55.340 to be the immersing. So especially the visual system, so much of our brains are devoted to
01:35:59.260 vision. Right. And so, um, when information comes in visually, it really takes up the kind of full
01:36:04.040 space of our minds. One of the reasons I love podcasts is it's, it's, it let, I can be cleaning
01:36:08.880 the dishes. I can be going on a hike and I can listen to something. Right. So I actually, I'm excited
01:36:13.000 about the potential for, um, auditory technology, right. That's more blended into our lives, but is
01:36:17.660 more passive. It's like not taking over the full visual system. I want things to take over the visual
01:36:22.120 system when I'm doing something creative, if I'm making music, if I'm writing code, if I'm writing
01:36:26.060 an essay, but I don't want any of that addiction engagement complex, these just massive behavior
01:36:31.160 modification empires that treat us like the product and just want to suck it all out of us.
01:36:35.560 We would never want that to be built into these visual environments. And that's really the mistake
01:36:40.280 that we made. If we don't have these, these visual environments occupied with these horrible business
01:36:45.280 models, I'm sitting on a computer right now with you. This would be a fantastic machine. It's just
01:36:49.540 that it's those business models that really ruin those visual mediums.
01:36:53.300 I'm glad to hear you say that. Cause I love podcasts. We were just talking about this with
01:36:56.960 my producers about how you get these alarming notifications from your phone. Like your usage
01:37:01.140 was up 14%. You listened for nine hours. You're on your phone for hours, nine hours a day. Well,
01:37:05.600 in our case, a, we're in news and we program a news show, so it's going to be a lot, but,
01:37:09.460 but B, we all listen to a ton of podcasts. We get our news there and, and it's an important
01:37:14.680 source and it's a delightful way of getting your news.
01:37:17.140 Totally. And it provides that, that space for complexity and nuance. So we can really talk
01:37:22.260 about, you know, the issues aren't as black and white as simple. We're not fitting it into
01:37:25.320 just shitposting on each other because we get to really just talk and debate. Like, well,
01:37:28.560 why would that perspective be valid? Let's talk about it. What's, what's a, you know,
01:37:32.000 what would the other side say to that perspective? We can really work it out. I love podcasts as
01:37:35.480 well. I think podcasts are a humane technology.
01:37:37.640 Yeah. And you can go to so many different places. You know what I mean? I like your podcast sounds
01:37:41.360 amazing. I'm, I'm downloading this and becoming a fan today. Um, but I also think, you know,
01:37:46.600 I like, well, I like crime. That doesn't stress me out unless I'm, you know, directly related to
01:37:51.120 the people involved, but I do think it's interesting to the listening to crime investigate,
01:37:54.560 investigation techniques and so on. Somehow that's soothing to me because I'm an odd bird,
01:37:58.520 but I like that you can take a show like this and you can take a break and do, you know, do an
01:38:03.600 interview like this. Right. Or we did, Oh God, we've done a lot of feature interviews that I,
01:38:07.240 that just sort of take the focus off of the intensity of nasty news, politics, back and forth.
01:38:12.660 It's one of the beauties of technology, right? Cause we are advancing in a way that makes
01:38:17.180 some consumption more enjoyable and less toxic. Totally. Totally. If you do listen to our podcast,
01:38:23.960 there's one I would really recommend. And it's the one with Audrey Tang, the digital minister of
01:38:29.140 Taiwan. I know that sounds like a bizarre thing. Like why should we look to Taiwan for how they're
01:38:32.840 doing democracy? But it's, it's really inspiring what she's done there. And it's also inspiring
01:38:37.260 because the reason why China, China can broadcast to its own people, Hey, look how dysfunctional
01:38:41.820 democracy is a broadcast, you know, all the dynamics here and in Europe. But they, the reason that
01:38:46.940 Taiwan is so threatening to China beyond the fact that it's strategically important, they want it
01:38:51.480 is that it's an example of a really well-working democracy for people who look and talk just like
01:38:56.760 them, but are under a completely different governance model. So it's a really big threat
01:39:00.920 to China. And I think there's actually reasons why we should be interested in that Taiwan being a very
01:39:06.240 strong digital democracy that you know, we should dial up as much as we can because it shows that
01:39:11.340 there's a different model than the thing that they're projecting into the world. And that
01:39:14.440 interview is just a really good and inspiring interview. So.
01:39:16.500 Well, let's hope Audrey has the ability to continue broadcasting for the rest of her life
01:39:21.040 without any interference or any takeovers. So interesting. Tristan, thank you for being so brave,
01:39:26.820 for doing what you do, for calling our attention to all of this. You've done a huge public service.
01:39:32.080 I'm very grateful.
01:39:33.000 Really my pleasure to talk with you, Megan. Thank you so much for making time for your audience. Yeah.
01:39:37.260 Yeah. Let's do it again. Wow. He was amazing. And tomorrow's guest is amazing too. Do not forget
01:39:42.800 to download tomorrow's show because the one and only Goldie Hawn is going to be here. I love this
01:39:49.760 woman. I adore her like every normal person. And she is so much more interesting than you even
01:39:56.680 suspect. She's tough. She came up in Hollywood at a time when it was not so easy for very young,
01:40:03.140 beautiful gals like Goldie. And she's got some stories that will shock you, but it will make
01:40:10.620 you appreciate her grit. I mean, she never lost her joy, her sense of humor, her ability to laugh
01:40:16.760 and make us laugh despite a lot of crap that that industry, that disgusting industry that lectures
01:40:21.560 us all the time on how to be better people threw at her. So I think you're going to love her insider's
01:40:27.520 perspective, her push toward mindfulness because she spent a lifetime working on that too.
01:40:32.180 Her beautiful relationship with Kurt Russell. I've said before with the last time I interviewed
01:40:38.080 her, she had a great line about him. Of course, most of American women love Kurt Russell. And
01:40:43.040 she said, I said, how'd you make it work for so long? And she said, well, they say the grass is
01:40:48.140 always greener, but it never was for me. She's great. You'll hear her tomorrow. Don't miss that.
01:40:53.700 And until then, we'll see you soon. Thanks for listening to the Megan Kelly show. No BS,
01:41:00.340 no agenda and no fear.