Joe Biden held his second formal news conference in a year, and it didn't go well. In fact, it was one of the worst press conferences he has held in years, and a new piece by National Review's Charles C.W. Cooke explains why.
00:05:32.160His message seemed to me more like, what the American people really need is to see more of me.
00:05:37.760I need to get out there on the road and just do a better job of explaining how amazing I've been.
00:05:43.320And it's like, well, I mean, again, going back to Thiessen, his column yesterday read as follows, in part.
00:05:50.260Sorry, but you can't boast about your COVID strategy when 55 percent disapprove of it.
00:05:55.000You can't brag about your economic performance when 60 percent say it's been dismal.
00:05:58.920You can't crow about your foreign policy when 55 percent believe you're doing a terrible job as commander in chief.
00:06:03.540You can't talk about how you've united the country when a 49 percent plurality say you've done more to divide us.
00:06:08.660And you can't say you've had a great year in office when 63 percent say we're on the wrong track.
00:06:14.240And that same number, about two thirds of the American public, Charlie, according to I think it was the latest Suffolk poll, say he shouldn't run for a second term.
00:06:22.640So, I mean, can you really say it's just a comms problem?
00:06:26.440What he really needs to do is just be more persuasive about what he's done.
00:06:32.120We've talked about this on your show before.
00:06:34.280The problem is that Joe Biden's behavior as president has been at odds with how Joe Biden ran for president and why the American public hired him to be president.
00:06:47.160Joe Biden ran as almost the anti-Twitter candidate.
00:07:08.500Joe Biden expected until January 5th of last year to preside over divided government with a Republican Senate, a narrow Democratic House and, of course, a Democrat himself in the White House.
00:07:22.880And the moment he got to 50 seats, not 65, not 75, but just 50 seats, he brought out every single agenda item that the Democratic Party has wanted to institute for the last 10 years.
00:07:37.260This is a profound mistake because the American public, whether it should have or not, wanted a caretaker president.
00:07:44.240They wanted a president who wasn't Donald Trump and who would return the country to normality after both Trump and COVID and the economic fallout.
00:07:53.420Biden totally misinterpreted his mandate and he's still suffering from that.
00:07:58.500And again, there was no sign yesterday that he is going to change that.
00:08:04.640And until he changes that, he's going to get the same results.
00:08:08.320You know, I was talking to the BBC today, Charlie, back in your old home country, though you're an American now.
00:08:14.500And I was struck by the questioning, because it was very focused on how Trump and the tweets are gone and, you know, under Joe Biden things have gotten calmer.
00:08:23.820And isn't that a better thing for America?
00:08:25.460And they really look at him as not radical, as moderate.
00:08:31.740And, you know, no president could really bring the country together.
00:09:07.040But oddly enough, in the last week, Joe Biden has been more like Donald Trump than he would like to admit.
00:09:13.600On policy yesterday, he cast doubt on the legitimacy of elections twice.
00:09:19.680He was weak on the question of a Russian invasion of Ukraine, also twice.
00:09:28.660And he snapped at a reporter, Philip Wegmann, who asked him about comments that he had made about his voting rights agenda, so-called.
00:09:41.000Which, in turn, was the product of a speech he gave last week that was frankly outrageous, that was so outrageous that it was condemned by the likes of Mitt Romney, who tend not to raise their voice, tend not to indulge in hyperbole.
00:10:00.140And it was also acknowledged by his own party; Dick Durbin said that perhaps Biden had gone too far.
00:10:09.120What he did was to divide the country up into two groups of people.
00:10:13.220Good people, who agree with Joe Biden, and bad people, whom he likened to insurrectionists and segregationists and really some of the worst people from the darkest periods in American history.
00:10:25.740And that's not what Joe Biden said he was going to do as president.
00:10:29.760And Mitch McConnell on the Senate floor compared Biden's comments to his inaugural address unfavorably.
00:10:37.220Yeah, we actually have that, because, worse, Joe Biden tried to deny that he had made that comparison between people who oppose his voting rights bill and the George Wallaces of the world.
00:10:50.600And it's on camera. We all just heard it; it just happened. It's not like it happened two years ago. We just heard him say that earlier this week.
00:10:57.020So here is a soundbite showing his denial and then what he said earlier this week.
00:11:02.620You didn't call folks who would oppose those voting bills Bull Connor or George Wallace, but you said that they would be sort of in the same camp.
00:11:13.340No, I didn't say that. Look what I said. Go back and read what I said and tell me if you think I called anyone who voted on the side of the position taken by Bull Connor, that they were Bull Connor.
00:11:31.340And that is an interesting reading of English. I assume you got into journalism because you like to write.
00:11:37.340So I ask every elected official in America: how do you want to be remembered?
00:11:44.340At consequential moments in history, they present a choice.
00:11:49.380Do you want to be on the side of Dr. King or George Wallace?
00:11:54.780Do you want to be on the side of John Lewis or Bull Connor?
00:11:59.660Do you want to be on the side of Abraham Lincoln or Jefferson Davis?
00:12:03.280The indignation at having been called out for something he is on camera doing.
00:12:10.820And the incoherence. His answer was totally incoherent. He contradicted himself.
00:12:19.720And all Philip Wegmann had done, very politely, it must be said, is ask him a question that accurately characterized his previous remark.
00:12:30.080Look, we know why one would invoke Jefferson Davis or Bull Connor in a political speech.
00:12:38.760We know why people invoke Hitler in political speeches or slavery in political speeches.
00:12:46.620Once you've done it, you can't backpedal and say, well, I didn't mean it literally.
00:12:50.560It was clearly heard by a good number of Americans, and not all of them Republicans, certainly not all of them involved in politics, as a Manichean exercise in bullying, an attempt to cast bills that really aren't responding to much as the future of the country.
00:13:15.220And I think Biden should own that. If you want to engage in that sort of language, own it afterwards.
00:13:21.680But he's trying to have it both ways. And again, where's the reset?
00:13:24.780Mm hmm. He also tried to blame the Republicans for not getting more accomplished.
00:13:30.100You know, on the one hand, he accomplished more than anybody ever in the first year of his presidency.
00:13:34.200But on the other hand, the reason he hasn't accomplished more is those Republicans who he could never have anticipated would be this way, would be this determined to block his agenda.
00:13:45.840Meanwhile, my first thought on that, Charlie, was that's rich coming from a guy who heads up a party that called itself the resistance during Trump's presidency.
00:13:53.480They weren't working with Trump on anything. But secondly, the Republicans have worked with him on a couple of key items.
00:14:00.140You know, that's why he got his one point nine trillion dollar covid relief plan through.
00:14:05.520It's why he got his infrastructure bill through because they worked with him.
00:14:09.500And a lot of Republican voters didn't like the fact that the Republican lawmakers did that, but they did it.
00:14:14.560And his most recent defeats were caused not by Republicans, but by Democrats.
00:14:19.400Yeah. So, as you know, I am a staunch defender of the separation of powers, and I don't like the way that when we have a president of a different party than the Congress, that the press describes the Congress as obstructionist.
00:14:37.840I didn't like it when Trump was president. The Democrats were obstructionist. I don't like it.
00:14:43.000Now we have a Democratic president. The Republicans are obstructionist. Congress is in charge of legislation.
00:14:47.800There's nothing written in stone that Joe Biden should get any of his agenda.
00:14:53.200And so this framing, which we hear, especially from Democrats, I find irritating as a general rule.
00:14:59.840But as you point out, it was not just irritating, it was churlish because on November 15, which is two months ago, Joe Biden signed and heralded a bipartisan infrastructure bill that got 69 votes in the Senate and that was endorsed and shepherded through by Mitch McConnell.
00:15:21.400So to suggest that Mitch McConnell is not likely to do anything that would make Joe Biden look good is not only to falsify the record, but to hide under a blanket the most recent victory that Joe Biden himself trumpeted from the White House.
00:15:40.660Meanwhile, there is a bipartisan group of senators who are working on a reform to the Electoral Count Act, the very instrument that President Trump tried to use to steal the 2020 election.
00:15:55.700So that wasn't just an annoying framing that puffed up the role of the president in our system.
00:16:03.160It was factually wrong and it was ungrateful to boot.
00:16:05.860Hmm. You point out in your piece today at National Review, Joe Biden did inherit some challenges.
00:16:13.000No question. It wasn't as bad as when, you know, Trump was dealing with covid, because at that point it was brand new.
00:16:18.700We were trying to figure it out. We didn't know what it was. And Joe Biden also inherited vaccines.
00:16:22.600But it's not like we didn't have a covid problem when he took over and inflation was already starting to rear its ugly head.
00:16:29.940So there were some headwinds against him, though it must be said he was in a pretty good position at that point versus Trump when it came to the vaccines.
00:16:38.660But your point is what? That he inherited those challenges, and yet the American people, what, they don't understand that? They're holding it against him that he didn't overcome those challenges faster.
00:16:51.400What explains the dismal polling between 365 days ago and now?
00:16:57.080I think it's that Joe Biden made explicit promises that haven't been kept.
00:17:04.240Now, as a libertarian type, I wouldn't, if I were running for president, not that I'm allowed to, make promises of the sort that Biden did, because I don't believe that the president is the king.
00:17:16.220I don't believe that he's a pope. I don't think that he has some sort of spiritual control over the country and its economy and infectious diseases and so forth.
00:17:26.980But Joe Biden ran as if he did believe that. He said on television that he was going to shut down the virus.
00:17:35.780Now, whether or not he could do that, and I don't blame him for the persistence of covid any more than I blame Donald Trump for its arrival.
00:17:43.460He hasn't done that. He promised that he was going to restore the economy on a broad basis and that the middle class would be better off under him than it was under Trump.
00:17:55.800Again, I think the president has a limited ability to do that.
00:18:00.000But Biden made that promise and it hasn't happened.
00:18:03.700And when you do that, you create a hostage to fortune for which you have only yourself to blame.
00:18:09.420So, yes, there were many challenges when Joe Biden came in.
00:18:13.720And yes, he had it more difficult than did, say, you know, John F. Kennedy in 1960, although, of course, he had his own challenges.
00:18:23.300But people judge you based on what you promise you will do.
00:18:27.640And Joe Biden, contra his argument yesterday, has overpromised and underdelivered.
00:18:34.920What do you make of the fact that, yes, he was angry in response to that one question?
00:20:36.500Well, look, there is a, a broad, uh, prohibition on the, the remote diagnosis of political figures.
00:20:45.340And I think that's, that's a good thing, but one doesn't need to get into any sort of medical claims, uh, in order to evaluate the man as the speaker of English.
00:20:57.280And at times yesterday, it was not clear to me as a native speaker of English, what on earth he meant.
00:21:06.320Um, he, he is decreasingly able to express himself and communicate coherently and in a timely fashion.
00:21:14.920And I can't imagine that that's going to get better over the next few years.
00:21:19.820Um, we, we have the oldest president we've ever had. That does matter, especially in the modern era.
00:21:26.980If you look at people who go into that job, uh, and then you look at what they look like when they leave that job, they age far faster than anyone would want to.
00:21:39.840If it aged Barack Obama, if it aged George W. Bush, goodness knows how it's going to age Joe Biden.
00:21:48.220So by the time that he would run again, um, he's going to be like that plus another two and a half years of stress.
00:21:56.240And he's going to be, uh, 82, I believe.
00:22:01.460This is a real predicament for that party.
00:22:05.020Not least because the vice presidential candidate that they have chosen is even less popular and oddly enough, uh, often even less able to express herself in the English language as well, at least not without sounding as if she's late for a book report.
00:22:20.180So, uh, they have created a straitjacket for themselves that is going to be really difficult to resolve because they've really only got three choices, haven't they?
00:22:35.780You have a primary, but that primary would be conducted while Joe Biden and Kamala Harris are serving their term out.
00:22:43.780Um, and that would be brutal, I think, for the, uh, the Democratic Party, just as it was in, um, 1979-80, when Ted Kennedy challenged Jimmy Carter, and as it was in '92, when Pat Buchanan, uh, challenged George H.W. Bush, both of which, it should be noted, ended up with those presidents losing.
00:23:05.840And he did say last night that if he runs again, which he has said he will do, Kamala Harris will be his running mate, though there was a long pause, and that's given other people pause in deciding whether they believe that, since she seems even less likely to win than he does.
00:23:22.780Um, assuming they really are prepared to run an 82-year-old, uh, to run a second term.
00:23:28.300And could he even do it, given the fall in his poll numbers and the way things are going?
00:23:37.700And I encourage you to talk to your friend, Rich Lowry, to whom I sent a text today about a very funny exchange I heard on your podcast, The Editors.
00:23:47.000And I love Mad Dogs and Englishmen, too.
00:23:49.220Um, but there is a very funny moment between Charles Cooke and Jim Geraghty, I think it was last Friday's show, that my husband and I have been laughing about for a week.
00:23:59.020And I'll just leave it there as a tease.
00:24:00.980Don't forget to stay tuned now, because up next, Tristan Harris.
00:24:05.540Tristan is from the huge Netflix hit, The Social Dilemma.
00:24:09.400He was on the inside of Google for years and has ever since been demanding more ethics from big tech.
00:24:17.520An insider's view on what they're doing to us then, now, and in the future.
00:24:30.980The very popular Netflix documentary, The Social Dilemma, pulled back the curtain on the tech industry and the ways we all can become addicted to our phones, our social media, and just instant gratification.
00:24:43.280A very prominent player in that documentary is Tristan Harris.
00:24:47.060He has been called the closest thing Silicon Valley has to a conscience, a former design ethicist at Google who has since gone on a mission to raise awareness about, and against, the everyday devices that we have become addicted to.
00:25:12.180So I was just looking at your background, just to set it up.
00:25:15.540You're from the San Francisco Bay Area, as I understand, raised by a single mom, and very young when you started practicing magic, which would become relevant to what you're doing today.
00:25:28.800I love talking about being a kid and studying magic.
00:25:32.680Actually, my mom used to take me to a little magic shop in San Francisco growing up, and I was just fascinated that,
00:25:38.760independent of the age or education or PhD level of the person you're doing magic with, magic is about understanding the vulnerabilities that are universal to all human minds, right?
00:25:51.800And even sometimes if you know how the trick works, the psychology is so powerful that it still works anyway.
00:25:58.660And that really plays into how technology is designed, because when I was later at Stanford, and actually I was classmates with the founders of Instagram and many of the people who joined the early ranks of Facebook and Twitter and a lot of these companies, so I really know the culture and the people intimately.
00:26:13.100And we, many of us studied at a lab called the Stanford Persuasive Technology Lab, which is part of a whole space and discipline of persuasive technology.
00:26:23.720How do you design technology to persuade people's attitudes, beliefs, and behaviors?
00:26:29.620When I say that, I don't mean political persuasion.
00:26:31.700I mean things like, can I persuade someone to fill out a form?
00:26:35.120Can I persuade someone to tag their friend in a photo on Facebook?
00:26:38.780Can I persuade someone to add a filter to their, you know, to their photo on Instagram?
00:26:44.500And persuasive technology is a whole discipline that is at the root of changing, I think, how we see our relationship to technology. It's not just this mirror; you know, people often think, oh, there's all these problems with social media and polarization and addiction, but we're just holding up a mirror to society.
00:27:03.520Those are your extreme, you know, folks, and that's how people behave.
00:27:08.780But I think what that picture misses is that technology is actively persuading us and eliciting certain things from us, and those are design choices made by technology companies.
00:27:18.820So when I was later at Google, I became a design ethicist.
00:27:22.400They actually acquired a small company that I had started.
00:27:26.180And I became interested in: how do you ethically shape people's behavior when you know more about their minds than they might know about their own, and you're designing persuasive technology?
00:27:36.460What does it mean to be ethically persuasive?
00:27:40.200I tried to change Google from within for a few years, and I just saw the incentives that were fundamental to this industry about capturing human attention.
00:27:48.160I mean, how much have you paid for, you know, your Facebook account or your Twitter account in the last year?
00:27:59.800People think it's just their data, but they actually make money the more time you spend because you have to look at the ads.
00:28:03.800And the more time you spend, the higher their stock price, but there's only so much attention, so it becomes this race to the bottom of the brainstem.
00:28:11.460Who can go lower in the brainstem to elicit responses persuasively from you and get, you know, outcomes from you?
00:28:18.280So I think that's really the situation we find ourselves in, and I think that lens of persuasive technology and magic is critical to understanding what's really going on, with technology influencing us as opposed to us actively using it.
00:28:31.080Oh, it's fascinating because watching The Social Dilemma, my biggest takeaway was you are being manipulated, right?
00:28:39.720I mean, that's really the message of it.
00:28:41.760It's not totally your fault, it's not entirely your fault, if you have an addiction to your phone or your social media.
00:28:49.060There is culpability and intentionality on the side of big tech.
00:29:28.520I mean, people are really at home and they're looking at their kids and the kids are sucked into their phones, and they think that if they're addicted, that's their responsibility.
00:29:46.520Well, so like, let's say, you know, one of your kids watches The Social Dilemma and says, wow, I really don't want to get sucked into that anymore.
00:30:08.820There's a feature called user resurrection or comeback emails.
00:30:13.140So like a digital drug lord, it notices that you stopped using.
00:30:16.780And instead, if it was a neutral product and we were responsible for our own addictive behavior, then they wouldn't actively say, hey, user four, five, six, seven, eight, eight, two, five, seven, three.
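The "comeback email" mechanism described above boils down to a simple inactivity trigger; a hypothetical sketch, where the seven-day threshold and the message text are invented for illustration and not any product's real values:

```python
# Hypothetical sketch of an inactivity-triggered "comeback" message.
from datetime import datetime, timedelta

RESURRECTION_THRESHOLD = timedelta(days=7)  # assumed cutoff, invented

def comeback_email_due(last_active, now):
    """True once the user has been inactive long enough to 'resurrect'."""
    return now - last_active > RESURRECTION_THRESHOLD

# A user last seen on Jan 1 is "due" for a nudge by Jan 10.
if comeback_email_due(datetime(2022, 1, 1), now=datetime(2022, 1, 10)):
    print("Send: your friends posted new photos while you were away")
```

The point of the sketch is only that "user resurrection" is an automated rule keyed to your absence, not a neutral product waiting for you to return.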
00:31:02.120It's like you try to quit alcohol because you've become addicted to it.
00:31:06.700And yet somehow the people at Seagram's find a way to keep a bottle in your pocket, to uncap it, to have it spill a little on the table in front of you.
00:31:15.380I mean, it's like, of course, it makes it even harder for anybody who's got an addiction to get away from it.
00:31:20.360And worse than that is the entanglement.
00:31:23.080So actually, one of the things that I know we'll talk about later, Frances Haugen, who was the Facebook whistleblower behind the Facebook Files, she leaked, you know, thousands of documents of Facebook's own internal research.
00:31:33.920And one of the things, in Facebook's case, but really when we talk about Facebook or Instagram, you can apply it to all of them,
00:31:38.700you know, Twitter, TikTok, it's very similar across the entire social media industry, is that they actually know that kids get entangled.
00:31:48.300So, for example, Megyn, you know, you and I probably use texting as our primary way of talking to our friends, right?
00:31:54.220I'm assuming you open up your iPhone and you fire off the text.
00:31:57.620What parents don't realize is that for kids, a lot of kids, either in TikTok or Instagram, that's their primary messaging medium.
00:32:04.460That's where they message their friends.
00:32:05.820It's not just like the feed, it's also where you kind of message your friends.
00:32:10.040So if you say, hey, I don't want to get sucked into that addictive feed, they have bundled and entangled those two things together.
00:32:16.420And they don't want to separate them because, so to counter your example about alcohol, alcohol wasn't baked into a fundamental need of the way that you communicate, right?
00:32:25.120But imagine that the only place you could communicate is the place where they can put that alcohol and pour you a glass.
00:32:30.920And they always pour you a glass every single time you want to open your mouth and say something to someone else.
00:32:34.940And the companies know that parents are bad at giving their own kids advice about this, because they know that parents will say things like, oh, you know, honey, just stop using it.
00:32:44.120As if it's that simple. It's like telling you, Megyn, or me, don't text your friends, when they've entangled us.
00:32:50.220That's really where the abuse comes from.
00:32:52.500Well, it is a big problem, whether you're addicted or not, because you do.
00:32:56.160I mean, I would love to step away from my iPhone more, but I suffer from the same problem.
00:33:00.100I mean, that's how everyone communicates.
00:33:12.480Well, the iPhone, the cell phone, didn't even really exist until the early 1990s.
00:33:16.720I remember seeing somebody walk down the street with one in Chicago in 1995.
00:33:19.860She was having a conversation on the sidewalk, and I was being like, what a moron, who needs to have a conversation while they're walking from A to B?
00:33:29.020But we're so far away from that now, all of us.
00:33:32.580How can one exist without this device?
00:33:36.340Well, you know, so I run an organization called the Center for Humane Technology that's been trying to ask and answer these questions and at least point to a direction, which is really clear.
00:33:46.720This is not about vilifying all of technology or creating a moral panic and saying everything is going off the rails and we should stop using our iPhones or stop using technology overall.
00:34:02.780In fact, my co-founder, Aza Raskin, his father, Jef Raskin, actually started the Macintosh project at Apple.
00:34:08.940Back in those days, the idea of a computer is it's a bicycle for your mind.
00:34:13.220In the same way that a bicycle gives us leverage, letting us travel much farther than we could on foot, technology can be a bicycle for our creative powers, for our communication powers, for our science powers.
00:34:26.360But that's not the business model of these social media companies. I think history is going to look back and see them as a parasite, whose goal is to suck as much attention out of society as possible into these engagement machines that polarize us, that sort of make it so we can't have a conversation over Thanksgiving dinner, because they personalize these news feeds so that we each get different information.
00:34:54.280Even when we try to have a conversation, we can't do that.
00:34:58.020The key difference here is the business model.
00:35:00.600Notice if you do a FaceTime call to your son or your daughter, Apple doesn't make money the more time you use FaceTime.
00:35:08.320So when you stop using FaceTime, it doesn't aggressively message you.
00:35:11.280It doesn't put hearts and likes and comments floating all over the screen to keep you jazzed up and entangled.
00:35:17.340It doesn't do the beautification filters to plump up your lips or your eyes or your cheeks, which the TikToks and the Instagrams do.
00:35:23.200In fact, TikTok was recently found to apply a subtle beautification filter, on the order of 5 percent, without even asking users, even if you didn't turn it on actively. It's like mirror, mirror on the wall:
00:35:38.160the one who reflects back the most positive self-image is the one you're going to get addicted to.
00:35:42.680And so TikTok was invisibly doing that, plumping up, you know, kids' lips and, you know, eyebrows and all of that.
00:35:49.520And it has these really serious consequences that we saw in Frances Haugen's Facebook Files, including the fact that you have kids, you know, teenage girls, who will say, I'm worried I'll lose my boyfriend if I don't have the beautification filter on, because they've become accustomed to seeing me with that filter.
00:36:05.880And it creates an anchor of who we are, where people will only like the virtual us if we look different than who we actually are.
00:36:14.120And that's the perversion that comes from this business model, which again is separate from email or FaceTime or text messaging.
00:36:20.320Those things are fine because their business model is not maximizing attention.
00:36:23.120Wow. This is so chilling when you think about the creation now of the so-called metaverse.
00:36:30.020They're basically in the process of creating a new, more in-depth, more time-consuming universe online, which I don't totally understand, but they're trying to suck even more of our time into a world online.
00:36:43.180They want an alternate universe online that's even more involved and time-consuming than what we have today.
00:36:48.840We'll get into that much, much more when we squeeze in a quick break and more with Tristan right after it.
00:36:55.320Don't forget, folks, programming note, you can find The Megyn Kelly Show live on Sirius XM Triumph Channel 111 every weekday at noon east and the full video show and clips by subscribing to our YouTube channel, youtube.com slash Megyn Kelly.
00:37:45.940On the other side of the screen, it's almost as if they have this avatar, a voodoo doll-like model of us.
00:37:52.940All of the things we've ever done, all the clicks we've ever made, all the videos we've watched, all the likes, that all gets brought back into building a more and more accurate model.
00:38:01.460Once you have the model, you can predict the kinds of things that person will do, where they're going to go.
00:38:07.480I can predict what kind of videos will keep you watching.
00:38:10.100I can predict what kinds of emotions tend to trigger you.
00:38:13.380At a lot of these technology companies, there are three main goals.
00:38:16.440There's the engagement goal to drive up your usage to keep you scrolling.
00:38:20.180There's the growth goal to keep you coming back and inviting as many friends and getting them to invite more friends.
00:38:27.120And then there's the advertising goal to make sure that as all that's happening, we're making as much money as possible from advertising.
00:38:34.760Each of these goals is powered by algorithms whose job is to figure out what to show you to keep those numbers going up.
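The three goals just described, each driven by a number the algorithm tries to push up, can be sketched as a toy ranking loop. Every field name, weight, and candidate below is invented for illustration; this is not any company's actual system:

```python
# Illustrative "feed ranker": score candidate posts against the three
# goals described above (engagement, growth, advertising) and show the
# user whatever maximizes the combined score. All values are invented.

def score_post(post, weights=(0.5, 0.3, 0.2)):
    """Combine predicted engagement, growth, and ad value into one score."""
    w_eng, w_growth, w_ads = weights
    return (w_eng * post["predicted_watch_seconds"]
            + w_growth * post["predicted_invites"]
            + w_ads * post["predicted_ad_revenue"])

def rank_feed(candidates):
    """Highest combined score goes to the top of the feed."""
    return sorted(candidates, key=score_post, reverse=True)

candidates = [
    {"id": "cat_video", "predicted_watch_seconds": 30,
     "predicted_invites": 0.0, "predicted_ad_revenue": 0.01},
    {"id": "outrage_post", "predicted_watch_seconds": 90,
     "predicted_invites": 0.2, "predicted_ad_revenue": 0.05},
]
feed = rank_feed(candidates)
print(feed[0]["id"])  # the post predicted to hold attention longest wins
```

Note the design consequence: nothing in the objective cares whether the winning post is true or good for you, only that its predicted numbers are higher.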
00:39:08.920No, I think people looked at this metaphor.
00:39:10.840So in the film, The Social Dilemma, which I really recommend everyone watch, it was the second most popular documentary, I think, in Netflix history, and it won two Emmy Awards.
00:39:18.520And it really just lays this out in a way that I think everybody on all political sides can kind of understand as well.
00:39:24.560And what we talk about in the film, as you said, Megyn, is that, you know, behind the screen, you know, there's you.
00:39:43.600What they do is they fork it off to that supercomputer, which is that Pete Campbell character; that character is, like you said, you know, a character embodiment.
00:39:54.780It's just a computer and it's calculating a number.
00:39:56.780And it looks at every possible thing it could show you next, like within the space of things it could show you.
00:40:01.500It could show you something that will outrage you politically.
00:40:03.800It'll show you something about your ex-boyfriend or your ex-girlfriend, because that's what you clicked on last time.
00:40:09.640It can show you a live video because Facebook wants to, like, dial up that live video.
00:40:14.460It tries to calculate which thing would be most likely to keep you scrolling, because obviously it doesn't want to show you the thing that will stop you from scrolling.
00:40:22.240And it's a supercomputer pointed at your brain to figure out how to basically light up your nervous system.
00:40:27.680And the voodoo doll idea, one of the reasons we use that metaphor is that if I just talk about, hey, Megyn, you know, they have your data.
00:40:37.680If you think about it just as a person, like, there you are, you hear that phrase, they have my data.
00:40:41.040It doesn't feel like, what's the problem with that?
00:40:43.820But if I say, look, that data is being used to assemble a model of you, a more and more accurate model that can be used to predict things about you.
00:40:50.820And it gets more accurate the more information they have.
00:40:54.420So all the clicks you've ever made, that puts a little hair on the voodoo doll.
00:40:57.060So it's a little bit more accurate when I prick and try to figure out what would activate the voodoo doll.
00:41:01.040All the likes, all the watch time, and all the videos you've ever watched, that also makes the voodoo doll more accurate.
00:41:06.600It adds little shirts and pants to the voodoo doll.
00:41:08.720The point is that as that data gets more and more accurate over time, it looks at a hundred other people who saw those same political, you know, enragement videos that you've seen.
00:41:18.220And it says, well, for people just like you, this is the thing that tends to keep them scrolling, watching, clicking, commenting.
00:41:24.900Because all of that activity is engagement.
00:42:04.380And whether it's mRNA or masks or vaccines or, you know, no matter what it is, it finds the one that works on buckets of users just like you.
00:42:12.460And it knows that you're going to click before you know you're going to click.
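[The "people just like you" prediction can be sketched as a toy nearest-neighbor lookup: find the user whose click history overlaps yours most, then predict your next click from theirs. Purely illustrative, with invented data; not any platform's real model.]

```python
# Toy sketch of lookalike prediction: the more clicks the model has for
# you, the better it can match you to similar users and predict your
# next click before you make it. All names here are hypothetical.

def predict_next_click(my_clicks, other_users):
    """Find the user whose history overlaps mine most, and return the
    first thing they clicked that I haven't seen yet."""
    def overlap(user):
        return len(set(my_clicks) & set(user))
    lookalike = max(other_users, key=overlap)
    for item in lookalike:
        if item not in my_clicks:
            return item
    return None

me = ["enragement_video_1", "enragement_video_2"]
others = [
    ["cooking_tips", "travel_vlog"],
    ["enragement_video_1", "enragement_video_2", "enragement_video_3"],
]
print(predict_next_click(me, others))  # -> enragement_video_3
```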
00:42:26.360And he jokes that when his partner, Itzik, uses TikTok, it only took one or two clicks for TikTok to figure out exactly which rabbit hole to send him down.
00:42:36.280And that's the thing about all of us is it knows exactly what works.
00:42:39.600But the problem is what works on us isn't the same thing as what's good for society.
00:43:12.880So it just makes me a little less easy to manipulate in the information game, because you're definitely getting propaganda from both sides.
00:43:20.100But at least I mean, it's propaganda, but at least you're getting it from both sides.
00:43:23.500You're a little less easy to manipulate.
00:43:25.740So that's one step, definitely. I mean, I actually have not seen that specific feature from Twitter.
00:43:30.240It's obviously better for each of us to maintain more broad information diets.
00:43:34.580But the second problem, Megan, is that the business model is we think of it like a parallel system of incentives to capitalism.
00:43:41.640Instead of getting paid in money, you get paid in more likes, more views, more attention, more comments.
00:43:47.320And when you say something that basically outgroups the other side, saying, here's yet another example of why the other side is awful,
00:43:53.560We'll pay you more likes, more followers, because that was better for generating engagement for the machine.
00:43:59.860Now, no one at Twitter or Facebook has a big, long mustache that they're twirling, saying, gosh, how can we create the next civil war and, you know, drive this up as much as possible?
00:44:09.440But that's the inadvertent side effect of a machine that's values-blind.
00:44:13.620All it knows is what increases people's likes and followers and gets them to invite more people.
00:44:18.740And the problem is that those things tend to involve conflict.
00:44:21.340So even if you have a broad diet and you're looking at information from both sides, quote unquote information, what it really is, is basically people, you know, shitposting about the other side and building up the boogeyman.
00:44:32.160So whatever your boogeyman is for you, like, oh, they're doing, you know, this thing next in my hometown.
00:44:37.960Now you can sort of carry that to the worst next conclusion.
00:44:40.740You can find evidence for every stereotype.
00:44:43.040And in fact, one of the groups that we interviewed, we have a podcast called Your Undivided Attention.
00:44:47.700We interviewed Dan Vallone, who runs More in Common.
00:44:51.460And what it really shows is that we completely see the other side in stereotypes.
00:44:56.580If you ask Democrats to estimate what percent of Republicans make more than $250,000 a year, they think more than a third of Republicans make more than $250,000.
00:45:17.920If you ask Democrats what, you know, to estimate what percent of Republicans do they believe, still believe racism is a problem in the United States, they think less than 25% of Republicans would believe that racism is still a problem.
00:45:30.420The actual answer is something like 70%.
00:45:32.400And so we're seeing each other through stereotypes.
00:45:35.280And the second thing they found is the more you use social media, the worse you are at predicting what the other side believes, not the better.
00:45:42.460Because the extreme voices on social media participate more often than the silent sort of, you know, calm, moderate majority, right?
00:45:51.500Like the calm, moderate people, they don't actually say that much.
00:45:54.900So that's really the problem that we're dealing with when we look at our, you know, our polarization ecosystem.
00:45:59.180Wow. This is reminding me of when we closed out the year and went to Christmas break.
00:46:03.980The last piece I did was on Democrats.
00:46:07.920And, you know, I have a lot of Republican listeners.
00:46:09.740I have some Democrats, too, mostly people in the center.
00:46:12.440But it was a reminder that, you know, the people who are trying to get everybody canceled and so on, they don't represent all of the left.
00:46:19.180And that it's not, quote, the left that is the enemy of reason.
00:46:21.960It's like activists who are pushing agendas.