The Megyn Kelly Show - July 13, 2022


Biden's Inflation Crisis, and Elon Musk vs. the Bots, with David Sacks, Renee DiResta, and Todd Henderson | Ep. 356


Episode Stats

Length

1 hour and 35 minutes

Words per Minute

192.57

Word Count

18,358

Sentence Count

1,149

Misogynist Sentences

6

Hate Speech Sentences

15


Summary

Inflation hits a new 40-year high of 9.1%, but President Biden insists things are not as bad as they seem. Plus, Elon Musk is being sued by Twitter in Delaware, which wants to force him to complete the acquisition rather than just pay the billion-dollar breakup fee. We're joined by David Sacks, a venture capitalist who runs Craft Ventures and is co-host of the tech podcast All In.


Transcript

00:00:00.440 Welcome to The Megyn Kelly Show, your home for open, honest, and provocative conversations.
00:00:11.460 Hey everyone, I'm Megyn Kelly. Welcome to The Megyn Kelly Show.
00:00:14.940 Wow, oh wow, those inflation numbers.
00:00:17.900 They are worse than expected and they were expected to be terrible.
00:00:22.000 Hitting yet another 40-year high at 9.1%.
00:00:25.940 My God, that's just an eye roller.
00:00:30.400 Much of that is due to higher prices at the pump, at the grocery store, and at home.
00:00:35.800 The stock market is, of course, dropping on the news.
00:00:39.040 Meantime, President Biden just arrived in the Middle East where gas prices will certainly be on the agenda.
00:00:44.020 But in a statement, he's insisting, things are not as bad as they seem.
00:00:48.580 All is well. Remember Kevin Bacon in Animal House?
00:00:51.100 All is well. Remain calm. They're not as bad as they seem.
00:00:55.700 And is once again blaming, guess who? Putin.
00:00:59.160 Fair? We'll get to it.
00:01:01.680 Plus, breaking up may be hard to do.
00:01:04.240 Elon Musk has been sued officially now by Twitter in Delaware in an effort to force him to complete the deal.
00:01:10.880 They want that, quote, specific performance that he promised.
00:01:14.020 They don't just want their billion-dollar breakup fee.
00:01:16.400 They want him to buy Twitter.
00:01:17.740 And they're trying to force it.
00:01:19.700 We're going to be joined by a legal expert on that who's going to tell us whether Elon's likely to win that case or not.
00:01:27.780 And then a bit later, we're going to talk to an expert in bot behavior.
00:01:31.720 And we'll ask her what Twitter's bot situation looks like.
00:01:35.320 But first, we are joined today by David Sacks, a venture capitalist who runs Craft Ventures and is co-host of the tech podcast, All In.
00:01:47.580 David, welcome back to the show.
00:01:49.780 Hey, Megyn. Good to be here.
00:01:50.780 So, my God, 9.1.
00:01:53.480 And it really is shockingly high.
00:01:57.460 And Biden's out there already saying, don't believe your lying eyes.
00:02:00.600 It's really not that bad.
00:02:02.300 Things have gotten a lot better since those numbers were calculated over the past 30 days.
00:02:07.360 It's out of date.
00:02:08.800 Energy alone comprised nearly half of the monthly increase in inflation.
00:02:13.180 And this data doesn't reflect the full impact of nearly 30 days of decreases in gas prices, says the president, also pointing out that other commodities like wheat have fallen sharply since this report.
00:02:26.140 And then goes on, of course, to say other countries are suffering from inflation and battling, quote, this COVID-related challenge made worse by Putin's unconscionable aggression.
00:02:39.120 What do you make of it?
00:02:40.540 Well, you're right.
00:02:41.020 So, the expectation on the part of analysts was that this number, this inflation number, would come in at 8.8%.
00:02:47.180 Like you said, it was 9.1.
00:02:49.260 Last month, it was 8.6%.
00:02:51.020 So, the number still hasn't peaked.
00:02:53.020 I remember a couple of months ago, the belief was that inflation would have peaked by now because inflation is measured on a year-over-year basis.
00:03:02.160 And so, as you start to lap bigger and bigger numbers from last year, then you would expect the inflation rate to go down.
00:03:08.760 But that has not happened.
00:03:09.620 And we're still setting new highs each month in inflation.
00:03:13.900 And you're right that gas is the biggest culprit here.
00:03:17.920 However, groceries are also up 12% in the past year.
00:03:21.420 That's the biggest annual increase since 1979.
00:03:24.620 Chicken is up 19% in the past year.
00:03:26.860 That's the biggest increase ever.
00:03:28.580 Electricity is up 14%.
00:03:30.060 That's the biggest increase since 2006.
00:03:32.740 Rent is up about 6%.
00:03:34.300 That's the biggest increase since 1986.
00:03:36.380 So, it's not just gas.
00:03:38.180 It's a broad-based inflation problem.
00:03:40.780 And, yeah, I think we're still in the midst of dealing with it.
00:03:44.340 And the problem is that any wage growth we've seen, any employment numbers that look good on paper, all get dented.
00:03:53.760 They all get dinged up by inflation.
00:03:56.160 It's like, who cares if you get, you know, a 5% wage increase when your inflation rate on all of your groceries and so on is 9.1%?
00:04:05.240 That's right.
00:04:06.520 I mean, workers' real wages are not keeping up with the rate of inflation.
00:04:10.520 And so, they can really feel it when they go to the pump or buy groceries.
00:04:14.200 And I think this is going to be foremost on voters' minds in November.
00:04:17.740 Just to give the president his due, gas prices have come down about 15% to 20% over the past month.
00:04:23.340 So, if you were to measure inflation today in light of that decrease, it would be a little bit lower than this 9.1% number.
00:04:30.860 But it still wouldn't be good.
00:04:32.060 You're talking about 8%, roughly, inflation numbers.
00:04:36.000 I think that's what you can depend on between now and the November election.
00:04:40.080 So, the numbers just aren't going to get good enough, fast enough to help the administration.
00:04:44.560 I think that they've got a big problem here coming into November.
00:04:48.320 So, of course, you have Biden coming out today and saying, out of date, out of date.
00:04:53.220 But what we've been told, even in the face of the 8.8 number last month, is really not to believe our lying eyes.
00:05:00.240 In addition to the Putin price hike stuff that he keeps saying.
00:05:04.860 Here was the White House press secretary just a couple of days ago on how strong our economy is.
00:05:10.980 When you look at inflation, when we look at where we are economically, and we are in a strong, we are stronger economically than we have been in history.
00:05:21.480 When you look at the unemployment numbers at 3.6%, when you look at the jobs numbers, more than 8.7 million of new jobs created.
00:05:30.740 That is important.
00:05:32.700 Stronger than we've been in history, and citing the unemployment rate, which is, I mean, it's just such cherry picking and lacking context.
00:05:40.400 In any fair press, we'd have an immediate fact check.
00:05:43.280 But since it's the Biden White House, we won't.
00:05:46.460 Yeah, I mean, the unemployment rate, as it stands today, is low.
00:05:49.920 But the labor participation rate is also low.
00:05:52.200 We've got millions of people who haven't gone back to work, and that's not really counted in the unemployment number.
00:05:57.300 The other thing that the administration should be really worried about is the economy is slowing down really fast.
00:06:01.920 And this is a result of the rate increases that the Fed is now having to push in response to inflation.
00:06:08.740 So I think that we're very – you already saw in Q1, we had a negative growth rate for GDP.
00:06:17.760 In another couple of weeks, we'll get the growth rate for Q2.
00:06:21.060 If that number is negative, we will officially be in a recession.
00:06:25.120 But regardless of whether that number is technically negative or not, if you poll most Americans right now, most Americans already believe we're in a recession.
00:06:33.480 So they are feeling the pinch from this inflation.
00:06:36.920 I think that you're seeing companies slam on the brakes in response to the Fed rate increases.
00:06:44.160 And so the economy is definitely slowing down.
00:06:46.560 And I think there's a pretty good chance that if we're not in recession by the end of this month, we will be by the end of this year.
00:06:52.880 So it's a weird recession in a way because I remember the recession of 1991.
00:06:58.020 I was in college and, you know, the graduating class a year or two ahead of me, they were all struggling to find work.
00:07:05.760 You know, I mean, when you think of recession, you think about a tough job market.
00:07:09.560 As you point out, this is a low – this is not a tough job market.
00:07:13.480 You can find a job in this job market.
00:07:15.880 It's just the question of when you get your salary and you take it home, what can you buy?
00:07:20.440 And how does it compare to what you could have bought with the same number 12 months ago?
00:07:25.860 But this unemployment rate is very interesting and it's kind of frustrating.
00:07:30.800 I mean, I'll tell you, I've experienced it myself personally.
00:07:32.940 I've read accounts of other people who are going through the same thing.
00:07:35.740 We come to New Jersey during the summer months.
00:07:38.580 And you cannot go out to a restaurant here because there's no staff.
00:07:43.560 Like, all of these restaurant owners are begging for the college students, for anybody who can work, to apply for a chef or a cook job, for a waiter or waitressing job, for, you know, a bar back or a, you know, busboy type job.
00:07:58.860 They can't – I mentioned this guy before.
00:08:01.520 He's clearly a Republican here at the Jersey Shore.
00:08:04.500 I was kind of laughing because on the Upper West Side you have AOC action figures and Dr. Fauci superhero dolls.
00:08:10.700 And here on the Jersey Shore, where it's a little redder, you got this guy who posted in his – he runs like a mall, like a strip mall kind of place.
00:08:19.620 And this is what his sign reads.
00:08:22.260 Please be patient.
00:08:23.220 We are short-staffed.
00:08:24.640 Hopefully, the government will soon cease in their endeavor to enslave people through handouts and crush small businesses, hopefully, but don't hold your breath.
00:08:33.940 So this guy's basically saying you can't get staff.
00:08:37.240 And I think people are feeling this all over the country where you go and you can't get service because they just can't find employees.
00:08:42.520 So why – how can we both be in a recession and have a shortage of workers?
00:08:47.220 Well, we have a very low, by historical standards, labor participation rate.
00:08:53.260 And so a lot of workers have not gone back into the workforce.
00:08:57.000 And you could lay some of the blame for that at the $2 trillion – that last $2 trillion of stimulus that Biden passed last year, the American Rescue Plan, along straight party lines.
00:09:07.120 And the Republicans were accused of being cold-hearted when they pointed out that the stimmy checks and the super-extended unemployment insurance would encourage people not to go back to work or delay them from going back to work.
00:09:18.100 And so I think we're still seeing the residual effect of all of the stimulus money flowing through the economy.
00:09:24.800 There are – the technical unemployment rate is low, but there's a lot of unfilled jobs.
00:09:30.480 There's a lot of people not participating in the economy.
00:09:33.480 I think what you're going to see now, though, is that the unemployment rate is going to start to rise.
00:09:39.340 There's no question that the economy is slowing down.
00:09:42.120 And it's – I think most analysts now believe it's just a matter of time before we're in a recession.
00:09:47.000 So I think you are actually going to see a lot of increased joblessness claims over the next six months or so.
00:09:54.000 And you're going to start to see these things normalize and behave more like you'd expect.
00:09:58.900 Remember, the unemployment rate is really a lagging indicator of economic success.
00:10:03.900 And so, you know, it's still reflecting the economy we had last year.
00:10:07.840 I think that the number will change over the next year or so.
00:10:11.060 Yeah, because haven't these checks stopped?
00:10:12.660 I mean, when you say it's a residual effect, it's like people saved up.
00:10:15.400 They put their stimulus checks in the bank, and now they're still living on mom's couch, just, you know, enjoying the remnants of those checks.
00:10:22.820 I mean, they were big, and they were unnecessarily large and generous for a portion of the population.
00:10:29.600 But they weren't that big.
00:10:32.560 It's not like you can just retire at age 27 forever.
00:10:36.420 No, it's true.
00:10:37.040 I mean, it's a residual effect that's lagging.
00:10:39.740 I mean, people are using up those savings, but Americans still have, I think, quite a bit of excess savings stored up.
00:10:45.400 You also had a lot of things around eviction moratoriums, rent abatement, things like that. Most people's number one expense is their rent.
00:10:51.960 So if you don't have to pay rent, you can basically live off those stimmy checks for a lot longer.
00:10:57.800 Look, I don't think this is the predominant thing happening in the economy, but this is one variable at the margins that is creating that feeling that you described of not being able to staff up some of these service jobs.
00:11:11.500 I wonder about it, too, because it's like part of me wondered whether, you know, the teenagers who used to fill these jobs, like, where are they?
00:11:19.280 They didn't get any stimulus checks.
00:11:20.440 So where are they?
00:11:20.980 But part of it is I feel like there's so much pressure on young people today to like you have to be in 10 clubs and you have to be the president or the captain of four sports.
00:11:28.020 It's like they can't work.
00:11:30.060 They have to, they're doing stupid Model UN as if that's going to prepare them for life, you know, instead of doing the things that would probably cause a David Sacks to want to hire them, like working, shoveling fish guts for a summer and figuring out what it's like to get your fingernails dirty, you know, for a living.
00:11:45.440 Yeah, I mean, I haven't searched for anyone with fish gut experience, but I would value that, I guess, if someone's willing to do that.
00:11:55.860 Yes, me, too.
00:11:57.300 That's what I want my kids to do.
00:11:58.580 I definitely do not want model UN.
00:12:01.140 So what do you see happening now?
00:12:02.560 Because we're already in your industry tech.
00:12:05.000 We're seeing some scaling back in the employment.
00:12:07.980 So you say you think it's going to happen elsewhere.
00:12:10.200 Lyft just announced it's going to slow hiring.
00:12:12.720 Instacart said it's going to roll back growth.
00:12:14.860 Microsoft announced a small cutback on jobs.
00:12:16.800 Tesla revealed it's going to cut salaried staff by 10 percent.
00:12:19.300 Meta says it's going to reduce hiring over economic concerns.
00:12:22.100 And now today we got an announcement.
00:12:23.440 Google's going to slow hiring for the rest of this year.
00:12:27.100 And they say
00:12:30.000 they're heading for, quote, another rough patch.
00:12:33.280 So that's tech.
00:12:34.700 And number one, why is tech getting it so bad?
00:12:36.780 Why are they sort of the leader on all these rollbacks?
00:12:39.420 Well, startups in particular are kind of the canary in the coal mine.
00:12:42.160 So that's why I've been warning on your show for months now that we are seeing a slowdown.
00:12:46.800 What basically happened is that if you go back to November of last year, the stock market, specifically growth stocks, peaked.
00:12:55.160 That's when the Fed finally got serious about inflation, admitted it wasn't transitory, and started projecting a regime of interest rate increases.
00:13:04.660 Those expected rate hikes then caused growth stocks to go down.
00:13:08.660 And we've seen a huge decrease in the stock market.
00:13:11.700 I mean, really across the board.
00:13:12.760 But if you look in particular at the growth stocks and the new listings, the SPACs, the IPOs, they're down 60%, 70%, 80%.
00:13:19.340 So what happened is, over the last six months, venture investors obviously started noticing that the public markets, and the valuations set in the public markets, are the exit for all of us.
00:13:31.540 And so we realized that valuations were way off, and that caused a constriction in the amount of venture capital that was available.
00:13:39.480 That's been happening over the last six months.
00:13:41.640 And founders are seeing that it's hard to raise money.
00:13:44.840 The valuations are lower, so raising money is more dilutive.
00:13:47.720 In any event, all these forces basically cause startup founders to burn money more gradually.
00:13:54.760 They want to extend their runway.
00:13:55.880 So this is what I've been saying for several months now, that we've been in a slowdown.
00:14:01.400 And what I saw in all my board meetings going back several months is these companies were slamming on the brakes.
00:14:07.220 They were not hiring as quickly.
00:14:08.800 So again, the canaries in the coal mine are these startups.
00:14:11.700 But now it's spread to these big tech companies, and then it will spread to other kinds of companies.
00:14:19.700 And I just think it's based on how sensitive they are to changes in the economy.
00:14:26.400 Startups are the most sensitive, then tech, then sort of the more traditional value companies.
00:14:31.840 You know, switching it to politics for a minute, the New York Times slash Siena College came out with really shocking polling this week about President Biden's very low approval rating, 33 percent now.
00:14:42.780 And the fact that some 63 percent of the Democratic Party wants a different nominee the second time around, they don't want him for a second term, and how low his rating has fallen with independents. Among the white working class,
00:14:55.760 he only has 20 percent support.
00:14:58.060 Even his core base, which The New York Times describes as black voters, more of the black voting base would prefer a different candidate other than Joe Biden on the Democratic ticket.
00:15:07.620 So it's not good news for him. But today on their daily podcast, The Daily, they were pointing out another side of this poll, showing that when it comes to the congressional midterms, the Democrats are doing a little better than they were and better than expected.
00:15:24.040 You know, if you looked at these polls three months ago, it was a predicted bloodbath.
00:15:28.440 And now it's getting tighter in the wake of Roe being overturned, in the wake of some of these mass shootings and some Supreme Court decisions on guns.
00:15:36.860 And, they believe, in the wake of the January 6th hearings, which may not be really pulling the heartstrings of a ton of Republican voters, but some of them, but certainly seem to be amping up the Democratic base, which was, I think, in part their purpose.
00:15:50.500 So what do you make of that possibility? They're saying now Democrats have a one point lead on the generic congressional midterm ballot among registered voters, and they have a one point deficit on likely voters.
00:16:04.400 So pretty tight. I mean, surprisingly tight, given these economic numbers.
00:16:10.200 Well, I think that if you look at the House, I mean, everyone's forecasting that Republicans are going to win back the House.
00:16:14.640 I think the question is the Senate. And there is, you could call it, a candidate-quality issue there in a few races, where Republicans haven't necessarily fielded the best candidates.
00:16:24.360 And so that matters quite a bit at the margins. I do think that this should be a red wave in November.
00:16:30.920 If you poll likely voters on what are the top issues that they care about, number one and two are inflation and the economy.
00:16:38.160 It's true that Roe is an issue, especially for the Democratic base, but for voters as a whole, it's something like a five percent issue.
00:16:47.260 So, you know, the Democrats are going to go out with the best issues that they have.
00:16:51.540 But I think that the paramount issues for most voters are going to be the economy and inflation.
00:16:56.180 And I don't see a big positive change happening in those numbers before November.
00:17:01.700 In fact, overall, it could get worse. So I would expect Republicans to do very well in November.
00:17:05.520 One of the things they were pointing out, again, this is The New York Times talking about this issue within the Democratic Party, is something that you've been pointing out as well, which is that the Democratic Party is now a party of college-educated, white, so-called elites.
00:17:22.040 Right. Meaning well educated, you know, well off. And that the working class has switched.
00:17:31.040 They're now Republicans and people of color have migrated to the Republican Party in numbers never seen before.
00:17:39.800 Hispanics are in no way the reliable voting bloc they used to be.
00:17:45.200 And as I pointed out, he's even losing support amongst black voters.
00:17:49.060 So it's really shocking. And you've been on to it. You've been pointing it out.
00:17:52.420 A lot of people have. But I know you've seen it in a way a lot of others haven't.
00:17:56.720 So what do you make of that sort of this switch and who represents the elites and who represents the working class?
00:18:03.000 Well, I've been on to this trend for a while because I read Ruy Teixeira, who is a Democratic political scientist.
00:18:11.320 Back about 20 years ago, he wrote a book called The Emerging Democratic Majority, in which he argued that demographic trends were working in the Democrats' favor and would basically create Democratic majorities and Democratic presidents as far as the eye can see.
00:18:23.720 And he was basically hailed as a prophet when Obama got elected in 2008, based largely on the coalition that he was talking about, basically young voters, women, people of color.
00:18:36.540 But for the last few years, Teixeira has been warning that demographics are no longer working in favor of the Democrats, that they are losing their historic base because of what he calls
00:18:47.320 professional class hegemony: the Democratic Party is basically catering to the college-graduate elites who run the think tanks and the foundations and the big woke tech companies and the Fortune 500.
00:19:02.600 And they are catering to that narrow group of voters and the issues they care about.
00:19:06.360 And they've lost sight of what matters to the average working class voter.
00:19:10.520 And that's why you saw, in that special election in Texas, Mayra Flores got elected, a Republican for the first time, in a largely Hispanic district that I think went 18 points for Biden, and then they just voted her in by a huge majority.
00:19:25.980 So you can see now that working class voters of all races are migrating from the Democrats to the Republican Party because the Republican Party is speaking to their concerns about economic issues, inflation, and so on, whereas the Democrats are really, you know, they are focused on these sort of elitist, progressive, woke policies.
00:19:46.600 And it's not just the economy.
00:19:47.760 It's also issues like crime.
00:19:49.420 You know, the Democrats are very wedded to this progressive agenda of deprosecution and defunding the police and allowing rampant homelessness.
00:20:00.320 And the average working class parent, they don't want their kids to have to get off the school bus and walk through a phalanx of drug addicts and junkies and homeless on campus to get to their school.
00:20:10.340 So, you know, these are ordinary quality of life issues that are motivating the electorate and they're motivating the working class to move to the Republican Party.
00:20:20.340 Yeah. Yeah.
00:20:21.060 That video out of San Francisco, we played some of that yesterday.
00:20:23.720 It was just horrifying of these young kids getting off the bus and having to walk through exactly that in this so-called progressive city that claims to care about the homeless.
00:20:31.460 OK, how about the six year olds?
00:20:33.160 Do we care about them?
00:20:34.000 And then, yeah, we've seen that in place after place, especially a lot of families of color, Hispanic families and black families, standing up saying we don't want this CRT nonsense in our schools.
00:20:46.300 Don't tell us we're second class or that our kids arrive, you know, behind the eight ball just because of their skin color.
00:20:52.900 We don't accept any of your presumptions about our children based on their melanin.
00:20:57.580 And I do think it's pissing people off and it's causing a shift, especially with these economic numbers.
00:21:01.980 It's like, what do you have
00:21:04.000 to lure us over? Like, what are you selling?
00:21:07.220 And in the meantime, you get messaging from Joe Biden, as they, you know, pay almost five dollars at the pump, that this is an amazing opportunity.
00:21:14.360 How did he put it?
00:21:15.020 It was: this is an incredible transition.
00:21:17.540 Don't worry.
00:21:18.040 It's an incredible transition for all of us in the U.S. economy away from fossil fuels.
00:21:23.480 So you're welcome.
00:21:25.720 Right.
00:21:25.800 And this is why I think Biden does bear substantial responsibility for the inflation, the economic mess we're in: he basically baked this cake last year.
00:21:33.780 Remember, his first day in office, he cancels the Keystone Pipeline.
00:21:36.580 He made it harder to drill and transport energy.
00:21:39.780 So, you know, number one, he basically contributed to the higher gas prices we have this year.
00:21:45.360 He also pushed for this extra four trillion in stimulus last year, deficit spending.
00:21:51.580 Again, we mentioned the two trillion American Rescue Plan, which was stimulus that we didn't need.
00:21:56.520 COVID was basically winding down, at least as an economic issue.
00:21:59.480 You had economists like Larry Summers in his own party warning that if you pass this ARP, it could create inflation.
00:22:05.600 There's substantial risk there.
00:22:06.760 He did it anyway.
00:22:07.760 So that was the second thing he did.
00:22:09.140 And then, you know, let's talk about this Putin price hike idea.
00:22:11.940 Throughout all of 2021, Biden basically took a very tough-on-Russia position in which he would not use diplomacy to try and find an off-ramp to this Ukraine crisis.
00:22:23.280 As a result, earlier this year, we had a war.
00:22:25.540 Now, we could debate whether that was a wise foreign policy.
00:22:28.980 I personally don't think it was, but even if you do, even if you want to take a tough on Russia stance, why would you alienate the Saudis?
00:22:37.180 Why would you cancel America's energy independence?
00:22:39.700 If you knew you're going to be taking on Putin in the year 2022, you would want to use 2021 to create an energy glut.
00:22:47.800 And instead, Biden did the opposite.
00:22:49.740 And now he's going hat in hand to Saudi Arabia on this trip.
00:22:52.820 It's very humiliating.
00:22:53.880 And he's basically having to beg for forgiveness to get the Saudis to pump more after he basically said last year he's going to treat them like pariahs.
00:23:00.920 So there was no overall grand strategy or coherence to this administration's policies.
00:23:07.140 If they want to get tough on Russia, they should have maintained good relationships with the Saudis and they should have basically encouraged domestic energy production.
00:23:14.920 Yeah, they went a different way.
00:23:16.240 You're right.
00:23:16.520 Now he's over there on bended knee to the so-called pariahs.
00:23:21.340 And I guess there was some news that he wasn't going to shake the hands
00:23:23.880 of leaders over there because of COVID.
00:23:25.520 And everybody knows it's because he doesn't want that photograph.
00:23:28.280 You know, some pariah.
00:23:29.740 Look at the two of you buddying up.
00:23:31.960 I don't know.
00:23:32.820 That may be fake news, but I heard that from my team.
00:23:35.840 Let me ask you about this.
00:23:37.080 If this year teaches us anything, it is that you cannot have security without energy independence.
00:23:42.880 We have to be energy independent.
00:23:44.480 The Europeans need to be energy independent.
00:23:45.940 What's happening in Europe right now is that the Russians are actually restricting the flow of gas from the Nord Stream 1 pipeline.
00:23:53.480 And Putin is really showing the Europeans who's boss right now.
00:23:57.440 And I think there is a significant chance that come this winter, when the Europeans need to heat their homes, that is when the Western alliance on Ukraine may fracture.
00:24:07.080 I think this is what Putin is betting on.
00:24:09.520 And so we've seen now, again, you cannot be secure as a country unless your source of energy is secure.
00:24:15.840 And I think the Europeans are learning that the hard way, and we're learning it the hard way.
00:24:20.140 It's so crazy because, you know, we were energy independent under Donald Trump, and Biden gave it away.
00:24:25.560 And we see it.
00:24:26.460 We're seeing something similar, you know, because of his green energy policies.
00:24:30.860 He's too beholden to the elite, super green faction of his own party that makes huge donations, unlike the working class members who used to be part of his party,
00:24:42.740 who are going to get hit by these policies of making us not energy independent and dependent on wind and solar, which doesn't work nearly as well.
00:24:51.260 And that's why what's happening in the Netherlands is interesting, because it's a parallel.
00:24:56.700 We have these farmers, who are about to get hit severely by these attempted green policies, out there protesting.
00:25:04.820 I mean, we never talk about the Netherlands.
00:25:06.300 In fact, the Netherlands confuses me because my husband's Dutch, and I'm always like, what is the Netherlands?
00:25:11.940 What is Dutch? Why aren't you considered Netherlandian?
00:25:15.300 Why are you Dutch? Anyway, I could go on, David.
00:25:18.420 But yeah, for the viewers who haven't been following this.
00:25:23.080 So farmers are protesting around the Netherlands over the government's new policy, which would see the country slash nitrogen oxide and ammonia emissions by 50 percent by 2030.
00:25:34.260 This is in an attempt to go more green.
00:25:37.020 All right. And all of this means they have to reduce their livestock numbers.
00:25:41.000 It could force some farms to shut. They have to use less fertilizer in a very short time.
00:25:46.260 And they're saying this demand by the government is totally unfeasible.
00:25:49.640 And what's going to happen now is the government's going to try to buy up all the farms.
00:25:54.020 And it's like a takeover.
00:25:56.040 So they're protesting. They're burning bales of hay.
00:25:59.760 They're kind of doing what the Canadian truckers did in a way and fighting back.
00:26:03.780 And it's somewhat inspirational to see them pushing back on this government overreach, where their government's trying to do to them what Joe Biden's kind of doing to us.
00:26:12.600 Right. Yes.
00:26:13.500 I mean, the Dutch farmers are the new Canadian truckers.
00:26:16.620 These are working class folks who are basically being punished and they're basically being legislated out of existence by their own government.
00:26:23.880 And for what?
00:26:25.060 I mean, basically, some bureaucrat in Brussels had the bright idea that we're going to cut this type of emission by 50 percent by 2030.
00:26:31.920 Those are suspiciously round numbers to me.
00:26:34.100 I'd like them to prove why 50 percent is the number and why 2030 is the number.
00:26:39.860 I mean, these things, it doesn't really make sense.
00:26:42.140 Right. They've just kind of picked these numbers arbitrarily out of thin air.
00:26:45.480 And then what happens is the Dutch legislators say, oh, we have to implement this new directive from Brussels.
00:26:50.060 They start confiscating these farms and banning a way of making a living that these farmers have been, you know, have been engaged in for generations.
00:26:59.460 So, you know, it is very similar to what's happened with energy, which is the Europeans adopted a policy towards energy that made them dependent on Russia for their energy because they refused to use nuclear or develop their own gas for environmental reasons.
00:27:18.340 They're about to do the same thing with food, which is make themselves dependent on other people's food because they refuse to produce it themselves, even though they have an enormous natural advantage.
00:27:29.880 The Netherlands is actually the number two agricultural exporter in the world.
00:27:33.220 So they are very, very good at this.
00:27:35.040 And the Dutch government seems intent on destroying the advantage they have.
00:27:39.480 It's just lunacy.
00:27:41.320 And if you want to see where all of this leads, just look at Sri Lanka right now.
00:27:45.340 So, you know, the Sri Lankan government and society just collapsed.
00:27:49.820 Why?
00:27:50.460 If you go back to April of last year, they banned these same types of chemical fertilizers that the Netherlands wants to restrict.
00:27:58.260 And as a result of that, their agricultural output fell by something like a third this year.
00:28:03.540 And the production of rice, which is one of their main staples, fell by something like 43 percent.
00:28:08.820 So all of a sudden people are going hungry.
00:28:11.200 They can't feed themselves.
00:28:12.140 It's a much poorer country, obviously, than the Netherlands.
00:28:15.580 But as a result, the whole society has collapsed.
00:28:18.060 Why do they implement that policy?
00:28:19.640 Well, it's the same types of policies that are being set out of Brussels.
00:28:23.360 It's this environmental extremism that doesn't take into account the needs of ordinary people.
00:28:30.620 One other point on the economy.
00:28:32.640 And then I have to ask you a question about Elon on Twitter.
00:28:34.840 One of the things I heard you say on the All In podcast was that here domestically,
00:28:38.160 the next thing to get hit after the unemployment rate starts to get shakier is nest eggs and homes.
00:28:46.220 So that's scary.
00:28:47.900 Do you mean 401ks, and that the housing market is likely to crash?
00:28:51.340 Is that what you're projecting there?
00:28:54.320 Well, 401ks have already been hit.
00:28:55.820 I mean, we're in a bear market officially.
00:28:58.820 I think the S&P is down something like 22% for the year, NASDAQ 30-something percent.
00:29:04.660 Maybe the Dow Jones is a little bit less.
00:29:06.060 But we're already in bear market territory.
00:29:08.740 And if you're a growth stock investor like I am, you know it's even much worse than that.
00:29:12.400 So, you know, if you haven't looked at your 401k lately, you probably don't want to.
00:29:17.060 It's going to be depressing.
00:29:18.860 Don't look at it.
00:29:20.000 Now, on the housing market, I think the issue there is that if you look at some of these charts showing the ratio of housing prices to median income, what you see is that the ratio has never been this high since around 2006, right before we had the global financial crisis driven by the real estate crash.
00:29:43.880 So essentially what that means is that people cannot afford home prices as they stand today based on their current income levels.
00:29:53.240 And now that interest rates have gone up so much, mortgage rates are rising rapidly.
00:29:59.360 So, you know, your typical home mortgage has gone from, call it roughly 3% to almost 6% in just the last few months.
00:30:06.300 So that is also creating a huge amount of pressure on the real estate market because people simply cannot afford as much house as they could just a few months ago because they can't borrow as much.
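Sacks's affordability point is standard amortization arithmetic. A minimal sketch (the $400,000 loan amount is my own illustrative figure, not one from the show):

```python
# Standard fixed-rate mortgage amortization; the $400,000 loan is an
# illustrative assumption, not a number from the conversation.

def monthly_payment(principal, annual_rate, years=30):
    """Monthly payment on a fully amortizing fixed-rate loan."""
    r = annual_rate / 12            # monthly interest rate
    n = years * 12                  # total number of payments
    return principal * r / (1 - (1 + r) ** -n)

at_3_pct = monthly_payment(400_000, 0.03)  # ~ $1,686/month
at_6_pct = monthly_payment(400_000, 0.06)  # ~ $2,398/month
# The payment jumps by roughly 40%, so a buyer with a fixed monthly budget
# can borrow correspondingly less -- "can't afford as much house."
```

Equivalently, a buyer who can carry roughly a $2,400 monthly payment could have borrowed about $569,000 at 3% but only $400,000 at 6%.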
00:30:17.040 So, you know, I think we are due for some sort of big correction in the housing market.
00:30:21.380 What tends to happen first is that inventories build up.
00:30:24.280 You get illiquidity in the market because sellers don't want to drop their prices.
00:30:28.480 Eventually they capitulate and then we see a price decrease in the housing market.
00:30:34.080 Very, very bearish all around.
00:30:35.580 Okay, last question.
00:30:37.200 Elon has officially been sued now by Twitter, which is seeking to force specific performance of his promise to buy Twitter at $44 billion.
00:30:46.420 He says they materially breached the deal first by not disclosing all of the information on how many bots are actually on Twitter and perhaps in other ways.
00:30:56.280 CNBC analysts predicting Elon may go to jail if he loses this case.
00:31:02.000 What do you think?
00:31:02.840 That was a ridiculous comment about Elon going to jail.
00:31:06.060 Even Jim Cramer, who's not exactly known for, let's say, not making hyperbolic statements, laughed out loud at that.
00:31:06.060 But the worst thing that could happen to Elon, although I would consider it to be a good thing, is that he could be ordered by the Delaware Chancery Court to actually consummate the acquisition.
00:31:21.180 He could be ordered to basically perform.
00:31:23.080 Or alternately, he could end up having to pay damages, a kill fee, basically.
00:31:27.360 So that's the risk to Elon, and that's what Twitter is seeking.
00:31:31.960 What Elon has to show is that there was a material adverse effect related to this bot problem, that basically the fake accounts, the bots were massively understated by Twitter's public filings.
00:31:44.080 They wouldn't give him the information, and that basically permanently impaired the business, and it meant that their revenue would be much lower.
00:31:51.900 That's his assertion.
00:31:53.440 And so I think what Elon has going for him is that the discovery around this for Twitter, I think, is going to be very messy.
00:32:00.400 The question very quickly is going to become, what did Twitter executives know, and when did they know it, with regard to this bot, fake account problem?
00:32:07.960 What did they do about it?
00:32:09.060 Is there an email anywhere in the company in which executives are saying, well, gee, do we really want to pursue this vigorously when we know it may decrease our revenue?
00:32:15.920 I mean, I'm not saying that email exists, but if it does, it's going to be very messy and embarrassing for them.
00:32:20.460 So I was a little surprised Twitter brought this suit, because I don't think they're going to like the discovery.
00:32:25.840 On the other hand, you know, the battle for Elon is he's got to show this material adverse effect, which traditionally that's a pretty tough thing to show in Delaware court.
00:32:36.640 So that's sort of the basic balance there.
00:32:39.980 You know him.
00:32:40.580 You work with him.
00:32:41.400 Do you think he still wants the company?
00:32:45.780 You know, I don't know.
00:32:47.000 The answer is I don't know.
00:32:48.220 I mean, I have been a fan of the idea of him buying Twitter because he would restore free speech to Twitter,
00:32:55.300 and I think that's an important cause.
00:32:56.580 We've talked about that before on the show.
00:32:58.640 So I still hope the deal goes through.
00:33:02.300 I'd love to see him restructure the company.
00:33:06.480 Yes.
00:33:07.040 But I don't know if he wants it.
00:33:09.740 Maybe he realized this would be a big headache.
00:33:11.320 It certainly surely would be.
00:33:14.180 And, of course, we're operating against this backdrop of a massive stock market decrease.
00:33:18.600 So, obviously, it's hard to ignore that the deal may not be economically quite as good a deal as it was.
00:33:25.380 No, but it's a public service.
00:33:27.420 It would be a public service.
00:33:29.000 I would really like it as a public service.
00:33:31.180 Although, as an investor in some of Elon's other companies, I'm not sure I want him signing up for the distraction.
00:33:37.040 But, yeah, I mean, I think it would be a great thing for society if it actually happened.
00:33:41.060 Well, he is one of the disruptors, and he certainly disrupted Twitter in a way that is fascinating to watch,
00:33:46.720 but a little sad for those of us who want to see him close this deal.
00:33:50.080 David Sacks, always a pleasure.
00:33:51.180 Thank you.
00:33:52.660 Thanks for having me.
00:33:53.980 And up next, we are digging much more into the Elon Musk lawsuit with a lawyer who had a fascinating piece in The Wall Street Journal today,
00:34:01.300 where he's got a very different take than the one you're going to hear in the mainstream media.
00:34:11.060 Twitter now officially at war with Elon Musk, but who will prevail?
00:34:15.660 My next guest just co-authored a piece in The Wall Street Journal that says,
00:34:19.360 Twitter's lawsuit looks like a loser.
00:34:21.600 That's not what you're hearing anyplace else.
00:34:23.720 Todd Henderson is a law professor at the University of Chicago, one of the top law schools in the world,
00:34:30.280 and he joins me now.
00:34:31.960 Todd, thank you so much for being here.
00:34:34.040 Delighted to be here, Megyn.
00:34:35.080 So that's a fascinating and provocative headline, and it barely captures your position.
00:34:41.420 Twitter's lawsuit against Elon Musk looks like a loser.
00:34:44.220 If you check any mainstream publication or television show, they're going to tell you that Elon will be the loser.
00:34:51.420 Twitter's got him.
00:34:52.540 You know what?
00:34:54.020 And that he's either going to have to pay a billion dollars or he's going to be forced to buy this company
00:34:59.360 or, according to the CNBC analyst, will go to jail.
00:35:02.580 So let's start with why you think, do you think on the merits they're going to lose?
00:35:07.540 Twitter's going to lose?
00:35:08.240 Or are you just taking a different position on punishment to Elon if he doesn't go through?
00:35:13.100 Well, let's just start with, you know, don't believe me.
00:35:16.580 I mean, I'm just a law professor.
00:35:18.620 I will say I have some credibility because I was law school classmates with David, who was your previous guest.
00:35:25.120 So this is the class of 1998, University of Chicago Law School, day on The Megyn Kelly Show.
00:35:31.460 Small world.
00:35:31.940 Yeah, but look, the stock market, Twitter's stock, I didn't check it this morning, but yesterday it was $33 or something.
00:35:41.640 Twitter's argument is that the court should order Musk to buy Twitter at $54.
00:35:47.380 And that means if you're a stock market professional and you believe the court is going to do that,
00:35:51.960 you can buy Twitter today for $33.
00:35:54.820 And when the court issues its order requiring him to buy it, you get $54.
00:35:58.160 That seems like a pretty good trade if you believe the court's going to do that.
00:36:02.080 And the stock price is nowhere near $54.
00:36:04.720 So I think a lot of pundits are saying that Twitter's got a good case.
00:36:07.960 I think the stock market, the wisdom of crowds is kind of on my side.
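Henderson's stock-price argument can be restated as a back-of-the-envelope merger-arbitrage calculation. A sketch, where the $25 "deal breaks" standalone value is purely my hypothetical assumption:

```python
# Back-of-the-envelope merger arbitrage: the market price blends the deal
# price and a standalone "deal breaks" value, weighted by the probability
# of closing. The $25 break price below is a hypothetical assumption.

def implied_close_probability(market_price, deal_price, break_price):
    # market = p * deal + (1 - p) * break, solved for p
    return (market_price - break_price) / (deal_price - break_price)

p = implied_close_probability(market_price=33.0, deal_price=54.20, break_price=25.0)
# p comes out around 0.27: under these assumed inputs, the market is pricing
# only modest odds that the court forces the $54.20 deal through.
```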
00:36:12.260 That's fascinating.
00:36:13.260 Let me address your question, which is, you know, they've asked him, they said, basically, you promised to buy us.
00:36:23.040 And now that you've backed out, we're going to make you buy us.
00:36:26.500 And, you know, you went to law school, Megyn, and so you were in a first-year contracts class.
00:36:32.680 And pretty much the first thing we teach our idealistic, you know, change the world students when they come to law school is not every wrong has a remedy.
00:36:42.200 And that the rule in contracts generally is not promise and you have to do it or we send you to jail, like these crazy analysts think.
00:36:49.800 But either do it or pay damages.
00:36:53.560 That's the kind of general principle that animates our piece.
00:36:58.760 And so if he backs away and, you know, breaches the contract in the technical sense, and I understand, you know, you were talking with David about these bots and whatever.
00:37:07.320 We can talk about that, whether he is actually breaching or whether they breach.
00:37:10.820 But let's just imagine he just walks away and says, you know what, I changed my mind.
00:37:14.000 In that situation, probably the worst thing that could happen to him is that he would have to pay damages.
00:37:20.260 And, of course, there's a question of what those damages would be.
00:37:23.660 Some have suggested, you know, Matt Levine at Bloomberg has suggested those damages are the damages to the shareholders.
00:37:29.660 But as we point out in the piece, shareholders are not part of this contract.
00:37:33.280 They can't sue through Twitter to get their damages.
00:37:36.860 It's only the damage that Twitter has.
00:37:39.000 So I don't think the courts will make him buy it.
00:37:42.120 And I think for reasons we can talk about, that would be a disaster if we forced people to do things they don't want to do.
00:37:47.980 And the damages to Twitter are probably a lot less than a billion dollars.
00:37:51.120 All right. So let's get to that, because that was an interesting point you raised about Hearst Castle in California as an example of what happens if the court acts.
00:38:01.340 If the court in Delaware says Elon Musk, you must buy Twitter now.
00:38:04.880 You must do it.
00:38:06.120 And if Elon Musk then does it, what does Twitter now have?
00:38:10.080 Who does it have running it?
00:38:11.480 How does that affect shareholder value?
00:38:13.660 Like, who wants to force a reluctant owner into running a several billion dollar company?
00:38:21.340 Yeah, as we allude to in the piece, I think it's irresponsible of Twitter's board.
00:38:28.100 Twitter's board has fiduciary duties to Twitter as an entity.
00:38:31.980 That means they have to put the interest of Twitter as an entity above anything else.
00:38:38.480 And forcing someone who doesn't want to run the company or own the company to buy the company seems antithetical to their obligation to do the best thing for Twitter.
00:38:48.300 If you're selling your house, you want someone to buy it who's going to take care of it, not someone who, you know, would never cut the grass and let it fall apart or whatever.
00:39:01.540 And that's the obligation the board of directors has.
00:39:04.620 And so I think, you know, forcing a reluctant owner is a really bad idea.
00:39:09.080 You know, we use the analogy in the piece that if you contract with someone to paint your house and they back out and say, you know what, I got a better deal.
00:39:21.360 I promised to charge you five hundred dollars.
00:39:23.960 Someone else down the street is going to pay me a thousand dollars.
00:39:27.180 You don't want to go to court and compel that house painter to paint your house because they'll do a bad job.
00:39:35.080 Yeah, they'll shirk and they'll be lazy and maybe in ways that it's very hard for you to detect.
00:39:40.920 And for courts, this raises another problem, which is imagine that the court does order the person to paint your house or Musk to buy Twitter.
00:39:49.900 And then he kind of does it, but does a bad job.
00:39:53.040 He doesn't show up on time.
00:39:55.080 He's not using a really high quality paint or in this case, Musk sort of slow walks the funding with the banks and, you know, delays the acquisition, keeping Twitter in legal limbo.
00:40:05.860 Then Twitter is going to run back into court and say, look, he's not fulfilling.
00:40:10.340 He's not doing the best job he could closing the deal or running the company, whatever it is.
00:40:15.620 And that would sort of enmesh the court in a continuing obligation to make sure that the order to specifically perform, to do the job, is actually being carried out right.
00:40:25.160 Because to think that the court is going to snap its fingers and say, you do it.
00:40:29.500 And then Musk will say, oh, sure, I'll do the best I possibly can, I think is naive in the extreme.
00:40:34.680 And people need to keep the real world in mind here.
00:40:37.040 I mean, they sued Elon in Delaware, I presume, because Twitter is incorporated in Delaware, like virtually every corporation.
00:40:46.520 But that's a good thing for Elon, too, because this court deals with basically nothing but business disputes and understands the realities of its own power, whoever is going to decide it, and of how businesses work.
00:40:58.760 So, yes and no. Can I agree with you about everything in the premise you said and then sort of push back a little bit in one sense?
00:41:10.640 The Delaware Chancery Court judges are experts, and they will understand what I just described and the limits of specific performance and those things.
00:41:20.340 They do, however, have a kind of, you know, we're-in-charge-of-business-disputes mindset.
00:41:26.300 We're worried that if we let Musk back out of this, the kind of ramifications that would have for other deals. And they have, because of their expertise, a little bit of an arrogance and a willingness to sort of hold people to deals in ways that don't reflect the economic efficiencies or maybe what the right market conditions would be.
00:41:47.440 So I think that's a point in my favor, because I think the arrogance will make them not want to get in a position where they're ordering Elon Musk, the richest man in the world, to do something he's not going to do.
00:42:01.120 Yeah, we say in the piece, there's a little bit of a game of chicken that could go on.
00:42:07.440 I mean, first of all, the fundamental problem here is something, again, that we teach and you learn in law school, which is the layperson sees Musk agreed to buy Twitter.
00:42:18.500 Well, that's not what happened. The contract or the merger agreement was structured in a particular way. Musk created some separate entities, X Holdings 1 and X Holdings 2.
00:42:32.420 They agreed to be funded. Twitter agreed to cancel the shares of its shareholders and give them certificates and they could show up at these separate entities and exchange those certificates for cash.
00:42:44.140 That particular structure was not Musk himself promising to pay each Twitter shareholder $54.20.
00:42:52.640 And that structural difference really matters. The lawyers could have struck this deal in a way that really did bind Musk, that made him accountable for the promise that he made to the shareholders effectively and could have been liable for the difference between the $33 stock price today and the $54.
00:43:10.580 But the lawyers did not structure it that way. And they're kind of stuck with the structure that they have, which forces Twitter in the first instance to sue these shell companies, X Holdings 1 and X Holdings 2, to force them to do something.
00:43:27.620 And as we point out there, there's no them, really. You can't put X Holdings 1 in jail.
00:43:35.520 The agreement does require Musk to do his best efforts to get those entities to fulfill their obligations, but it's a kind of second order thing. And I cannot see the Delaware courts holding Musk in contempt.
00:43:50.780 And it just shows a little bit of the limitations of law here in holding people to deals.
00:43:58.540 So what do you think is likely to happen? Yeah, I know you end your piece with a good line. You say they could have structured it that way, but either Mr. Musk's lawyers were too smart for that or Twitter's weren't smart enough to structure it in a way that would have really required accountability on his part to the shareholders.
00:44:12.240 Twitter, of course, is saying we were injured. You know, he's blown up the company. He's damaged our reputation. He's created doubt amongst advertisers and our customer base about, you know, how many real accounts we have and so on.
00:44:24.280 So he came in, he damaged us, he left. So we want to be made whole. And maybe the wholeness is the difference between $33 a share and $54.
00:44:32.620 Who knows? But knowing both sides' arguments, what do you predict? And I won't hold you to it.
00:44:39.600 Yeah. So, yeah. So the first thing is on the lawyering, just that's a plug for law for my students. Lawyering really matters.
00:44:46.380 And it seems like it, you know, just they agree to this deal. Lawyering really matters. And that's a point we want to get across in the piece.
00:44:52.400 As for predictions, I mean, I'm a law professor and so I'm not a prognosticator as such. And so, you know, with that caveat, I think, at the end of the day,
00:45:06.500 the court is going to want to not force the sale of a forty four billion dollar company to somebody who doesn't want it.
00:45:14.540 And I think the stock market, as I said, reflects that that reality. Musk agreed to pay this billion dollar breakup fee.
00:45:22.380 And so that's the cleanest way out of this. There is a question we raise in the piece whether or not that is actually Twitter's damages.
00:45:28.600 What pundits are doing by pointing to the shareholders and the fifty four dollars is just completely mistaken.
00:45:34.880 The shareholders are not a party to this contract. The lawyers could have made them a party to the contract and brought their damages onto Musk.
00:45:42.780 They didn't do that. Twitter is the party to the contract. And so only Twitter can sue for its damages.
00:45:48.580 And you mentioned some things like their reputational harms or things like that. Great.
00:45:54.500 If Twitter can go into court and prove that it is worth less today and make a causal link between that reduction and its value, reduced profits, reduced asset value, market value.
00:46:07.080 And Mr. Musk's behavior, then they could they could get those damages.
00:46:12.940 But I haven't seen any evidence that Twitter is less profitable today than it was before Mr. Musk made his offer.
00:46:22.140 And that's what they'd have to prove. And Twitter took a dive along with it.
00:46:26.100 Yeah, that's going to be a really tough standard. So why don't you think, though, and only have a short time, that they'll make him pay the billion bucks, the breakup fee?
00:46:34.020 Well, they may do that. And that seems like the cleanest way out.
00:46:41.020 And for Musk, I think, you know, a billion dollars, that's couch change for him.
00:46:44.580 And so I think that's a win for him if he can walk away here with just paying a billion dollars.
00:46:50.700 The reason I'm a little bit cautious about that is because breakup fees are supposed to reflect the actual estimate of the damages.
00:46:59.020 So imagine you're buying a house, something everybody your listeners are all probably pretty familiar with.
00:47:03.680 You put up some earnest money when you try to buy a house, and if you walk away,
00:47:08.720 that's what you lose. You can't put in a contract, buy this house or pay me a billion dollars.
00:47:14.180 That's not the way it works. The damages, the billion is supposed to be an estimate of Twitter's losses.
00:47:19.020 I don't think they're a billion dollars. And the courts are reluctant to enforce penalties.
00:47:23.280 Fascinating. This is like totally different than what, you know, the people who hate Elon Musk, who control the rest of the media, say.
00:47:31.920 I appreciate the honest, straightforward analysis.
00:47:35.560 Todd Henderson, please come back.
00:47:38.000 I would love to, Megyn. Anytime. Thank you.
00:47:39.840 All right. All the best.
00:47:40.740 Coming up, the bots angle to this story from a person who knows: she's an expert in how many bots there are and how they're manipulating you right now.
00:47:53.980 Joining us now is someone who kind of studies bots for a living.
00:47:57.860 So she knows a lot about what Elon Musk says is his problem with the Twitter situation, though Twitter says it's a ruse.
00:48:05.960 She has also studied social media disinformation campaigns for years.
00:48:10.540 And believe it or not, you have been a victim of a social media disinformation campaign.
00:48:15.900 They're ubiquitous. They're everywhere.
00:48:18.040 And she can tell you some of the signs of, let's say, the Twitter account or the LinkedIn account that you may be interacting with that you have no idea is fake.
00:48:27.480 The person's fake. It's fake news.
00:48:30.040 So she's neck deep in something that we're all either living or we're about to know a lot more about in the coming decade.
00:48:37.180 Renee DiResta is the technical research manager at Stanford Internet Observatory.
00:48:42.360 And she joins me now.
00:48:43.900 Welcome, Renee. Great to have you.
00:48:45.660 It's great to be here. Thanks for having me.
00:48:46.920 So, yes, you've done far more than study bots online.
00:48:50.980 Exactly. But that's where we left off with our last two guests.
00:48:53.920 So we'll just pick it up there with you.
00:48:56.600 Putting aside whether that's genuinely Elon's problem with Twitter, you know, they'll hash that one out between themselves.
00:49:03.960 Bots on Twitter are a problem.
00:49:05.760 Twitter's acknowledged that, too.
00:49:07.380 And bots online are a problem.
00:49:08.720 But let me let me ask it this way.
00:49:10.480 If Elon hired you and said, Renee, I need an expert in, like, how I can figure out how many bots there are on Twitter.
00:49:19.980 Can it even be done?
00:49:21.360 I mean, is it really knowable?
00:49:22.800 Well, so, I read the filing, you know, the lawsuit filing.
00:49:28.720 And I'm not a lawyer, but I was very interested in the technical aspects of it.
00:49:31.640 And there's a couple of things in play here.
00:49:34.060 So, first of all, there's a strong public perception that bots are a huge problem on Twitter.
00:49:38.460 And we can talk about the history of that, why that is, you know, some of the dynamics in 2015 when they were particularly impactful.
00:49:46.000 Twitter actually did take a lot of steps to minimize the impact of bots after 2015, around 2017, 2018 timeframe.
00:49:54.480 But the first thing I'll say is that bots are not an evenly distributed problem on Twitter.
00:49:59.080 Meaning if you're a person like Elon Musk and you are famous, you have millions of followers, you're active in spaces where, like, cryptocurrency, where scams are abundant, you're going to see a lot more bot activity in part because people make impersonation bots of you.
00:50:17.060 And then they try to kind of dredge in your replies, to manipulate your followers, to pump cryptocurrency scams.
00:50:23.660 So Elon no doubt sees a whole lot more of this stuff than the average person who's engaging on Twitter does.
00:50:29.080 So I think that perception is one thing that's really key here.
00:50:32.160 He has, my team's telling me he has 100 million followers right now.
00:50:34.920 So, I mean, yes, so he's got a lot of incoming.
00:50:38.400 Go ahead.
00:50:39.040 Right.
00:50:40.080 So the terminology for bot, you know, it's a little bit fuzzy.
00:50:45.200 It should mean automated accounts,
00:50:48.140 meaning a person doesn't sit there typing content into the, you know, into the user interface.
00:50:54.700 Instead, what's happening is the content is kind of pushed out at set time intervals or when a famous person like Elon tweets, you have bots that will see that his tweet has come out and will immediately reply to it.
00:51:07.700 This happened with Donald Trump.
00:51:08.880 This happens, again, with many, many famous accounts because people kind of want to get the reply in first.
00:51:14.180 Under Donald Trump's tweets, you used to see people selling, like, liberal tears mugs.
00:51:17.740 You know, just a form of spam, right, economic spam.
00:51:21.240 And so this is, again, this is not an uncommon problem.
00:51:24.280 The question is, in this context, per this legal filing, Twitter is claiming that 5% or fewer of its monetizable daily active users, or mDAUs, are these spam bots.
00:51:41.020 That is different than just number of users on site.
00:51:44.300 And so Twitter, in its calculation of mDAUs, is theoretically already filtering out these spam bots that we all do know are on the platform.
00:51:53.020 Elon is saying, again, per my understanding of these filings, that Twitter is misrepresenting that number.
00:51:58.800 And so in response, what he asked for was access to, first, an understanding of Twitter's methodology, which it's my understanding they provided.
00:52:06.280 And I know in mid-May, the CEO, Parag, was tweeting about this methodology.
00:52:11.840 You know, they're sampling.
00:52:12.940 They have manual review.
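To make "sampling plus manual review" concrete, here's a sketch of the generic statistical idea behind it (my own illustration with made-up numbers, not Twitter's actual methodology): review a random sample of accounts by hand, then extrapolate with a confidence interval.

```python
# Hypothetical illustration of estimating a spam share from a manually
# reviewed random sample -- the generic statistics behind "sampling plus
# manual review," not Twitter's actual process or numbers.
import math

def estimate_share(labels, z=1.96):
    """labels: 1 if a reviewer judged the sampled account spam, else 0.
    Returns the point estimate and a normal-approximation 95% CI."""
    n = len(labels)
    p = sum(labels) / n
    half_width = z * math.sqrt(p * (1 - p) / n)
    return p, (max(0.0, p - half_width), min(1.0, p + half_width))

# Suppose reviewers flag 50 of 1,000 sampled accounts as spam:
p, (lo, hi) = estimate_share([1] * 50 + [0] * 950)
# p = 5.0%, with a 95% confidence interval of roughly 3.6%-6.4%
```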
00:52:15.120 It's almost impossible for an outsider, even a researcher like me, to concretely say that someone or something is a bot, is an automated account, or is a fake account.
00:52:25.340 Oftentimes, what people see is, for example, a bunch of accounts posting the same content over and over again.
00:52:31.300 So they think, oh, that's automated.
00:52:32.920 That's not always necessarily true.
00:52:34.820 What Twitter is looking at is information that it has about, did this account verify its phone number?
00:52:41.440 Is it, you know, what mechanisms is it using to engage on the platform?
00:52:46.300 Does it have other social media platforms linked in some way?
00:52:48.800 You know, they've got a number of different types of kind of, you know, information in the background that no outside person can see.
00:52:57.960 What we look at as researchers often comes through what's called the Twitter firehose.
00:53:01.540 And Elon asked for the Twitter firehose, and it was provided to him, again, per the terms of this, per the information in this lawsuit.
00:53:08.740 But you can't gauge bots by looking at the tweets that are coming through the firehose.
00:53:12.980 You can see maybe common uses of phrases.
00:53:16.140 You can see some repetition in terms.
00:53:17.720 But again, you can't verify that those are automated accounts, unfortunately, and you can't verify that they have not already been filtered out of this monthly, sorry, monetizable daily active user number.
00:53:30.260 That is mDAU.
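The outside-in signals she describes, repeated phrasing and machine-fast replies, could be scored roughly like this toy sketch (entirely my own illustration, not Twitter's detection logic; as she notes, such signals can't actually prove automation):

```python
# Toy illustration of outside-in bot signals visible in a public tweet
# stream: identical text posted repeatedly, and replies landing within
# seconds of a trigger tweet. Real classification relies on private
# signals (phone verification, client data) that outsiders can't see.
from collections import Counter

def suspicion_score(tweets):
    """tweets: list of (seconds_after_trigger_tweet, text) for one account."""
    delays = [d for d, _ in tweets]
    texts = [t for _, t in tweets]
    # Share of posts that are the account's single most repeated text
    repeat_share = Counter(texts).most_common(1)[0][1] / len(texts)
    # Share of replies landing implausibly fast (under 5 seconds)
    fast_share = sum(d < 5 for d in delays) / len(delays)
    return 0.5 * repeat_share + 0.5 * fast_share

bot_like = [(1, "BUY $COIN NOW")] * 8 + [(2, "free airdrop!")] * 2
human_like = [(40, "lol"), (300, "good point"), (90, "hard disagree")]
# bot_like scores high (~0.9); human_like scores low (~0.17)
```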
00:53:31.500 Wait, so that makes it sound to me like you might give Elon the point that he was not provided with satisfactory information in order to be able to tell what percentage of the monetizable daily active users are bots.
00:53:46.580 So there's two different things.
00:53:49.260 So there's the data that he was given, which I'm saying is not particularly useful for answering the question that he wants answered.
00:53:54.740 But then what Twitter says in its lawsuit is that they did provide extensive briefings detailing their methodology of how they arrive at mDAU and where their sampling happens and the processes by which they go and they check that background data to understand if this is a real account.
00:54:10.460 But not the actual data, not the underlying data.
00:54:12.440 That's where he's going to try to wiggle.
00:54:13.760 Right, Elon?
00:54:14.620 Probably.
00:54:15.120 I mean, you're the lawyer, not me.
00:54:17.560 I'm listening for exploitable points.
00:54:19.320 You know, if I'm Elon's lawyer and he's going up against... Twitter hired Wachtell, which is a great, great, you know, white shoe law firm.
00:54:25.580 He will have these white shoe law firms too.
00:54:26.920 So it'll be the very best and brightest lawyers duking it out.
00:54:29.920 But, yeah, he's going to say, I don't need to accept their summaries.
00:54:34.260 And when I'm paying forty four billion dollars, I want to see the actual data.
00:54:37.920 And then they're going to say, we gave you everything that was reasonable under the circumstances.
00:54:40.860 Oh, and by the way, you kind of waived your right to have a full vetting of all of our methodology when you made the offer.
00:54:47.300 So it'll go back and forth.
00:54:48.480 OK, that's fascinating, though.
00:54:49.460 So, you know, until you just said that, I never really asked myself exactly what a bot is.
00:54:54.960 I just kind of assumed it was a computer-generated program, like, human-generated at some point, that sent out a bunch of annoying messages to us all on various forms of social media.
00:55:08.980 That's kind of the purest term.
00:55:11.200 But colloquially, it really has taken on a meaning where I think bot and troll are often used quite interchangeably.
00:55:18.600 You know, people's experience of Twitter, of course, everybody has the experience where some, you know, we used to call them egg accounts, like the old egg profile picture.
00:55:25.900 There was no profile picture.
00:55:26.880 There was just an egg and it would send something nasty back at you kind of instantaneously.
00:55:31.420 And we've all had that experience.
00:55:33.920 As public awareness about bots increased, particularly in the 2016, 2017 time frame, when there was a lot of conversation about fake news, about small groups of accounts, you know, kind of trying to make things go viral,
00:55:47.540 about manipulation, using fake accounts to make hashtags trend, just kind of political shenanigans.
00:55:54.220 There started to be quite a lot of academic research on bots that had actually gone back several years prior to it really coming to public awareness.
00:56:01.680 But once it came to public awareness, there was this interesting phenomenon by which the more people learned that there were such things as bots on Twitter,
00:56:09.400 the more they started to actually kind of immediately dismiss people who would reply to them with like a nasty or snarky response.
00:56:16.160 Oh, that must be a bot.
00:56:17.980 You know, this was particularly as the kind of polarization on Twitter really heated up and anybody who waded into any vaguely political topic,
00:56:25.260 which now is, you know, much of Twitter, would get this response back and they would decide, oh, that's a right wing bot.
00:56:31.500 Oh, that's a left wing bot.
00:56:32.600 And so there was this idea that the person who was responding, the kind of, like, nasty throwaway account, was a bot.
00:56:38.480 That was the term that people came to use, even though most of those accounts are not actually automated.
00:56:44.940 Most of them are not really bots.
00:56:47.080 So I definitely want to get to, like, your experiences with identifying the fake ones.
00:56:52.240 And, you know, there are some tells which could be useful to our listeners and our viewers.
00:56:56.820 But one of the reasons I think this is so important, what you do, is I remember before I went over to interview Vladimir Putin the first time, I had a big briefing at NBC,
00:57:07.160 like behind closed doors with FBI, CIA, like all sorts of top intel analysts and legit guys, you know, not hard partisans like we've seen in some corners.
00:57:18.860 But I mean, legit guys who had taken a hard look at this, and they showed me on a graph not just how the Russians had been manipulating conversation in America leading up to the election,
00:57:29.740 but even how I'd been targeted. They could pull sort of the bot activity around my own name,
00:57:36.380 especially after I got in the crosshairs of Donald Trump and how it was amplified, like how, you know, negative tweets were amplified in different pockets of the world.
00:57:44.500 It was crazy. I mean, like the way that it can be seen if you care to actually know what you're doing like you do.
00:57:49.880 It's all right there. It's like right there. It's very traceable, or at least it was back then.
00:57:54.300 And so this is the point I try to make to my audience a lot, Renee, which is you can 100 percent, if you so believe, say that Donald Trump won the 2016 election fairly.
00:58:07.880 And you can even believe that he won the 2020 election. You can be, you know, sort of one of the people who believes his claim that that election was stolen from him.
00:58:15.140 But I'm telling you, the Russians interfered with the 2016 election, and they've interfered with our national dialogue for years prior to that and after.
00:58:25.220 And they definitely were pro-Trump, but their agenda is so much bigger than that.
00:58:28.780 They were pro Trump to the extent they didn't like Hillary. But what they really want is division in America.
00:58:32.840 They want us weakened. They want us fighting with one another.
00:58:35.280 They'll take both sides of the Black Lives Matter issue or whatever issue and just try to make us fight.
00:58:40.320 That's true. That really happened. And you know that firsthand.
00:58:44.080 Yeah. Well, and that's something that I tried to emphasize, actually, in my own work.
00:58:48.780 Also, I think, you know, I led one of the teams for the Senate Intelligence Committee in 2018.
00:58:55.640 So there were tech hearings in 2017 as the public, you know, as investigations began to show that Russian interference in the election had happened.
00:59:04.000 Again, many people in the Obama administration knew about that in advance.
00:59:07.480 I think there's been a number of books written on this topic.
00:59:09.500 Like the FBI had been observing GRU, which is Russian military intelligence activities.
00:59:15.240 There was an excellent article in mid 2015 in The New York Times by a reporter named Adrian Chen.
00:59:21.980 I believe it was called The Agency. And it talked about the Internet Research Agency, the Russian troll factory,
00:59:26.960 which was really kind of put to use first very domestically, actually, to try to justify the Russian invasion of Crimea.
00:59:33.420 Of course, this kind of harkens back to the interesting dynamic that we're in with the current Russian invasion.
00:59:38.460 But what was happening there was they realized that propaganda had changed.
00:59:44.960 And there's this very old Cold War phenomenon called the agent of influence.
00:59:48.920 Right. And this was, if you've ever seen the TV show The Americans, that's what they are.
00:59:53.720 So they're these kind of like deep cover agents.
00:59:56.180 You know, they put them there and they recruit assets.
00:59:58.480 And you see in the narrative of The Americans, they recruit this Black man.
01:00:02.620 This is during the civil rights era, I think, talking about civil rights issues.
01:00:06.760 And that phenomenon of agents of influence, you have to actually like put someone physically in country to go and infiltrate an activist movement and nudge things in a particular direction.
01:00:17.440 Or you started front media organizations, which required a whole lot of effort to have front journalism.
01:00:23.940 You had to pay people. You know, it took multiple years at times for false news to go viral back then.
01:00:30.920 You know, viral was quite different in the age of just broadcast media and print.
01:00:34.960 But what the Russians realized is that you could actually pretend to be somebody else online quite easily.
01:00:40.920 And more than that, you could ingratiate yourself in online communities quite easily.
01:00:45.300 And so this was where, you know, it was just sort of an evolution of that tactic.
01:00:50.060 It wasn't that propaganda was new. It wasn't that Russian interference in elections was new or interference in American politics.
01:00:55.820 It was that social media had put us into these online factions where we were already telegraphing certain aspects of our identity.
01:01:04.980 Right. If we were joining a Texas secessionist group or a Black Lives Matter group, we were saying this is a belief that I hold.
01:01:11.520 Right. And so when they decided that this destabilization effort was worth going for, what they did was they didn't use fake news.
01:01:20.880 Actually, that was kind of a wholly separate thing that military intelligence, the GRU, was doing, but just staying focused on the Internet Research Agency, the troll factory.
01:01:28.720 What they started to do was create pages and just repurpose content from hyperpartisan and identity based American media that already existed or even just kind of memes that were very identity based.
01:01:41.800 And they created dozens and dozens of these pages in which they really got very, very deep down into what it meant to be an American and who America was for and these kind of existential questions.
01:01:52.720 And, you know, they got extraordinarily niche. They really did their research. If you were a person with an incarcerated spouse, there was a page for you.
01:02:03.700 If you were a Texas secessionist, there was a page for you. There was a page for Chicanos, you know, just all of these different facets of American identity, Muslims, feminists, you know, liberals.
01:02:14.940 They just kind of ran the gamut. Lots and lots of conservative and Black Lives Matter pages.
01:02:19.580 So those were the two that they really wanted to pit in opposition.
01:02:23.300 And by pretending to speak as a member of that community, the person who receives the message is much more receptive to it.
01:02:30.560 They don't think, oh, I'm getting some incentivized, you know, political propaganda piece from some random news site I've never heard of.
01:02:37.320 Instead, they see like commentary on Twitter that looks like it's coming from a Black woman that says, as Black women, we shouldn't do this.
01:02:44.100 We shouldn't vote for Hillary Clinton. As Texas secessionists, we need to rally at this park on this date at this time to preserve our Texas identity.
01:02:53.060 You know, and so it was it was always framed very much as like we are a member of your community sharing your views.
01:02:59.400 And that was how it was really done. It was really entrenching people, creating pride, deep pride in their identities and then pitting those identities as inherently in opposition to each other because of this question of who is America for.
01:03:13.020 It's so scary, manipulative. And you don't know how many times it's happened to you.
01:03:20.360 What genuinely held belief do you have right now that's been planted there by somebody else intentionally in an effort to manipulate your vote or undermine our country?
01:03:33.320 It's scary to think about. And you've also pointed out that it's not just Vladimir Putin.
01:03:39.060 You know, he kind of worked, in a way, hand in hand with the social media companies in two ways.
01:03:45.060 Tell me what you think of this. In the first way, that stuff did get on social media with absolutely no gatekeeping for a long time.
01:03:54.600 And in the second way, they, too, are trying to manipulate us. You know, you were in The Social Dilemma, that film by Tristan Harris, who's been on the show.
01:04:04.280 So that's one of the points that the film makes, you know, that the social media companies want to stoke outrage, want to fire us up, want to divide us.
01:04:14.020 My feed on Twitter or Facebook, whatever, will look totally different from my neighbor's, and my news consumption will be completely manipulated based on my prior likes and so on.
01:04:22.120 It's not just about like giving you what's in the news today.
01:04:24.840 It's about trying to collate results that they think will fire you up or outrage you in particular.
01:04:30.760 And then when we all sit back and we ask, why are we so divided?
01:04:34.800 Why? Why don't we feel patriotic anymore?
01:04:37.120 Why do we all hate each other?
01:04:38.660 And there really are answers to those questions.
01:04:42.200 There's a lot there.
01:04:43.920 So I am trying to think of how to kind of explain this.
01:04:48.940 There's a financial incentive.
01:04:50.700 There's a business model incentive to social media, which is to keep you on site, because most of these companies are ad based, which means that in order to serve you an ad, you have to be there to see it.
01:05:00.760 This creates a kind of perverse incentive where they're constantly gathering data to try to make sure that they're serving you content that you're interested in so that you stay on site.
01:05:10.960 And what starts to happen is that, you know, they want to show you content that you're going to engage with.
01:05:16.920 So if you join groups, they're going to show you a lot of posts from those groups.
01:05:20.760 If you follow friends and you engage with, you know, baby pictures and wedding pictures, you know, that's always going to be kind of pushed into your feed because people love baby pictures and wedding pictures.
01:05:29.020 So it's not so much a deliberate intent to say, like, we want to rile people up.
01:05:36.080 But the problem is that the intersection of social media dynamics with human nature and with the polarization that does exist creates an unfortunate feedback loop where the user does have agency.
01:05:49.800 You know, we can decide what to click on, we can decide what to share, but we're picking what to click on and what to share from content that's been curated for us or recommended to us.
01:05:59.960 So that's where the kind of agency intersects with the algorithm. Right.
01:06:04.000 And then we have these tools that we've been given to be propagators.
01:06:07.460 And one of the things that social science regularly shows is that people tend to propagate messages that have, you know, a strong, what's called, signaling factor.
01:06:16.220 I am a member of this political tribe. I am a member of this American identity.
01:06:20.460 You know, I am a member of this particular group. And oftentimes people really respond to language of like moral righteousness. Right.
01:06:28.140 And so you're saying I am outraged about this story and I think the world needs to know.
01:06:35.520 And so I am going to click that share button. Again, this is content that's been curated for and recommended to me.
01:06:41.700 I do have the decision about what I want to do with it. But in that moment, because of the norms and the behaviors that we've kind of come to embody on social media,
01:06:50.620 the way that that online behavior, you know, particularly on Twitter, which is really like the arena, the way that that's evolved,
01:06:58.060 we do tend to share the stuff that has the strong component, the strong moral component where we're saying, you know, as a as a good Democrat,
01:07:05.600 I am very outraged about this particular political decision that these other guys did.
01:07:10.860 And so I am going to share along that story that makes them look terrible, which is then going to continue to propagate.
01:07:16.880 It's a challenge, because you do want social media to serve as an amplifier, as a way to call attention to things, so to speak.
01:07:27.700 This is where we can talk about Elon: what he's actually asking for in the free speech kind of debate is actually a really
01:07:33.100 interesting thing to interrogate. But we are using them increasingly as, like, kind of factional battles for attention,
01:07:40.420 ways to activate political tribes, again, fundamentally in opposition to each other.
01:07:46.120 And so just to kind of connect it back to Russia, there are these accounts, and China and Iran,
01:07:50.840 I mean, every nation state has these accounts at this point. It's not new anymore.
01:07:56.200 It's not novel. Most of them are not particularly impactful.
01:07:59.320 That is one thing I really do want to caveat. You know, there are hundreds of thousands of these Chinese accounts.
01:08:03.100 They just don't get much engagement. So you have these things that are throwing, you know, accelerants on the fire.
01:08:09.620 But the fire is really domestic at this point. It's really domestic American influencers, domestic, highly activist crowds constantly,
01:08:17.480 you know, fighting in this particular environment, again, through this sort of the unintended consequences of the business model incentive structure,
01:08:26.120 the curation and recommendation tools that were the algorithms that were developed.
01:08:30.960 But you say unintended. That may have been true originally. But now that it's been called to their attention, you can't say that anymore.
01:08:37.840 They know what they're doing. They don't care. I mean, look at the whistleblower, you know, from Instagram and Facebook.
01:08:42.940 They know they don't care. Their money is more important to them than the wellness of the country.
01:08:47.440 I think that I mean, I'm not going to dispute that. I think that there's a lot of really horrible stuff that came out in the whistleblower thing.
01:08:52.660 But there's one thing that has been really interesting, which is this question of once those networks have been established,
meaning once we've all kind of self-sorted over the last seven years into these very factional, you know, highly activist online factions,
01:09:06.040 it's really hard to know what to do next. So even when the platforms decide, hey, you know,
01:09:11.540 oh, we recommended QAnon to these hundreds of thousands of people. What do you do after that?
01:09:17.560 Right. This is where I kind of feel like we're paying debt, you know, the debt accrued from decisions that were made. Maybe they could just leave us alone.
01:09:25.020 Maybe the answer is not in the curating. It's in the obsessive, you know, need to make us check their feed every two minutes,
01:09:32.840 you know, all the notifications. And like maybe that's the answer.
01:09:35.460 Now you've created this monster by feeding us all this disinformation and trying to shove us into these groups that are highly partisan or what have you.
01:09:42.060 Maybe the answer is, since you can't undo it, leave us alone. Let us go on Facebook.
01:09:45.800 If we want to go on Facebook, stop tapping us on the shoulder all day and having the computer think about how to get into our heads.
01:09:52.200 You know, this is where, I think... I've had my Twitter notifications off.
01:09:55.900 Twitter is my favorite social media platform. It's the one I spend the most time on.
01:09:59.680 You know, despite all the, you know, all the critiques, all the disasters,
01:10:03.800 I really do find it an interesting place for like, you know, hearing novel things that I wouldn't hear otherwise and seeing people's perspectives.
01:10:10.320 I like Twitter, but I did turn off notifications about five years ago.
01:10:13.880 And I think that it means that I open it on my terms as opposed to getting some kind of push notification.
01:10:22.060 You know, I think Tristan talks about this a lot in The Social Dilemma.
01:10:24.560 And we're starting to see Apple kind of come in.
01:10:27.780 You know, you might notice if you get a lot of notifications from a particular app,
01:10:31.660 Apple will actually kind of pop up a prompt if you have an iPhone asking, do you want to receive these?
01:10:35.620 Right. And so it's constantly saying, you know, how do we avoid this?
01:10:39.280 It's almost like the device manufacturer sort of serves as the gating function against the excessive push notifications.
01:10:45.960 But I get them from, you know, media apps.
01:10:48.500 It's just everything is a constant battle for attention.
01:10:51.480 And so whether that's social media or even, again, unfortunately, like media properties,
01:10:56.720 media apps needing to compete for attention, wanting to drive people directly to their apps.
01:11:01.840 This phenomenon of, like, the constant, incessant push notification.
01:11:06.720 Well, yes, but it's not the same.
01:11:07.680 And look, I'll be the first to tell you that cable news is based on outrage and they want you to be on their websites as much as anybody else.
01:11:13.740 But they don't have the access to so much personal information about you in the way that your phone does.
01:11:20.580 That's what's so dangerous about the social media companies.
01:11:22.760 You know, they can see everything and they use it.
01:11:25.600 You know, they use it for evil.
01:11:27.140 Don't get me wrong.
01:11:27.880 I go on these websites, but I'm very guarded.
01:11:30.020 And, you know, now, thanks to movies like this one,
01:11:33.400 I'm constantly with my kids, like, don't say yes to the notifications.
01:11:36.420 Instead, no, the answer is no.
01:11:38.080 And don't give any information about yourself.
01:11:39.780 And don't... you know, it's hard to raise kids in this era because...
01:11:44.140 Yeah, what they want, what my kids want to do is just go on these games.
01:11:47.060 You know, they don't do any social media, but the games want a lot of information about them.
01:11:51.780 And then it's like, your date of birth?
01:11:54.120 What is your full name?
01:11:55.540 What is a phone number to recover the account?
01:11:57.460 I'm like, ah, I don't want them to know all this stuff about my child.
01:12:01.160 Right.
01:12:01.340 But, like, then he can't get on the game.
01:12:03.280 I don't know, it's very disconcerting.
01:12:05.580 Yeah, mine are eight, five, and two.
01:12:08.980 So I have little little kids.
01:12:10.220 But the eight year old is definitely in the I want to play Among Us.
01:12:14.420 And I'm like, what are the chat restrictions on Among Us?
01:12:17.820 Let me go play it for a while first, you know.
01:12:21.040 But there is also, like, you know, you're right.
01:12:23.640 This is where people go, though, for social connection now.
01:12:27.960 Right.
01:12:28.240 I mean, I was on America Online when I was in like, you know, sixth grade or something like
01:12:32.820 that.
01:12:33.020 You seem too young.
01:12:33.600 So you would have been like two.
No, no, no, I, you know, was definitely one of those people who, like, got all those
01:12:41.260 free CDs and, like, ran out my parents' phone bill secretly and got in a lot of trouble for
01:12:44.920 that.
01:12:45.180 But no, I was like very early to the Internet.
01:12:48.520 And as a person who was there and remembers, like, anonymous chat rooms and things, I definitely
01:12:52.580 think about all the really stupid decisions I made as a kid and how I'm, you know.
01:12:57.540 Thank God I was an adult when this happened.
01:12:59.240 I was like 25 when the Internet came out.
01:13:01.000 I was safe.
01:13:01.880 I had done all my stupid shit privately.
01:13:05.440 Yeah, it's a real thing, though, right?
01:13:07.100 I mean, like teaching kids to understand the incentive structures, I think, is actually
01:13:10.080 huge.
But again, so much of it is kind of human nature and our desire to connect with
01:13:16.660 each other.
01:13:17.400 You know, it's fun to find groups where you have common interests.
01:13:21.160 And this is the one thing that I think is important to raise.
01:13:24.020 Right.
01:13:24.580 When we think about social media company obligations here, it's very, very clear.
01:13:29.200 There's some real bright lines where we say, OK, Russian, Chinese, Iranian bots, like, that's
01:13:33.160 a hard no.
01:13:34.780 And the policy that was developed to address that was a policy called inauthentic activity.
01:13:40.080 Right.
01:13:40.340 It was a policy around inauthenticity.
01:13:42.380 The argument they were making was not that this stuff was true or false.
01:13:46.400 Oftentimes the content wasn't even falsifiable.
01:13:48.520 It was just political propaganda, you know, which we're awash in everywhere.
01:13:52.140 But what they were saying was that you could not pretend to be a Texas secessionist from
01:13:58.880 a troll factory in St. Petersburg.
01:14:01.240 And so the arguments for taking down these accounts, for disrupting these networks were
01:14:05.020 all around authenticity.
01:14:06.780 But when you ask the question of what do you do about the outrage machine?
01:14:10.940 Since so much of it is domestic, right, since so much of what is curated for you and what
01:14:15.580 is recommended to you is not Russian at all.
01:14:17.800 It's just domestic hyperpartisan content or, you know, again, these sorts of facets of
01:14:22.820 identity, identity based content or interest based groups.
01:14:25.760 The question that that kind of confronts the platforms now is what can you recommend?
01:14:31.620 How should you decide what to recommend?
01:14:33.840 And this is where that really interesting conversation around free speech versus content moderation
begins to come into play, because there is a sense that if the platforms nudge the algorithm,
01:14:47.460 and let's say "the algorithm" even though it's not the best term, I think it's the most colloquial at
01:14:51.280 this point.
01:14:52.300 If Facebook nudges the algorithm, you might recall after January 6th, they had
01:14:57.420 what they call these like break glass measures where it chooses to deprioritize political content
01:15:03.940 in the feed.
01:15:04.660 There are a whole lot of people in politics who get very, very upset about that, who see
that as censorship, who see that reallocation of attention by changing curation as being
01:15:16.840 fundamentally viewpoint based discrimination.
01:15:20.320 And so that's where this tension really comes into play as well.
01:15:23.980 But they're not just there, though.
01:15:26.340 I mean, I understand like we can dispute whether that was the right move or the wrong move.
01:15:30.780 But, you know, being more on the right side of the aisle, I can see the censorship that
01:15:35.440 they do to the conservatives and their viewpoints all the time.
01:15:37.900 And I know, like COVID, that wasn't even a right-left issue.
01:15:41.120 I mean, all my Democrat friends in New York practically are ready to vote Republican over
01:15:45.380 what was done to their families during COVID and the misinformation that was classified
01:15:51.120 online, all of which turned out to be true.
01:15:53.580 You know, like the earlier stuff about how this thing looks like it really came from a lab
01:15:58.040 and, you know, you'd get censored for saying things like that and questions about the vaccine.
01:16:02.100 My friend Dave Rubin, he got banned from Twitter for a while for saying the vaccines don't prevent
01:16:08.180 the spread.
01:16:09.400 And now we know that's true.
01:16:11.340 Like they don't prevent the spread, you know, like anyway, things like that were deemed
01:16:15.180 disinformation that we later learned are not disinformation.
And we should have been allowed to have an open conversation about it and disputed it in the,
01:16:21.380 however you want to call it, public square, or in these forums that we're all on.
01:16:25.120 And that's that's where the word censorship comes up.
01:16:28.560 And they were censored.
01:16:29.420 The Hunter Biden laptop story, right, which Twitter wouldn't allow to be circulated, saying
01:16:34.280 that was disinformation.
01:16:35.700 Meanwhile, their censorship campaign was disinformation.
01:16:38.960 The Hunter Biden laptop was real and people should have been able to see it and make
01:16:43.200 up their minds about whether it mattered.
01:16:45.580 See, I actually agree with you, even approaching it largely from the center-left perspective.
I don't think that censorship, I don't think that the takedowns, actually work.
01:16:56.200 And there's a variety of reasons for that.
But let's stick with COVID, because I think that was an
01:17:01.560 interesting time, because the problem with COVID was the platforms were trying to decide
01:17:06.880 how do we take our health misinformation policies that we've had for years?
You know, Google since 2012 has had a policy called Your Money or Your Life.
01:17:15.880 Right.
01:17:16.500 And what it says is we shouldn't be returning search results to you based on what's popular,
01:17:22.720 because what's popular can be manipulated, because if we're using engagements or links
on a search engine, backlinks or likes or something like that, then we're just
01:17:31.700 surfacing what's popular, which means going back to the bots or the fake accounts.
Anybody who can generate those engagements can kind of trick the curation
01:17:41.280 algorithms into returning stuff that is popular.
So the question that the platforms have to ask is, you know, are there certain areas
01:17:48.460 where they should try to return accurate information?
01:17:52.060 And this is where those original policies came from, this idea of your money or your life,
01:17:55.840 your health or your finances.
01:17:57.160 If you get a cancer diagnosis and you go to Google and you search for the name of your
01:18:01.360 cancer and what comes up is a bunch of like eat some mushrooms and have some peach pits,
01:18:06.140 you know, cyanide cures cancer kind of alt health quackery.
01:18:09.220 You're probably not going to necessarily get the, you know, the kind of most authoritative
01:18:14.560 medical information or links to, you know, to the right hospitals that you might want to
01:18:19.180 connect with.
01:18:19.980 Now, what's happening there is there's actually been a scientific consensus and it's evolving.
01:18:25.440 Science is always evolving.
01:18:26.700 But there's some sort of consensus where there's some sort of expert opinion that says when you
01:18:31.520 get this cancer diagnosis, these are the reputable centers that you should go to.
01:18:35.260 And here's the most reputable information about the various facets of the diagnosis you've
01:18:41.360 just been given.
01:18:42.260 Versus if you go to a social media platform, particularly in the kind of, you know, 2015
01:18:47.780 to 2019 timeframe, and you typed in that same thing, you'd probably find some good support
01:18:53.040 groups, which is very useful.
01:18:54.340 But you might also find just a lot of people who are creating content to try to pull in people
01:18:59.660 who've gotten these diagnoses so that they can sell them something.
And so what was happening during COVID, even prior to the rollout of the vaccines, was
01:19:08.000 you had this novel disease.
01:19:10.100 The health institutions were not producing good content.
01:19:13.180 They weren't saying anything really.
01:19:14.900 They were very reticent to communicate.
01:19:16.820 There was no strong consensus about what had happened.
01:19:20.280 Nobody knew whether it was a lab leak or pangolins in a wet market.
01:19:24.260 And so the platforms, though, had to decide what results do we return in this environment of
01:19:30.780 incomplete consensus.
01:19:31.560 And so that's where you start to see these policies that were made for much more
01:19:39.100 established things.
01:19:40.120 The MMR vaccine, say, is very, very well established.
01:19:43.060 When people search for MMR vaccines on Facebook, particularly after the measles outbreak in
01:19:49.020 Samoa and the measles outbreak in Brooklyn in 2019, they didn't want to surface the most
01:19:53.860 popular content because oftentimes that was not medically reliable or it was from anti-vaccine
01:19:59.760 groups.
01:20:00.020 So what they tried to do was create this policy that said we're going to surface authoritative
01:20:04.560 information from the CDC and WHO.
01:20:07.560 That all really collapsed during COVID because, again, the consensus wasn't there and the
01:20:12.240 institutions weren't producing content.
01:20:14.880 And most importantly, I think institutions are not adept at communicating with the public
01:20:21.120 in this media environment.
01:20:22.600 They don't say, here's what we know and here's what we don't know.
01:20:24.900 They just wait until they think they know something and then they say that then.
01:20:28.260 I feel like I spent most of 2020 to 2022 writing articles about institutional failures and media
01:20:34.940 overreach and social media trying to communicate and just the absolute disaster that the information
01:20:40.480 environment had become.
01:20:41.840 At the same time, what I'm not comfortable with is the idea that large accounts that get a lot of
01:20:49.380 attention because they have a lot of followers, that they've managed to accrue in a totally
01:20:53.060 unrelated space, should be the things that platforms surface just because they have a contrarian
01:20:58.040 perspective about a disease.
01:20:59.580 And that's why I think this curation phenomenon is actually really the question that plagues us.
01:21:03.900 It is, in fact, I think the most important question as we move forward in this environment
01:21:08.700 that's not going away.
01:21:10.180 So, you know, how do we adapt to this?
01:21:11.800 What do we surface?
01:21:12.740 That I think is actually really the key question for us.
01:21:14.920 These are interesting issues.
01:21:16.580 I mean, I agree with your point that the person who gets the cancer diagnosis
01:21:20.700 should be served up the best information from the most respected institutions.
01:21:24.300 And I see your point that that's an established thing.
01:21:26.960 We know the Mayo Clinic is not trying to mislead us on what pancreatic cancer
01:21:31.680 means.
01:21:32.100 We know that; we've lived on this earth long enough to know that. And your distinction
01:21:36.340 about how COVID things went off the rails because it was new and there was too much
01:21:39.880 radio silence.
01:21:41.040 But there's another element to it, too, which is now we know that Fauci
01:21:46.420 and Collins actually did try to suppress the Great Barrington Declaration by
01:21:52.220 three very well respected doctors, Stanford, Harvard.
01:21:55.360 The third one was equally respected.
01:21:57.460 So forgive me,
01:21:58.080 I can't remember what university he was from.
01:22:00.280 Uh, yeah.
01:22:01.600 OK, yeah.
01:22:02.380 So my point is, we've seen now, thanks to the FOIA requests and so on,
01:22:05.840 the active attempt to silence really smart doctors who are thoughtful, who are infectious
01:22:11.740 disease doctors, saying, here's another way we might go about this. And so the
01:22:16.260 institutions with all their power said, no, let's silence them and let's disparage them
01:22:20.560 as quacks.
01:22:21.580 And the social media companies went along with it.
01:22:24.540 That's where people get angry, totally distrustful.
01:22:29.440 You know, there's got to be a way of handling the politics that are behind some of these
01:22:34.160 decisions.
01:22:34.580 In that instance, the left eventually had near total respect and trust in Fauci,
01:22:40.640 and the right had near total, well, the opposite.
01:22:43.200 The more the social media companies pick a side in making those decisions,
01:22:49.040 the more aggravating they are, the more they infuriate people, the more people choose other
01:22:54.040 forums where you can really go into a rabbit hole, like Reddit.
01:22:57.380 You know, I just don't know what the solution is, but I see the problem very clearly.
01:23:02.000 Yeah, I spend a lot of my time trying to think about the solutions.
01:23:06.060 You know, I think there's a couple of things.
01:23:09.320 Again, I listened to your interview with Robert F. Kennedy, Jr., right.
01:23:14.100 And I'm stridently pro-vaccine.
01:23:17.040 That's just where my personal politics are.
01:23:19.380 Right.
01:23:19.760 You know, I have three kids.
01:23:20.840 And I did a lot of arguing in
01:23:27.560 California because I actually do think that childhood vaccines should be required for
01:23:31.400 school.
01:23:31.660 So there's my disclosure of bias.
01:23:33.280 And RFK Jr. was involved in that legal battle that we had in California.
01:23:39.220 But what I thought was very interesting about your interview with him was that you
01:23:44.680 had the conversation, right?
01:23:45.880 There was the dialogue there, but you also did the fact check right alongside it.
01:23:50.520 You know, you spliced it in.
01:23:51.820 In some cases, you asked hard questions, you pushed back.
01:23:54.440 And one of the things that I think is really challenging about social media is that
01:23:58.740 we self-select to some extent, and then the algorithms reinforce who we want
01:24:05.460 to follow.
01:24:05.900 And so it's very hard to see those ideas juxtaposed, to actually have that counter speech, to have
01:24:12.860 that correction.
01:24:14.380 And so one of the questions for social platforms: just for any of your
01:24:18.760 listeners who aren't familiar, platform moderation roughly falls into three buckets.
01:24:22.900 There's remove, which is what it sounds like.
01:24:24.980 They're going to take it down.
01:24:26.020 Then there's reduce, which is they're going to algorithmically throttle it.
01:24:29.200 And we can talk about that.
01:24:30.660 That was used on the Hunter Biden laptop story by Facebook, even as Twitter went the remove
01:24:36.040 route.
01:24:36.460 Right.
01:24:36.600 So there are differences in how platforms choose to respond to these things.
01:24:41.340 And then there's inform, which is the posting of the fact check,
01:24:46.100 or of some sort of contextualization or counter speech right alongside the content,
01:24:51.680 in an effort to make people realize that there's a matter of
01:24:57.700 debate. I like that. I like that third one.
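The three moderation buckets Renee describes (remove, reduce, inform) can be sketched as a small piece of code. This is purely an illustrative toy, not any platform's actual API; the enum names, the post fields, and the throttling factor are all invented for the example.

```python
from enum import Enum

class ModerationAction(Enum):
    # The three buckets described above; values are illustrative.
    REMOVE = "remove"  # take the content down entirely
    REDUCE = "reduce"  # leave it up, but algorithmically throttle its reach
    INFORM = "inform"  # leave it up, but attach a fact check or context label

def apply_action(action: ModerationAction, post: dict) -> dict:
    """Toy dispatcher showing how each bucket changes a post's state."""
    post = dict(post)  # don't mutate the caller's copy
    if action is ModerationAction.REMOVE:
        post["visible"] = False
    elif action is ModerationAction.REDUCE:
        post["rank_multiplier"] = 0.1  # hypothetical throttling factor
    elif action is ModerationAction.INFORM:
        post["label"] = "See additional context"
    return post

labeled = apply_action(ModerationAction.INFORM, {"visible": True, "rank_multiplier": 1.0})
```

Note that only remove makes content disappear; reduce and inform leave it visible, which is the distinction Renee draws between Facebook's and Twitter's handling of the laptop story.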
01:25:03.060 And I never complain when YouTube wants us to throw up the CDC website. That's
01:25:08.160 fine.
01:25:08.540 I don't care.
01:25:09.080 I truly am like, the more speech, the better.
01:25:11.600 Like, you should check out what the CDC is saying.
01:25:13.320 It's up to you.
01:25:14.060 You're smart enough viewers and listeners to figure it out for yourself.
01:25:16.880 I have no problem with that, but I definitely do have problems with the other two in most
01:25:22.540 instances.
01:25:22.960 I mean, there's certainly some content. I know you've written about, say, the ISIS videos
01:25:26.660 on how to make a bomb.
01:25:28.020 Those serve absolutely no social purpose and shouldn't be allowed to stay up.
01:25:31.520 But that's very different from saying, I think COVID started in a lab.
01:25:35.520 Yeah, I agree.
01:25:36.040 I mean, hey, I am not defending the moderation choices of platforms.
01:25:40.720 My team is at Stanford Internet Observatory,
01:25:45.040 so I am a researcher of this stuff, and we ask these questions. We ask,
01:25:50.840 was content uniformly actioned?
01:25:54.940 It's a very nerdy way to put it.
01:25:57.040 But when a platform decides that something violates its policies: first, is the policy
01:26:02.100 clearly articulated?
01:26:03.080 This is something the Facebook Oversight Board in particular publicly puts out
01:26:07.440 assessments on.
01:26:07.880 Was this policy clearly articulated?
01:26:09.760 Then, was it fairly applied?
01:26:12.000 And then, was it uniformly applied?
01:26:14.380 And what we see sometimes is that one guy's content does get a label or come
01:26:19.180 down, and another piece of content with the exact same claim does not.
01:26:23.580 And I think that feeling of unfairness, of enforcement
01:26:27.240 unfairness, is one thing that people can constantly point to, because there are millions
01:26:31.660 and millions of posts.
01:26:32.900 I think there's not a single group on social media that I can think of where I haven't seen
01:26:36.740 some sort of claim that social media is biased against them.
01:26:39.640 But the way we actually analyze these things is we make
01:26:47.220 a very discrete, very small data set where we say, OK,
01:26:50.880 here are all the platform actions on election misinformation in 2020, on URLs that we looked
01:26:56.360 at that were fact-checked to be false.
01:26:58.400 OK, then what happened?
01:26:59.900 And that's the kind of work that we try to do now, where we ask, is the enforcement
01:27:03.880 fair, and is the policy fair?
01:27:07.320 These are actually two different questions.
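The "uniform actioning" check Renee describes can be sketched roughly as: group posts by the claim they make, then flag any claim where identical content received different enforcement. The records and field names below are invented for illustration, not the Stanford Internet Observatory's actual data format.

```python
from collections import defaultdict

# Hypothetical records: each post makes some fact-checked claim and
# records what action, if any, the platform took on it.
posts = [
    {"claim_id": "c1", "action": "label"},
    {"claim_id": "c1", "action": None},      # same claim, no action taken
    {"claim_id": "c2", "action": "remove"},
    {"claim_id": "c2", "action": "remove"},  # same claim, same action
]

def non_uniform_claims(posts: list) -> list:
    """Return claim_ids where posts with the same claim got different actions."""
    actions_by_claim = defaultdict(set)
    for p in posts:
        actions_by_claim[p["claim_id"]].add(p["action"])
    return sorted(c for c, acts in actions_by_claim.items() if len(acts) > 1)

flagged = non_uniform_claims(posts)  # claim c1 was enforced inconsistently
```

This separates the two questions in the conversation: whether the policy itself is fair is a judgment call, but whether it was uniformly applied is something a small, discrete data set like this can actually measure.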
01:27:09.520 Yeah.
01:27:09.600 Do you have different ideologies on your team?
01:27:13.040 Do you have conservatives with you?
01:27:14.720 I would like us to have more conservatives.
01:27:17.520 I think that this is the kind of constant, chronic challenge of academia.
01:27:23.540 I think there's a fair bit of healthy debate among the team,
01:27:28.960 actually. And one way that we try to deal with this is through inter-institution
01:27:35.040 partnerships, where we say, what are ways in which we can engage with civil society or
01:27:40.540 other academic institutions that have a different perspective or that have
01:27:45.080 different data sets?
01:27:45.940 And I really think that's the solution.
01:27:49.120 That's the next move.
01:27:50.680 You know, I've told my audience this, but I went out.
01:27:53.340 I spoke at Google and Facebook, and I've been at Snapchat.
01:27:56.060 A bunch of these media companies have asked me to go out
01:27:58.960 and talk to them about how they can be more fair.
01:28:01.820 And I did that.
01:28:03.480 And I told them all the same thing.
01:28:04.920 And listen, I'll just say, I've been a registered Democrat.
01:28:07.580 I've been a registered Republican.
01:28:09.120 I've been a registered independent for the past 20 years.
01:28:11.960 I just don't like wearing anybody's team jersey.
01:28:14.080 And there's too many losers in each party for me to associate with them.
01:28:17.860 So I'm not ideological.
01:28:19.820 You know, I would say my sensibilities lean center right.
01:28:22.040 As you said, you're center left.
01:28:23.180 But my advice to the companies was, wherever you stand, you've got to get more conservatives
01:28:27.860 in on these decisions, so that you can make sure: you've got your hands at ten and
01:28:32.940 two, and this thing pulls to the left.
01:28:34.980 So you've got to get somebody in there to make sure it doesn't pull to the left too much.
01:28:38.580 Otherwise you will wind up with biased decisions and upset consumers and so on.
01:28:44.660 And I'm sure it is hard in academia, but talk to David Sacks.
01:28:48.380 He's in tech and he's a conservative.
01:28:50.820 You probably know some people.
01:28:52.620 We know a lot of the same people, actually.
01:28:54.540 I think so. Silicon Valley is where I spent a lot of time in
01:29:00.160 tech prior to going into academia.
01:29:01.740 So I've only been in academia for three years.
01:29:03.460 I'm not exactly an entrenched academic.
01:29:08.680 We'll take it.
01:29:09.380 Pardon?
01:29:10.060 Even center left is good in academia.
01:29:11.800 Well, you know, I mean, it's also interesting, though, how you
01:29:16.080 get read.
01:29:17.720 I think I am occasionally read as center right.
01:29:23.080 You know, I was pretty active in, well, you mentioned COVID and
01:29:27.820 outrage at how kids were treated.
01:29:29.560 And I was pretty outraged about a lot of that myself, a lot of the school closure stuff.
01:29:33.160 And so I think there's a tendency to reduce people down to an
01:29:37.720 ideological persona.
01:29:39.620 And I think that it's a kind of short-term heuristic.
01:29:44.340 But I think the important thing is having those people who
01:29:49.080 lean more in one direction or another.
01:29:51.260 I will say that there are certain things that are,
01:29:55.800 in fact, quite demonstrably false.
01:29:58.800 And then comes the question of how do we handle those?
01:30:01.020 Because you have a free speech right to say nonsense whenever you want to.
01:30:05.940 It will cure your COVID.
01:30:07.920 Right.
01:30:08.360 And on anything, elections, COVID, vaccines, you name it, you
01:30:12.900 have your right to your bad, wrong opinion and your right to express it.
01:30:16.600 Right.
01:30:16.760 And so the question becomes, because everything is curated, what should be surfaced?
01:30:22.260 Like, what is the ranking function?
01:30:23.660 And that's where I think this tension really is: what is upranked or downranked?
01:30:32.000 To what extent should factual accuracy be factored in?
01:30:34.780 No, I get that.
01:30:36.020 But I also think, to what extent should expertise be factored in?
01:30:38.480 That's a real tough one, too.
01:30:40.400 Well, I agree.
01:30:40.940 I agree with that.
01:30:41.540 But I also think it gets so gray, because you could say, yes, I agree with you
01:30:48.920 on election misinformation.
01:30:49.760 There are definitely claims that are false.
01:30:52.680 But there could also be political bias definitely playing a role in what gets deemed false and
01:30:58.440 what doesn't.
01:30:59.940 And, for one example, there's a very strong difference of opinion
01:31:03.940 about whether the mail-in ballots should have been allowed in Pennsylvania, given
01:31:08.060 the way they changed the voting, whatever.
01:31:10.100 There was a legitimate dispute there.
01:31:12.120 So if somebody's tweeting out, the vote in Pennsylvania is not legitimate, there may
01:31:16.080 actually be a very good basis to say that, and it has nothing to do with the Kraken, Sidney
01:31:19.280 Powell, Rudy Giuliani, or any of that.
01:31:21.020 So it's like, who's going to make that decision?
01:31:22.940 John Stossel, my old pal. He was at Fox Business, and he was at ABC for years before that.
01:31:28.060 He did this great exposé.
01:31:30.040 I'm going to send it to you if you haven't seen it, but it's basically
01:31:32.480 on how he tried to do some environmental reporting that challenged some of the green
01:31:37.180 energy crowd's assertions.
01:31:39.240 And he had both sides, and they censored him, and he went through the
01:31:44.480 layers to see, why was I censored?
01:31:46.120 And it truly was just a matter of opinion.
01:31:48.340 They couldn't point to anything he had said that was factually wrong.
01:31:51.020 It was infuriating.
01:31:51.900 Anyway, I got to leave it at that because I got to squeeze in this quick break, but
01:31:54.240 I'm going to bring you back.
01:31:55.220 Come back for a couple of minutes on the opposite side before we have to end.
01:31:57.680 This is fascinating.
01:31:58.380 And I really do want to ask you about how people can tell whether they're interacting
01:32:01.100 with a real life human on LinkedIn, on Twitter, on any social media, because Renee is the
01:32:06.040 person who knows.
01:32:07.200 Stand by.
01:32:11.520 All right, Renee.
01:32:12.260 So there you were online and you got a message on your LinkedIn.
01:32:18.160 Seemed normal enough, according to the report from NPR.
01:32:21.020 From Kenan Ramsey, wanting to connect with you.
01:32:24.740 You're both in a LinkedIn group for entrepreneurs.
01:32:26.980 And we have a picture of Kenan here.
01:32:28.600 And you thought Kenan looked a little weird and your expertise led you to start asking
01:32:33.400 questions.
01:32:34.120 And Kenan is fake news.
01:32:35.820 She's a fake person, just like my Louis at Air France.
01:32:40.180 Fake news.
01:32:40.780 So how do we figure out when we're interacting with someone who is fake?
01:32:48.380 So AI can be used to generate wholly novel faces.
01:32:52.160 It's done with generative adversarial networks;
01:32:54.740 GANs is the terminology that you'll sometimes see.
01:32:57.240 But AI can generate images, videos, text,
01:33:00.120 at this point now.
01:33:00.760 And so what happened there was, I got this InMail from this person.
01:33:06.240 You can usually tell by the eyes, nose, and mouth.
01:33:09.080 When a computer generates a face, it uses a certain kind of grid,
01:33:12.140 and so it puts the features roughly in a particular line.
01:33:14.800 And if you superimpose a lot of these faces on top of each other, you'll actually see that
01:33:19.300 the eyes, nose, and mouth are always in the same place.
01:33:22.940 A lot of times the hair is wrong.
01:33:25.620 It blends into the background in some way.
01:33:27.900 The collar melds into the neck, melds into the hair, melds into the background.
01:33:32.320 The teeth are often wrong.
01:33:35.100 The pupils, actually: for some reason, computers don't do a great job with pupils.
01:33:39.360 The pupils are wrong.
01:33:40.780 The ears are weird.
01:33:42.260 And it doesn't know what to do with jewelry.
01:33:43.740 It doesn't understand earrings.
01:33:45.480 So you'll have one earring, but not another.
01:33:48.820 I think it's one of these things where if you've seen enough of them, they kind of jump
01:33:51.520 out at you.
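The visual tells Renee lists lend themselves to a simple checklist. A rough sketch: score a profile photo by the fraction of known GAN artifacts a human reviewer (or an upstream detector) has flagged. The tell names and the threshold are invented for illustration; this is not a real detection system.

```python
# Known artifacts of GAN-generated faces, per the list above.
GAN_TELLS = {
    "features_on_fixed_grid",  # eyes, nose, mouth always land in the same spots
    "hair_blends_into_background",
    "collar_melds_into_neck",
    "malformed_teeth",
    "irregular_pupils",
    "asymmetric_ears",
    "mismatched_jewelry",      # e.g. one earring rendered, the other missing
}

def gan_suspicion_score(observed: set) -> float:
    """Fraction of the known GAN tells observed in a photo, from 0.0 to 1.0."""
    return len(observed & GAN_TELLS) / len(GAN_TELLS)

def looks_generated(observed: set, threshold: float = 0.25) -> bool:
    """Arbitrary cutoff: flag when a quarter or more of the tells appear."""
    return gan_suspicion_score(observed) >= threshold

suspicious = looks_generated({"irregular_pupils", "mismatched_jewelry"})
```

As Renee says, in practice this is pattern recognition: once you've seen enough of these faces, the tells jump out without any scoring at all.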
01:33:51.880 I thought it was interesting because, of all the places, it was
01:33:57.160 novel to me to receive a message on LinkedIn in particular.
01:34:00.180 These things are all over Twitter.
01:34:01.820 And they're used because it's very hard:
01:34:04.160 you can't just reverse image search and figure out that it's a stock photo that
01:34:07.780 this fake account is using.
01:34:09.740 But people do now at this point actually, interestingly, use them just because they want to be anonymous
01:34:14.340 online as well.
01:34:15.320 So again, this question when we were talking earlier, what's a bot?
01:34:18.480 What's a troll?
01:34:19.180 What's a fake account?
01:34:20.060 Who's really behind something?
01:34:22.280 It's, it is increasingly hard to tell.
01:34:24.760 And that's where I think platforms do have an important role to play here in, in identifying
01:34:29.800 these networks themselves.
01:34:31.440 Well, Renee's done a lot of great work on this. You know how the AI on your
01:34:35.620 computer can figure out what word you're typing or how you want to end your sentence?
01:34:39.400 It's so far beyond that.
01:34:40.820 They can generate whole articles now that get sent out that were not written
01:34:44.960 by a human, and the future is more of that, and they're getting better at it.
01:34:49.660 And that's what we're up against, which is why we need Renee and her group to be out
01:34:53.820 there foreseeing it and cluing us all in on, you know, how to avoid it or recognize it.
01:34:59.260 So we appreciate you and what your team does and your expertise, Renee.
01:35:02.140 Thank you.
01:35:02.660 Thank you so much for coming on.
01:35:04.440 Thanks for having me.
01:35:05.600 All right.
01:35:05.780 I want to tell you that this Friday, Andrew Schultz is making his return to the show.
01:35:09.600 This is going to be big news.
01:35:10.900 You'll find out why.
01:35:12.100 Stay with us.
01:35:13.120 See you tomorrow.
01:35:15.180 Thanks for listening to The Megyn Kelly Show.
01:35:17.380 No BS, no agenda and no fear.