The Megyn Kelly Show - May 23, 2022


A Culture of Fear, Social Media Toxicity, and America's Descent Into Stupidity, with Jonathan Haidt | Ep. 327


Episode Stats

Length

1 hour and 34 minutes

Words per Minute

203.3

Word Count

19,206

Sentence Count

1,257

Misogynist Sentences

10

Hate Speech Sentences

23


Summary

Jonathan Haidt is the author of The Righteous Mind: Why Good People Are Divided by Politics and Religion and the co-author, with Greg Lukianoff, of The Coddling of the American Mind. He's also the author of the Atlantic essay Why the Past 10 Years of American Life Have Been Uniquely Stupid.


Transcript

00:00:00.500 Welcome to The Megyn Kelly Show, your home for open, honest, and provocative conversations.
00:00:11.700 Hey everyone, I'm Megyn Kelly. Welcome to The Megyn Kelly Show.
00:00:15.140 Here's the question for you on a Monday morning.
00:00:17.460 Are we living through a uniquely stupid time in American history?
00:00:22.720 Have you woken up recently and said to yourself, how the hell did we get like this?
00:00:28.000 What happened to the America of 10 years ago?
00:00:32.080 What's going on? Why can't we talk to each other?
00:00:34.260 Why does everyone hate each other?
00:00:35.980 Why is there no more truth?
00:00:38.140 Well, we've got some answers for you today, and they're great.
00:00:40.980 They're so insightful. I've really enjoyed preparing for today's interview.
00:00:44.060 Here with us today to explain our societal lapse in intelligence, among other problems,
00:00:48.780 is social psychologist at the New York University Stern School of Business, Jonathan Haidt.
00:00:55.000 Jonathan is also the author of the New York Times bestsellers The Righteous Mind:
00:00:59.520 Why Good People Are Divided by Politics and Religion,
00:01:01.980 and the absolutely brilliant and just game-changing The Coddling of the American Mind,
00:01:08.800 which he co-wrote with Greg Lukianoff, who's also been on the show earlier,
00:01:14.340 talking about what's happening at universities and his work to document it and fight for free speech.
00:01:19.480 Now, Jonathan's latest Atlantic column from April is called
00:01:23.580 Why the Past 10 Years of American Life Have Been Uniquely Stupid,
00:01:27.120 and man, did it get everyone talking because of its brilliant insights.
00:01:30.380 His mission is to use research on moral psychology to help people understand each other
00:01:35.180 and to help important social institutions work better so you can sense his frustration
00:01:40.580 because those things are not going so well right now.
00:01:47.880 Jonathan, welcome to the program.
00:01:49.480 Thanks so much, Megyn. What a pleasure to be talking with you.
00:01:52.220 Oh, the pleasure is entirely mine.
00:01:54.520 Cannot wait to tap into your wealth of intellectual resources.
00:01:58.360 So let's start with, because I love The Coddling of the American Mind.
00:02:02.380 It just had such great insights, and it covered a lot of stuff that I'd been covering on the news
00:02:06.260 for the previous 10 years because I lived it.
00:02:08.980 And when I think about your latest round of research, I think about the birth of my children.
00:02:13.420 I had a son in 2009.
00:02:14.940 I had a daughter in 2011, and then I had a son in 2013.
00:02:19.500 And so that's the time frame during which we lost our collective ever-loving minds as a country, right?
00:02:25.960 So the mark of their arrival corresponds with the time when our society just went nuts.
00:02:32.880 It went nuts.
00:02:33.660 And I know that we've felt it.
00:02:36.080 I know the listeners to this show and the viewers have felt it, but perhaps not diagnosed.
00:02:41.720 How did it happen?
00:02:43.280 How?
00:02:43.940 All the stuff that, you know, Greg's been working against on college campuses and that led you guys to write The Coddling,
00:02:49.020 all of that, it's so far beyond what it was when you looked at it.
00:02:56.180 It's out of the frying pan and into the fire.
00:02:58.380 Things have exploded.
00:02:59.360 It's gotten so much worse.
00:03:01.160 And I know we all know that, but why and how can it be stopped?
00:03:05.480 Because you can't arrest it unless you understand it.
00:03:07.500 That's where your latest round of research and writing and your next book come in.
00:03:12.420 So let's start there on, weirdly, the Tower of Babel.
00:03:17.640 Okay, explain that, what that is, and why that's your focus.
00:03:21.560 Sure.
00:03:22.600 So I've been a professor since 1995, and I love being a professor.
00:03:27.800 I love universities.
00:03:29.320 And it just seemed like all of a sudden, in 2014, something changed.
00:03:33.320 Something changed, like, in the fabric of space-time.
00:03:36.280 And weird stuff started happening.
00:03:38.720 And I saw some of that on campus here at NYU.
00:03:41.500 And then Greg Lukianoff came to talk to me.
00:03:43.820 We'd met through a mutual friend in May of 2014.
00:03:46.480 And he said, John, weird stuff is happening.
00:03:48.380 And he had a theory as to why.
00:03:50.140 Well, it turns out our theory was partially wrong.
00:03:53.400 We thought universities were causing this to happen.
00:03:56.700 And now we know, no, this was a much bigger thing happening that affected Gen Z.
00:04:01.100 It affected kids born after 1996.
00:04:03.460 So your kids, my kids, my daughter is 12, my son is 15.
00:04:07.560 So I've been trying since 2014 to figure out what on earth happened.
00:04:12.420 Why did so many things change in such weird and strange ways?
00:04:15.560 And I've always been looking for metaphors.
00:04:17.420 I think we need metaphors to understand anything complicated, anything that doesn't
00:04:21.560 sort of fit easily into our minds.
00:04:23.720 And it was when I went back and reread the Babel story.
00:04:26.860 It's, you know, it's a short little story in Genesis.
00:04:29.020 And it's that the descendants of Noah are spreading out across the plain of Shinar, and they decide
00:04:36.140 to build a city with a tower to reach onto heaven.
00:04:39.240 And God thinks this is hubristic.
00:04:41.460 Well, there are a variety of theories I've heard as to why God didn't like this.
00:04:46.720 But in any case, God, he doesn't actually physically destroy the tower.
00:04:50.220 What the text says, and this is the key line, he says, let us go down and confuse their language
00:04:56.020 so that they may not understand one another.
00:04:59.480 And so I reread that.
00:05:00.320 I found that story again a couple of years ago, and I thought, oh my God, that's it.
00:05:03.360 That's what happened to us.
00:05:04.660 Because it's not just a story about tribalism, like left versus right.
00:05:07.780 That's been getting worse since the 1990s.
00:05:09.900 It's a story about how everything has come apart.
00:05:12.740 And if you have a group of people that are entirely on the left or entirely on the right,
00:05:17.500 they're going to fight and find ways to fragment internally.
00:05:20.680 So something changed.
00:05:21.980 It feels like everything is crumbling since the early 2010s.
00:05:26.020 And in the metaphor, is the new God social media?
00:05:32.420 Or the devil, or whatever you want to call it.
00:05:35.900 It's not a force for good.
00:05:37.480 It's not a force for good.
00:05:39.340 And so, you know, look, we've all seen dozens of articles about how
00:05:44.480 social media is destroying everything and destroying kids.
00:05:48.100 And I think why my article is a little different is that I'm a social psychologist.
00:05:52.120 And I wasn't just saying, you know, it's bad and here's why.
00:05:55.440 I was really trying to dig into what exactly is it that it did to social relations?
00:06:00.480 What did it do to communication?
00:06:02.880 And so I think the best way in is to put it in a narrative form, where once
00:06:08.700 upon a time, we had this incredible time of optimism in the 1990s.
00:06:13.560 Those of us old enough to remember: the 1990s, with the end of the Cold War and, you know,
00:06:19.680 new technology.
00:06:21.940 And America even had a budget surplus for the first time in a long time.
00:06:26.440 It was this amazing time of techno-democratic optimism.
00:06:30.020 We thought that we defeated the Soviets.
00:06:31.960 We defeated all the authoritarians.
00:06:33.780 It's going to be liberal democracy from here on in.
00:06:35.840 And then when social media comes in, in the early days, 2004 or so, people think this is
00:06:43.080 going to be a great force for democracy.
00:06:44.680 This is going to democratize power and voice.
00:06:48.160 And in 2011, that amazing year begins with the Arab Spring, where Facebook
00:06:54.240 and a few other platforms helped the people of Egypt and Tunisia.
00:06:57.820 It helped them to bring down those Arab dictators.
00:07:00.580 And we thought democracy is going to break out in the Arab world.
00:07:03.520 And that year ends with Occupy Wall Street.
00:07:05.420 And once again, a populist movement, now this is more of a left-wing populist movement, but
00:07:08.880 the people have power.
00:07:10.620 So it was a time of enormous optimism about the power of this technology to help democracy.
00:07:17.180 But then after that, everything turns around.
00:07:19.300 And now democracy is on the back foot, as they say in Britain, and authoritarians, I know
00:07:25.140 you had Tristan Harris on your show.
00:07:26.880 Tristan points out that China is using this technology.
00:07:30.140 To make themselves better authoritarians, as it were.
00:07:33.520 And it's making us worse democrats, worse at democracy.
00:07:37.720 So what I was trying to show is that the central problem with social media is that it doesn't
00:07:42.060 actually make us communicate.
00:07:43.480 It doesn't. You know, we can communicate with text and Zoom and phone calls.
00:07:46.900 There's all kinds of ways to really talk to someone authentically.
00:07:49.680 But when you put something out on social media, you're performing.
00:07:52.460 And then you wait to see what everyone says about it.
00:07:54.780 And so it's this incentive that makes us want to perform for others, to impress still
00:08:00.640 others, often strangers.
00:08:02.240 That's what warps everything.
00:08:04.080 And that's what encourages us to just fight among ourselves constantly.
00:08:09.040 And we can't have a democracy if we're just fighting among ourselves all the time.
00:08:13.160 The founding fathers knew that.
00:08:14.260 I remember all the good press about social media back in 2011 during the Arab Spring,
00:08:19.760 which also had a different ending than the one that we were hoping for when that broke.
00:08:26.460 But to me, it's almost like, you know, you go out on the date with the person.
00:08:29.820 And on the first date, the person sweeps you off of your feet and they're utterly charming
00:08:34.840 and they're beautiful and they're smart.
00:08:36.780 And you're thinking, oh, my God, this is wonderful.
00:08:38.720 And then you flash forward a few years.
00:08:40.680 This is our relationship with social media as well.
00:08:42.620 And that the person's become a stalker.
00:08:45.180 The person's extremely controlling.
00:08:46.820 The person can't let go of you or your time.
00:08:49.120 The person's made you less happy.
00:08:51.040 Right.
00:08:51.220 Like this is the 30,000-foot zoom-out on what's happened between us and social media.
00:08:57.220 And by us, I mean our society, our world that goes beyond America, as you do a good
00:09:02.140 job of pointing out every time you get back to it.
00:09:04.680 Is it just American girls that are deeply depressed?
00:09:07.700 It's not.
00:09:08.360 If you look at other societies, all these things are spiking in sort of the Western world,
00:09:13.020 and the onset of it is always linked back to right when social media arrived. It's
00:09:19.940 not even just the creation of the iPhone in 2007.
00:09:21.940 It's the birth and explosion of social media, which was the game changer.
00:09:27.220 Now, this is from your article.
00:09:30.100 You write: something went terribly wrong,
00:09:31.520 very suddenly. We are disoriented, unable to speak the same language (back to the Babel
00:09:36.020 comment) or recognize the same truth.
00:09:38.680 We are cut off from one another and from the past.
00:09:41.780 So I was feeling this just last week, that "or recognize the same truth" part, you know, in the wake of the
00:09:47.220 terrible Buffalo shooting.
00:09:48.560 Look what the country did.
00:09:49.880 Right.
00:09:50.320 You must have been watching this, John, like half the country went to this is all to be
00:09:55.120 blamed on the right, on the political right.
00:09:57.220 I mean, there was literally an article in Rolling Stone saying that the shooting was
00:10:00.240 a mainstream Republican type of ideology at work.
00:10:03.700 And then, you know, the right looking at it and seeing something very different and
00:10:07.680 and being, I think, outraged that anybody tried to blame it on a political party as
00:10:13.240 opposed to radicalization by a guy who was drawn to the Internet and wasn't doing well
00:10:16.440 emotionally, mentally and so on.
00:10:18.120 But it really is at the point where it's just two totally different truths.
00:10:21.380 You know, like if you watched MSNBC last week, you would have thought you are an evil person
00:10:25.480 if you are a registered Republican.
00:10:27.440 You are hashtag part of the problem of massive white supremacy and mass shootings in America.
00:10:31.980 And you would have felt something very different had you tuned into Fox News or any other
00:10:36.860 more conservative digital media.
00:10:38.460 Right.
00:10:39.040 And you could do that on any given week of the year.
00:10:41.600 Yeah, that's right.
00:10:45.140 Well, so the first thing to keep in mind as we go through this is that most Americans
00:10:49.820 are actually pretty reasonable.
00:10:51.720 Most Americans don't want to attack anyone or destroy their reputation.
00:10:57.460 Most Americans are sick and tired of what's going on.
00:11:00.400 Part of what social media did is it changed who has voice, as it were.
00:11:07.660 It's always the case that the people on the far right and far left are going to be more
00:11:10.660 passionate.
00:11:11.160 Whoever's more passionate is going to talk more.
00:11:13.240 So they're always going to have more voice, more representation than the people in the
00:11:16.240 middle.
00:11:16.880 But when social media becomes very widely used, and this is really around 2011, 2012, is when
00:11:22.540 most people now have a smartphone and now they can be on it every day, 10, 20, 50 times
00:11:27.920 a day.
00:11:29.220 When social media becomes widely used and it becomes much more viralized,
00:11:34.420 which we'll talk about in a moment, I hope, the extremes are going to have much more
00:11:37.680 voice.
00:11:38.680 And the middle 80% of the country, we just keep our head down.
00:11:42.220 We don't want to be in the shooting war.
00:11:44.520 And so it looks as though we all hate each other.
00:11:47.440 It looks as though all there is out there is extremists.
00:11:49.400 That's not true.
00:11:50.560 That's part of the hall of mirrors that social media does to us.
00:11:53.920 Our minds evolved to really care what public opinion is.
00:11:59.180 We want to size up.
00:12:00.240 What are people thinking?
00:12:00.980 And in a small community, you can actually tell what people are thinking; you just
00:12:06.080 look at their faces as people are talking.
00:12:08.360 But in the social media world, we have no idea what people are thinking.
00:12:11.120 All we know is what people are tweeting or posting on Instagram, whatever it is.
00:12:14.180 And that's never representative of public opinion.
00:12:16.620 It really is a distorted picture.
00:12:17.700 That's part of what happened to us.
00:12:18.980 Yeah.
00:12:19.700 Yeah.
00:12:20.100 So let's go through it because you take on the three major forces that bind successful
00:12:24.560 democracies together.
00:12:25.560 And you make the case that social media has undermined all three in America and beyond.
00:12:31.320 Number one, social capital, which you describe as extensive social networks with high levels
00:12:35.940 of trust.
00:12:36.680 Two is strong institutions.
00:12:38.020 Three is shared stories.
00:12:39.500 Those get much more interesting as we get into the details.
00:12:42.640 So the first one, social capital, extensive social networks with high levels of trust.
00:12:47.580 What does that mean?
00:12:49.020 So social capital is one of the most common terms in the social sciences and refers
00:12:54.240 to the fact that if you have two companies and one has a lot of financial capital, you
00:12:59.440 know, money that they can invest, then if everything else is equal, that company is going to outperform
00:13:04.220 the one with less financial capital.
00:13:06.760 Similarly, if you have two companies or sports teams or towns or nations, identical in all
00:13:12.940 respects, except one, people really trust each other.
00:13:15.580 I don't have to monitor.
00:13:17.220 Like if we're having an election, I don't think that you're going to cheat and steal.
00:13:21.340 We trust each other.
00:13:22.360 That country is going to be much more successful than one with low trust.
00:13:26.340 And we've seen this throughout the 20th century.
00:13:28.300 That was part of the problem with communism.
00:13:29.660 The communist countries, everyone knows everyone is lying and there's no trust.
00:13:33.840 So America used to have very high trust up until the 60s or early 70s, equivalent to many
00:13:39.560 of the most successful European countries.
00:13:42.560 But we've been on a downward slide.
00:13:44.220 And so if you lose trust, if you lose social capital, now that brings us to the next issue,
00:13:50.580 which is strong or shared institutions, institutions that we trust.
00:13:55.700 A dictatorship is based on the strength of the ruler and the army and his ability to intimidate
00:14:00.740 everyone into obedience.
00:14:02.540 A democracy such as ours, or a republic, or whatever you want to call it, is different.
00:14:05.660 The founding fathers didn't want a king or a monarch; they believed in government of
00:14:10.880 the people, by the people.
00:14:12.500 And so we were lucky to inherit good British institutions, and then
00:14:18.300 we improved them.
00:14:19.500 And those have lasted us well.
00:14:21.120 Obviously, they've performed badly at certain points.
00:14:23.400 But on a world historical scale, American institutions work very, very well.
00:14:28.160 And now they're malfunctioning.
00:14:29.980 We trust them less, in part because they are less trustworthy, but also in part because we
00:14:34.360 are just saturated, saturated with stories about their failures.
00:14:38.340 Some of those stories are true, some are false.
00:14:40.840 So if we don't trust each other, if we don't trust our institutions, including even our courts,
00:14:46.720 our legislatures, our public schools, if we don't trust them, it's going to be very
00:14:51.860 hard to have a country. We could actually fail as a country.
00:14:56.480 I know, the third. The third is the shared stories.
00:14:59.820 Yeah, shared stories.
00:15:00.760 Go ahead.
00:15:01.620 Yeah, the third is shared stories.
00:15:02.860 So the secret to binding people together, you know, in my writing, I'm very interested
00:15:11.220 in evolution.
00:15:12.020 I look at how other species get cooperation, and it's almost always because they're siblings.
00:15:16.460 Bees are the quintessential example.
00:15:18.240 You can have, you know, millions of bees or ants cooperating because they're all sisters.
00:15:22.320 Um, humans can cooperate at the level of millions too, but we're not siblings.
00:15:27.400 We do it because we have shared stories.
00:15:30.480 We have a common understanding of what we're doing.
00:15:32.100 We make something sacred.
00:15:33.460 It can be a God.
00:15:34.460 It can be the declaration of independence.
00:15:36.560 It can be a physical place.
00:15:39.340 Um, we make something sacred.
00:15:40.700 Then we circle around and we worship something together, or we hold it as sacred together.
00:15:45.360 We can cooperate.
00:15:47.280 And America got a big boost from World War II and the Cold War.
00:15:50.420 We had a real story of who we were, why we were fighting for good.
00:15:53.780 We had really good, clear, evil enemies in the 20th century.
00:15:57.340 So that really bound us together.
00:15:59.600 Social media gives us all our own story.
00:16:02.420 Everyone has their own little fragment of a story.
00:16:04.780 It's almost impossible to knit it together into a common story.
00:16:07.720 And identity politics has given people a new thing to latch onto at the expense of, yeah,
00:16:14.740 the American story and our history and pride in country, which is problematic.
00:16:19.320 It is definitely problematic.
00:16:21.180 And I know you've been raising the flag on that for a long time.
00:16:23.420 So, back to the performance issue, because I think it's important what
00:16:27.280 happens on social media and why it's been so pernicious: it's no longer about
00:16:32.680 coming up through middle school and high school and making a verbal error that your
00:16:39.100 friends give you a little brush-back on.
00:16:41.320 It's about complete fear about what's going to happen to you on social media or actually
00:16:46.400 making a misstep and being absolutely ruined at a young age.
00:16:49.200 And at the same time, it's about posting the perfect selfie on Instagram, as opposed to
00:16:56.800 just spending time with your friends and laughing and swimming in the pool and riding your bike
00:17:02.060 around the neighborhood and all that stuff that actually formed true human bonding.
00:17:06.180 That's right.
00:17:07.180 That's right.
00:17:07.760 And so I think what we have to do is think here about what a normal, healthy
00:17:12.020 childhood is.
00:17:13.120 And in human societies around the world, by the age of around seven,
00:17:19.200 kids are given responsibilities.
00:17:21.000 They're not being supervised closely by adults.
00:17:23.300 They can bring the cattle down to the river or whatever it is.
00:17:26.240 They can certainly walk to the store and buy a quart of milk.
00:17:28.980 Um, and that was true all the way up into the 1990s.
00:17:32.060 Kids had sort of normal human childhoods, but in the 1990s, in America in particular, we freaked
00:17:38.420 out about child abduction.
00:17:40.200 And this, I think, is partly the saturation of cable TV and full-time, you know, 24-hour news
00:17:45.040 stations that focus on these stories, for whatever reason, in the 1990s, just as the
00:17:49.140 crime wave was ending, really.
00:17:51.000 I mean, crime rates were way down.
00:17:52.880 Drunk driving was way down.
00:17:54.100 There'd never been a safer time to raise kids or let them out outside.
00:17:58.060 Um, just at that time, we decided it's too dangerous and we say, no, you can't go outside
00:18:01.860 unsupervised.
00:18:02.760 Uh, we also say after school, no, don't go out and play with your friends.
00:18:06.600 You have soccer practice or guitar practice or whatever it is.
00:18:10.260 Um, we basically took away childhood.
00:18:12.240 And when kids don't get to practice those skills, as you were saying,
00:18:16.480 you know, you say something, you make a mistake, you learn. Kids have to make thousands
00:18:21.040 of mistakes, and the consequences need to be very small so that they learn.
00:18:25.360 Imagine if you were trying to teach kids how to do the balance beam and they go out in the
00:18:29.580 balance beam, but every time they fall, they're going to fall 30 feet into a pit of alligators
00:18:33.740 or something.
00:18:34.100 Well, whatever, you know, they're going to get really, really hurt pretty quickly.
00:18:37.420 They just wouldn't go out on the balance beam.
00:18:38.780 They're not going to learn.
00:18:39.380 So social media has made the consequences of a mistake so high, you never know if
00:18:46.420 you could rise to even international infamy.
00:18:48.940 You know, we see clips of kids doing or saying stupid things.
00:18:52.480 So I think social media didn't take away childhood.
00:18:56.000 We sort of did that before social media came.
00:18:58.380 And then social media comes in when kids are heavily supervised.
00:19:01.600 The only place they can get away from adults is actually on, well, video games, which are not
00:19:05.480 as harmful, but also social media platforms.
00:19:08.240 And I think it completely distorts normal childhood learning, and it deprives them of
00:19:12.700 the repeated experiences of trying something and failing or succeeding that they need to
00:19:18.020 grow up.
00:19:19.060 Yeah, no, truly. I mean, when I have play dates at my house with
00:19:23.380 kids, I don't let them go on their phones.
00:19:25.060 I'm like, put your phones down.
00:19:26.160 You're not here for that.
00:19:27.980 And my kids aren't on social media, nor will they be until, you know, they're
00:19:32.320 old enough to overrule me.
00:19:33.620 But I just find it so dangerous, and, you know, I understand now it's
00:19:38.220 not necessarily the phone.
00:19:39.540 It's the social media.
00:19:40.860 That's the problem.
00:19:42.260 And I think if more parents understood that, they'd relax a little. You can still
00:19:46.760 get Johnny the phone.
00:19:48.240 Because what I hear from all my friends is, I need to pick them up.
00:19:51.380 You know, the other day he missed his ride and he called me, and I'm not willing to give
00:19:54.600 that up.
00:19:55.120 And you don't have to. Johnny can have the cell phone.
00:19:58.240 Johnny can play video games and he can text with his friends.
00:20:01.040 You know, there'll just be limits, I think, in terms of the time that
00:20:04.280 that phone is open and available. But Johnny does not need to be on Facebook and Instagram
00:20:09.580 and Snapchat and TikTok and YouTube and all these other forums that are, A, really potentially
00:20:15.300 dangerous,
00:20:15.840 but also, B, have really pernicious societal effects and effects on him.
00:20:21.480 I mean, and we'll get to the suicide rates and the depression rates and so on.
00:20:24.920 So it's like, this is not in Johnny's best interest and not in society's best interest.
00:20:28.920 Um, okay.
00:20:29.580 So, but the performance aspect, can you speak to that?
00:20:31.820 Because that's something I think we see a lot as grownups. Or "grownups":
00:20:35.680 you can tell I have kids; whenever you hear adults say "grownups," it means they have
00:20:38.460 children.
00:20:38.700 We see that as adults, but it's very present and in the face of teens.
00:20:48.380 That's right.
00:20:49.180 So if you wanted to train a seal to balance a ball on its nose, you would not explain the
00:20:56.140 thing to the seal.
00:20:57.240 You would not wait until the seal does it to give the seal a fish.
00:21:01.040 You reward any progress towards the behavior you want.
00:21:04.280 Same thing with training a dog.
00:21:05.380 Um, and this is called operant conditioning in behaviorism and psychology.
00:21:10.000 Um, operant conditioning is incredibly powerful.
00:21:12.380 And if you could reinforce your kids within three seconds of them doing a behavior, you
00:21:17.920 could get your kids making their beds every morning within a week.
00:21:21.320 Um, so operant conditioning is very, very powerful.
00:21:23.860 Now, what happened all of a sudden when kids got phones with touchscreens,
00:21:29.340 around 2009 to 2012? You know, the iPhone comes out in 2007, but it's expensive.
00:21:33.020 Very few kids have one.
00:21:35.620 Around 2012, 2013, that's when kids' lives switched from mostly not being on social media every
00:21:40.800 day to having a phone and having most experience come through the phone.
00:21:45.680 Um, and that phone is the most powerful operant conditioning machine ever invented.
00:21:50.960 What's the first word you're saying before conditioning?
00:21:53.460 Oh, operant.
00:21:54.740 There are two kinds of conditioning.
00:21:57.060 Pavlovian conditioning, which gets your autonomic nervous
00:22:02.060 system going, like Pavlov's dogs would salivate when they heard a bell. But the way you train
00:22:07.120 a circus animal is called operant conditioning.
00:22:08.980 You give them a small reward right away.
00:22:10.920 Very, very powerful.
00:22:12.280 Now, all of us who have kids, we try to get our kids to sit up at the table.
00:22:16.920 We try to get them to eat well.
00:22:18.100 We try to get them to write thank you notes.
00:22:19.480 We want to influence our kids.
00:22:21.220 But it's very hard, because we don't have a little shock box, like a shock
00:22:26.520 necklace around their neck.
00:22:27.360 We can't do operant conditioning to train them like a circus animal, but Facebook
00:22:32.480 and Instagram and Twitter and all these other platforms do.
00:22:35.260 And so what we've essentially done is we've given our kids an incredibly powerful operant
00:22:40.340 conditioning machine, and the people giving them the rewards are total strangers.
00:22:45.240 We've given over the training of our children to total strangers.
00:22:47.960 And this, I think is a big reason why Gen Z is in terrible shape.
00:22:53.260 It's not a gradual change from the millennials to Gen Z.
00:22:55.860 It's very, very sudden, right around birth year 1996. Let's say kids
00:23:00.920 born in 1998 are much worse off, much more fragile, much more depressed and anxious than,
00:23:06.340 say, kids born in 1993 to '94.
00:23:08.720 That's the end of the millennial generation.
00:23:11.040 And so part of it is, we've taken away childhood, and in its place
00:23:15.800 we've given them this training machine, which is sick, which is inhuman.
00:23:19.700 Oh my gosh.
00:23:21.400 And there's so much to delve into when it comes to the children and how they
00:23:24.800 use these platforms and what it does to them.
00:23:27.820 But just to take a quick step back to tribalism, really, because
00:23:33.960 you were talking about who really has the microphone.
00:23:36.300 When we go online, you know, and it affects our children and it affects us, who are we listening
00:23:40.760 to?
00:23:41.040 And we've heard, you know, I'm more center-right, but we talk about how the
00:23:47.620 left wing controls media and the left wing certainly controls most social media.
00:23:51.140 Um, and we'll see what happens with Elon, but at Twitter right now, it's a very leftist
00:23:55.460 site and it's dominated for the most part by leftists, but really just by partisans, by
00:24:00.740 hard partisans.
00:24:02.080 And that's indicative of most of social media, which people fail to factor in.
00:24:05.840 I always get a great reminder of this, John, when I go to visit my two close friends in
00:24:09.420 the Midwest, one lives in Chicago, one lives outside of Detroit, um, two girlfriends I met
00:24:14.580 many years ago and you go to the Midwest and sometimes you just get reminded, even though
00:24:18.440 I know it's happening there too, of like normal people, not like crazy New York hard partisans
00:24:25.560 where I spend most of my time.
00:24:27.380 Um, and you write about, um, what's the name of the study?
00:24:31.540 Oh, the Hidden Tribes study, that really actually proved this and proved who it
00:24:38.080 is we are listening to whether we realize it or not when we go on these platforms.
00:24:44.360 Yeah, that's right.
00:24:45.740 So the Hidden Tribes study, it was an outfit, uh, from the UK.
00:24:48.780 They do great research in Europe.
00:24:50.320 They did an amazing study here in America.
00:24:52.340 They, um, interviewed, I forget how many thousands of people, in 2017.
00:24:56.780 Uh, 8,000 Americans, 2017, this is from your article.
00:25:03.280 Um, and they found seven clusters of people who gave similar kinds of
00:25:08.160 answers to each other.
00:25:09.440 And it turns out that the one on the far right, they called the devoted conservatives.
00:25:14.040 Um, that's where, that's where you'd find Trump's hardcore support.
00:25:18.940 And they're very different psychologically from, uh, the traditional
00:25:23.220 conservatives, I think it is, who are more like the conservative
00:25:27.260 intellectual tradition, like Edmund Burke, Thomas Sowell.
00:25:30.260 Uh, they're, they're cautious, they're prudent.
00:25:32.260 They believe in the importance of structures and the far right group is more radical.
00:25:35.540 They kind of want to burn things down.
00:25:37.160 So there's a kind of a, they're not conservative.
00:25:39.720 They're certainly not liberal.
00:25:40.580 Um, so there is a group on the far right.
00:25:43.640 Now there's a corresponding group on the far left, uh, which is also not liberal.
00:25:48.500 Um, those are called the progressive activists.
00:25:51.520 Uh, and those are the ones who have taken, uh, equality of outcome as their central
00:25:56.920 good.
00:25:57.340 That's what's sacred to them.
00:25:58.740 Um, and it's easier to achieve equality by, um, tearing down the top than it is by pulling
00:26:03.300 up the bottom.
00:26:03.800 So they tend to focus, and this is true across eras, um, radical egalitarian movements tend to
00:26:08.920 focus on pulling down the top.
00:26:10.880 So these two groups, in my opinion, of course there are reasons for
00:26:15.660 them having the views that they have, but the net effect on democracy, if they're
00:26:18.980 powerful, is they want to pull things down.
00:26:21.800 They want to destroy.
00:26:22.720 They're not builders.
00:26:23.420 They're critics.
00:26:23.940 Now you need critics on each side.
00:26:25.640 You do need critics.
00:26:26.860 Uh, I'm all about viewpoint diversity, but what social media did was it took, let's say,
00:26:30.800 the Republican coalition that Ronald Reagan had built, in which you
00:26:34.580 had the business conservatives, the Christian conservatives.
00:26:37.140 And there were always people who are more prone, well, psychologically more prone to
00:26:40.960 authoritarianism.
00:26:41.860 They were all part of a group, but the far right group didn't have as much influence.
00:26:45.740 And all of a sudden, with social media, they have much more influence and they're able to intimidate
00:26:49.740 and basically push out a lot of the moderates on the left.
00:26:52.780 As you say, you know, the media is on the left; what I argued in my
00:26:58.060 Atlantic essay is that the Democratic Party still has a healthy debate between the far
00:27:02.620 left and the more moderate left.
00:27:04.040 The party itself has not lost that ability to debate.
00:27:07.560 The problem on the left, I believe, um, is that the left largely controls the epistemic
00:27:13.140 institutions.
00:27:13.880 That is, the institutions that generate knowledge.
00:27:16.120 So universities, uh, journalism, media, the museums and the arts, a lot of areas.
00:27:22.260 And when you get homogeneity and social media, something happens, which is the extremes now
00:27:27.540 have power to really intimidate the moderates.
00:27:29.280 And so they go quiet.
00:27:30.580 And so what we're left with is, rather than just having a country where most of us are
00:27:34.340 fairly moderate and reasonable, with extremes at the edges,
00:27:37.620 now the extremes are so powerful that the rest of us just really go quiet.
00:27:41.460 And that's when you visit your friends in the Midwest, they're not out there tweeting
00:27:44.600 their outrage about this and that.
00:27:47.180 Exactly right.
00:27:47.720 Never.
00:27:48.200 I mean, there was an interesting David Brooks piece in the times.
00:27:50.380 I don't know if you saw this recently talking about how he thinks there really is still
00:27:54.980 room for liberals who aren't pro cancel culture and pro demonization of the other side.
00:28:01.620 And, you know, he often wonders whether there's still room, but he thinks there
00:28:06.280 is, uh, for said people.
00:28:08.520 But I mean, I know this firsthand from, from being somebody who's, you know, more right
00:28:12.920 leaning, but has immersed herself for the past 20, 30 years in very left leaning communities
00:28:18.040 only, you know, like I live with them.
00:28:20.420 I know them.
00:28:21.140 They're my friends or my neighbors, but you know, they're lovely.
00:28:23.180 They're not hard partisans for the most part, and they don't want to mess with anybody else's
00:28:27.460 life, you know, but you go online and you see a very different version and it leads to
00:28:32.200 hate and more tribalism and intolerance on your own part, you know, and it's
00:28:37.760 something you have to actively work to fight against.
00:28:41.160 And I do think if I lived in a more red community, I'd be more subject to that narrative because
00:28:48.120 I wouldn't be surrounded all the time by people who are liberal and who are absolutely
00:28:53.080 lovely and don't subscribe to any of this nonsense.
00:28:56.860 Yeah, no, I think that's right.
00:28:58.780 I'm very cautious about using the word liberal because in America, we use the word liberal
00:29:02.800 to mean left and it shouldn't mean left.
00:29:04.680 We can talk about progressives and conservatives.
00:29:06.400 We can talk about the far left and the far right.
00:29:09.240 I reserve the word liberal to a person who believes in the liberal tradition.
00:29:12.300 That is freedom of speech, freedom of association, freedom of religion, economic freedom.
00:29:17.360 We don't want to be telling people how to speak, how to dress, how to live their lives.
00:29:20.660 We want to make room for people.
00:29:22.360 That's the liberal tradition.
00:29:23.600 And in Europe, they speak about right liberals and left liberals.
00:29:26.600 So the problem, I think, is that our left is no longer liberal.
00:29:30.640 Our right is no longer liberal or conservative.
00:29:34.280 But to your point, depending on where you are, if you're in a partisan community,
00:29:38.620 you are just deluged with evidence that the other side is horrible.
00:29:42.760 And you'll see videos of people saying horrible, horrible things.
00:29:46.320 And people on your side can list all the sins of the other side, but you tend to have no
00:29:51.120 idea that an equally strong case is being made on the other side where they are deluged
00:29:56.660 by horrible things that your co-partisans have said.
00:29:59.500 So, you know, why do you complain about the speck in your neighbor's
00:30:03.300 eye when you cannot see the plank in your own?
00:30:04.960 And it continues, Jesus says, first, take the plank out of your own eye.
00:30:08.940 First, look at yourself, look at your side.
00:30:10.960 Only once you understand that problem.
00:30:13.160 Now let's talk about the other team.
00:30:14.960 And that's one of the things Tristan Harris was talking about, about how the social media
00:30:20.000 companies have made it almost impossible.
00:30:21.780 If you're relying on them, on your Facebook feed, on your Twitter feed for actual information
00:30:27.240 about what's happening in America, good luck, because your feed has been totally manipulated
00:30:30.680 to feed you only the things that will outrage you.
00:30:33.400 For the most part, you write about this as well.
00:30:35.280 They want to make you upset.
00:30:36.400 They want to make you angry.
00:30:37.520 Cable news, same.
00:30:38.520 And they are not in the business of delivering to you the truth, what is actually happening,
00:30:45.560 a broader picture of America and the news.
00:30:47.800 They're trying to upset you.
00:30:50.160 All right.
00:30:50.320 Let me pause it there, John, and squeeze in a quick, quick break.
00:30:52.840 That's a good place to come back on.
00:30:54.560 John Haidt, the one and only right after this.
00:30:57.260 It's not news now that they're trying to upset us on social media.
00:31:05.960 I think most people have heard that at least once or twice.
00:31:09.440 But you sort of link it to you've got the development of the ability to like something
00:31:16.180 on Facebook and retweet something on Twitter and then share something on Facebook.
00:31:21.860 And all of this sort of builds to the place where you write the newly tweaked platforms
00:31:26.880 were almost perfectly designed to bring out our most moralistic and least reflective selves.
00:31:35.700 And the volume of outrage is shocking.
00:31:38.920 It's been shocking.
00:31:40.200 Yes.
00:31:40.720 So if we go back to when these platforms were new, there was MySpace and Friendster and Facebook
00:31:47.100 around 2003, 2004.
00:31:48.940 They were just like, you know, bulletin boards.
00:31:51.640 I've got my bulletin board here.
00:31:53.200 Look at these photos.
00:31:54.100 Here's who I am.
00:31:54.880 And I can link to yours.
00:31:56.420 Totally not toxic.
00:31:58.440 The early technology was actually, and I hope people will remember this, like in the 90s,
00:32:02.840 the early internet, it was so exciting.
00:32:05.840 And, you know, there were pockets of toxicity, but there was a real positivity around a lot of
00:32:10.980 it.
00:32:11.120 We were exploring this new space.
00:32:13.460 So from 2003 or 2004 to about 2009, social media is not particularly toxic.
00:32:18.860 It's not particularly bad for democracy.
00:32:21.640 And then what happens in 2009 is Facebook innovates.
00:32:26.360 They say, you know, because they're all about engagement.
00:32:29.540 They want to target advertisements to people.
00:32:31.660 So they give you a like button, a button you can click to say, I like this.
00:32:35.220 And that way you're generating a lot of data for them about what you like.
00:32:38.640 And then they develop algorithms that can maximize the degree to which you're going to
00:32:42.560 like something.
00:32:43.520 Now there's not necessarily anything nefarious about this.
00:32:46.600 What's wrong with giving you more of what you like?
00:32:48.260 But the net effect is that it's much more addictive because now you're getting reinforced
00:32:51.940 more often.
00:32:52.500 And you're strategically liking things, which is kind of performing.
00:32:57.360 You're doing a little performance.
00:32:58.720 Like you think, should I like this?
00:32:59.900 Should I not like this?
00:33:01.200 Even more damaging, I'd say, is the retweet button that Twitter develops.
00:33:06.540 And then Facebook copies it, the share button.
00:33:08.280 So Twitter develops the retweet button.
00:33:12.060 You click on something, not just to say, I like it, but you can now forward it to, let's
00:33:15.760 say you have 500 followers.
00:33:17.040 You forward it to your 500 followers.
00:33:19.000 And if it's really outrageous, they might forward it to each of their 500 followers.
00:33:23.220 And very quickly, you can get to millions and millions of people.
00:33:26.040 So 2009 is the year that social media changes radically.
00:33:30.220 Before 2009, you couldn't go viral very easily.
00:33:33.660 But by 2010, you can.
00:33:35.280 And now the game is on, who can win, who can go viral?
00:33:39.500 So now the platforms are shaping us.
00:33:41.680 We're all learning, what should I say?
00:33:44.460 How should I say it to maximally increase my clicks and my likes, my retweets, my follower
00:33:50.040 count?
00:33:51.260 So it's really then that it becomes much more hyper-viralized.
00:33:56.960 We all understand from COVID what happens when a virus is much more transmissible.
00:34:02.060 Boy, it can spread very quickly.
00:34:03.040 So that's what I argue in the piece, those small architectural changes.
00:34:07.660 That's why everything went haywire in the 2010s.
00:34:10.460 It wasn't like this.
00:34:11.560 It wasn't like this in 2008, 2009.
00:34:14.660 There were still much greater possibilities of bipartisanship, of people working together
00:34:19.260 in Washington.
00:34:20.420 We didn't feel like we were all attacking each other all the time.
00:34:22.740 Mm-hmm.
00:34:23.720 It's true.
00:34:24.900 So you say really between 2011 and 2015 was the apex of the problem.
00:34:30.700 And I think that's interesting because I think a lot of people, when they look at the collapse
00:34:33.520 of, or the, I don't want to say our society has collapsed, but it certainly isn't in good
00:34:37.180 health.
00:34:37.600 It's on the road.
00:34:38.520 Yeah.
00:34:38.760 Yeah.
00:34:39.100 It's on the road.
00:34:39.800 And I think a lot of people look back and especially on the left and they say, it was
00:34:43.960 Trump, Trump.
00:34:45.600 And I did a long documentary with PBS a couple of years ago where they actually took an open
00:34:51.380 minded look at Trump.
00:34:53.240 And I made the case that there were a lot of very divisive things that happened during
00:34:56.660 the Obama presidency that set us on the road to, you know, this divisiveness we're feeling
00:35:00.900 now and tribalism.
00:35:02.060 But it didn't look at social media.
00:35:03.740 It was just about politics.
00:35:05.800 This is a much more persuasive case.
00:35:07.460 It's society wide.
00:35:08.720 There are a lot of people in the country who are not political at all, who are on their
00:35:12.500 phones, as you point out, 50 to a hundred times a day.
00:35:15.020 So this is a much bigger elephant in the room.
00:35:18.560 And you make the case that it was between 2011 and 2015.
00:35:22.240 So pre-Trump, things really reached the height of awfulness, or the pernicious effects
00:35:29.120 really sort of reached, I don't know, their apex, and then they kind of stayed
00:35:32.260 there.
00:35:33.020 But explain why you picked those dates and what you mean by picking those four
00:35:37.320 years.
00:35:38.520 So on campus, it was 2014 when this stuff first started.
00:35:42.140 Anyone who graduated from college in 2012 didn't see the speech-is-violence idea, the cancel
00:35:46.040 culture, the fear of speaking up.
00:35:48.500 And in 2015, so Greg and I wrote our article, The Coddling of the American Mind, that comes
00:35:53.880 out in August of 2015.
00:35:55.000 And then at Halloween 2015, there's the Christakis affair at Yale, with the students protesting
00:36:00.120 about Halloween costume guidance.
00:36:01.440 So things blow up on campus in 2015, really, 2014, 2015.
00:36:05.840 And only recently have I discovered how many other things were happening around 2014, because
00:36:10.600 it's only then that we can get global cancellations.
00:36:13.300 So many listeners will remember the story of Justine Sacco, a woman who was flying to South
00:36:18.960 Africa.
00:36:19.500 She tweeted a joke in somewhat poor taste, but it wasn't racist.
00:36:22.220 It was actually a joke about white privilege.
00:36:23.520 So she tweets this joke, gets on her plane, lands in South Africa, and there's a global
00:36:28.480 outrage around her, and she's fired the next day.
00:36:30.880 So that happened in December of 2013.
00:36:33.160 That could not have happened in 2008.
00:36:35.180 There was no way to have a global mob around this woman in 2008.
00:36:41.440 So that's December 2013.
00:36:43.720 In early 2014, Brendan Eich is promoted to CEO of Mozilla.
00:36:49.060 And someone discovers that he gave $1,000 to support a California ballot proposition
00:36:53.840 opposing gay marriage.
00:36:57.060 And so he is fired or let go within two weeks.
00:37:00.820 This sort of global mob, the ability to create a Twitter mob, wasn't there before the retweet
00:37:05.540 button.
00:37:06.340 And so we get a lot of things happening 2014, 2015.
00:37:10.800 There's a new culture of outrage.
00:37:12.400 And so the argument that I made in the paper, in the Atlantic essay, is that it's as though
00:37:17.720 with these hyper-viralized platforms, it's as though they gave out a billion dart guns.
00:37:22.860 Everyone in the world gets a dart gun with unlimited darts, and you get to shoot anyone
00:37:27.600 you want.
00:37:28.060 It's not going to kill them, but it'll shame them.
00:37:32.380 It'll hurt them emotionally, relationally.
00:37:32.380 It'll hurt their reputation.
00:37:34.060 And the net effect of everyone having a dart gun is most people don't want to shoot anyone.
00:37:38.340 That's the middle 80% of the country.
00:37:39.660 They just put them down.
00:37:40.400 I don't want to attack anyone.
00:37:41.320 But the people on the extremes are psyched about being able to shoot not just their enemies
00:37:47.120 across the aisle, but the moderates on their own side.
00:37:50.960 And so we now have a communication environment, which is much more characterized by walking
00:37:56.740 on eggshells.
00:37:57.420 That's the phrase everyone uses.
00:37:58.740 I feel like I'm walking on eggshells.
00:38:00.980 And if you're walking on eggshells, you can't trust the people around you.
00:38:04.160 You have to self-censor.
00:38:05.680 You can't be creative.
00:38:06.820 You can't be funny.
00:38:07.940 There used to be a lot of humor in the academy.
00:38:09.680 And it was, you know, students love to laugh.
00:38:11.660 Professors love to tell jokes.
00:38:13.200 But all of that goes away when we're all afraid that if anyone takes offense, they've
00:38:17.580 got a dart gun.
00:38:18.300 They've got a website to report me.
00:38:20.640 So that's why I argue that this isn't just about polarization left, right.
00:38:25.680 This is about we're afraid of the person next to us or the person, the unknown person on a
00:38:30.080 social media platform.
00:38:30.800 And that is new.
00:38:31.620 That really comes to fruition in 2014.
00:38:35.040 My God.
00:38:35.500 And it's so very much still happening.
00:38:38.520 I mean, I saw you tweeted something on this.
00:38:42.460 I think you retweeted a Jonathan Turley article.
00:38:44.840 But I tweeted about this just recently today as well about what's happening to this Princeton
00:38:49.760 professor.
00:38:50.860 It's horrible.
00:38:52.620 It's horrible.
00:38:53.500 There's a Princeton professor they're now pushing to fire, Joshua
00:38:58.580 Katz.
00:38:59.140 He's a classics professor, which you're already not allowed to be because you study a bunch
00:39:02.460 of old white guys.
00:39:03.760 And that's his first sin.
00:39:06.840 And now the president of Princeton is calling on the university to fire this tenured professor.
00:39:13.360 Why?
00:39:13.680 What did the guy do?
00:39:14.600 Is it what was his hideous sin?
00:39:16.340 You know what he did in 2006?
00:39:19.160 He had a consensual affair with a student for which he's already been punished.
00:39:24.000 They already turfed him off campus for a year where he was suspended for a year.
00:39:28.420 He paid the penalty.
00:39:31.060 You know, I mean, and now in a court of law, double jeopardy would have attached.
00:39:35.000 You can't go back and relitigate it.
00:39:37.160 But they are.
00:39:38.420 They're trying to get back into it and open back up the case.
00:39:41.580 Why?
00:39:42.240 Because he wrote an article critical of the reforms some of the black faculty members
00:39:48.160 are asking for at Princeton, like more sabbatical than their white colleagues.
00:39:54.720 And I think additional pay.
00:39:57.560 I'm trying to look for what they were.
00:39:58.940 But I mean, it was obviously, like, favoritism based on race.
00:40:02.920 He said he objected to faculty of color receiving special course relief and summer salary and
00:40:09.060 an extra semester of sabbatical and criticized, quote, extra perks for no reason other than
00:40:14.860 pigmentation.
00:40:16.060 And he also criticized something called the Black Justice League, which was active for
00:40:21.900 two years on campus as he called it a local terrorist organization that made life miserable
00:40:26.700 for many, including many black students who didn't agree with its members demands.
00:40:30.560 And it seems very clear that because of that letter, they're now trying to boot him
00:40:35.720 right out of Princeton.
00:40:37.000 We covered what happened to Roland Fryer, something very similar at Harvard, which is still ongoing.
00:40:43.040 These are professional assassinations.
00:40:45.400 So they have good reason to fear the dart.
00:40:48.260 Yeah, that's right.
00:40:49.480 So, I mean, you know, the case is complicated and I don't know the details.
00:40:53.280 But I think what is clear is that whatever behaviors he did on campus, he was
00:40:58.320 investigated for, he did have affairs with students over time.
00:41:02.440 So I can't comment on any of that.
00:41:04.060 I don't know.
00:41:05.100 But what was clear to me was that the investigation was restarted and pushed to the point of firing
00:41:13.420 him only because he wrote an article in Quillette that offended many on campus.
00:41:19.100 And I think we see the same dynamics here that we saw in the Dorian Abbot case.
00:41:23.420 So Dorian Abbot, a physicist at the University of Chicago, he wrote an essay in Newsweek
00:41:28.720 criticizing race-based affirmative action, criticizing certain aspects of DEI, agreeing with certain
00:41:34.180 goals, but being critical of the main thrust of DEI programs.
00:41:39.780 Now, that's part of what a democracy is.
00:41:42.180 That's certainly something that a professor should be able to write about.
00:41:45.460 And he was invited to give a very prestigious lecture at MIT.
00:41:50.040 And some, I don't know if it was students or administrators, were upset that we're going
00:41:53.780 to give an honor to this guy who wrote this article that offends us.
00:41:58.460 And so they put pressure on the MIT administration to uninvite him because of an article he wrote
00:42:04.700 in Newsweek.
00:42:05.660 And so I think it's the same sort of thing here.
00:42:07.600 It's not that Princeton wants to do this, as far as I know.
00:42:09.680 Again, I don't know the details of a complicated case.
00:42:11.720 But I'm pretty confident that pressure is being put on the administration by students, I suppose,
00:42:18.720 primarily.
00:42:19.620 Pressure is being put on, and we've seen this over and over again: when pressure is put on
00:42:23.180 by pressure groups,
00:42:24.640 the administration almost always caves.
00:42:26.940 And they use the same kind of language.
00:42:28.880 So that's why I think the principle here is, if you say something that offends, if you say
00:42:34.920 something that violates the sacred values of a powerful group on campus, then if they can't
00:42:40.080 get you for what you said, they do what's called grievance archaeology.
00:42:43.980 That is, you dig up everything.
00:42:45.280 You go back through all their tweets.
00:42:46.640 You find something that you can get them for.
00:42:49.680 And that is illiberal.
00:42:51.760 That's part of due process: there's got to be a process to adjudicate bad behavior.
00:42:56.560 We follow the process.
00:42:57.580 If you're found guilty, you receive your punishment, and then that's it.
00:43:01.320 We're done.
00:43:02.280 And so, again, I don't know the details, but it seems as though Princeton is behaving this
00:43:08.840 way because of internal pressure groups rather than, I mean, I suppose if they could go back
00:43:13.500 and investigate every single professor, they would find a lot of malfeasance.
00:43:16.980 I hope somebody goes back right now and investigates the president of the university.
00:43:20.420 This is me speaking.
00:43:22.100 Christopher Eisgruber, who is the one calling on the university board to fire Professor Katz.
00:43:27.600 Go back.
00:43:28.060 I'm urging the online people to go back.
00:43:29.840 Go do an investigation on him.
00:43:31.460 I guarantee you he's had some affair with some student or he's crossed some ethical line.
00:43:35.480 These people are not pure.
00:43:37.000 These are judges.
00:43:38.440 None of us are.
00:43:38.880 They're not pure.
00:43:39.800 It's right.
00:43:40.200 No one is.
00:43:40.800 That's what you learn in Catholicism.
00:43:42.080 That's one benefit I had growing up.
00:43:43.800 You know, you have to be quick to forgive.
00:43:45.760 You know, who among us hasn't sinned?
00:43:47.300 Slow to judge, quick to forgive.
00:43:48.540 That's right.
00:43:48.800 Yes.
00:43:49.360 And so this is, you want to play this game?
00:43:51.940 Let's play.
00:43:53.380 Let's go.
00:43:54.380 Look, I'm in.
00:43:55.520 If these are the rules we have to play by, then let's play by them.
00:43:58.480 Because these people like woke CEOs and woke college presidents need to be taught a lesson
00:44:03.600 that you come for me and I will sic the anti-woke mob on you.
00:44:08.500 You did not lead a perfect life.
00:44:10.400 It's the only terms they're going to understand.
00:44:12.140 I can certainly understand the need for counterforce.
00:44:16.840 I can certainly understand turning the tables, hoist with your own petard, all of that.
00:44:20.740 But I've been studying the culture war for a long time.
00:44:23.160 I study political polarization.
00:44:25.260 And, you know, in a military war, you can apply such force that you literally kill your
00:44:30.060 enemy and can take their territory.
00:44:32.320 But in a culture war, you can't do that.
00:44:34.320 In a culture war, the harder you attack your enemy, the stronger he gets.
00:44:37.740 The only thing you can do is either give them more ammunition, by
00:44:42.280 giving more anecdotes, more terrible things that people on your side have said, or
00:44:47.560 give them less ammunition.
00:44:49.680 I'm not sure that this is a strategy with Disney and DeSantis, right?
00:44:52.920 Disney's lost billions of dollars in stock value.
00:45:00.120 All corporations now, they're not coming out and commenting on abortion suddenly.
00:45:00.120 And Disney's gone quiet on its woke agenda for the past few weeks.
00:45:04.540 Why?
00:45:05.180 Because they took a terrible hit.
00:45:07.740 For speaking out about this.
00:45:09.280 And they got in a war that didn't really serve them well down in Florida.
00:45:12.340 I don't agree with you on this point.
00:45:14.500 I mean, I know you're brilliant.
00:45:15.400 You've studied this way more than I have, but I've lived it and I've been through it
00:45:17.820 personally.
00:45:18.160 And I can tell you fighting back works.
00:45:21.480 OK, let's talk about it, because I really am not sure I'm right here.
00:45:24.220 In fact, I think what I have here is just an insight, not an overall strategy.
00:47:30.660 And you're right that Disney was under pressure, let's say, from left wing
00:45:35.260 groups.
00:45:35.540 And so they moved to the left and then it's only if there's counterpressure that they'll
00:45:39.220 move back.
00:45:39.760 So there is a need for counterforce.
00:45:41.380 I think it was a quote from Margaret Thatcher during the Balkan War.
00:45:44.580 I can't find it, but I remember hearing it when I was young.
00:45:46.960 The problem is not whether we should use force in Bosnia.
00:45:50.560 The problem is that force is being used by one side overwhelmingly every day.
00:45:54.780 The question is, will there be counterforce?
00:45:56.820 That is the question of arming the Bosnians against the Serbs.
00:45:59.280 So by analogy here, I do understand the need for pushback.
00:46:01.720 But here's what really alarms me.
00:46:03.460 Each time we have an escalation in the culture war, a new strategy is brought in, which is
00:46:08.320 an escalation of weaponry.
00:46:09.900 And so what we're now seeing, which really alarms me, is things like the Texas abortion law.
00:46:13.960 And I think the Florida, you know, the law we're talking about here, I believe they have
00:46:18.860 provisions by which anyone can bring a lawsuit against a citizen, a teacher, a doctor.
00:46:25.120 This is a huge escalation in the culture war.
00:46:27.320 This means that now people doing their job have to worry that they could suddenly be sued
00:46:33.060 by a bunch of activists on the other side, even if they've done nothing wrong.
00:46:37.200 That's certainly true in the case of the Texas law.
00:46:40.440 I don't know about the Florida one.
00:46:41.440 I'm going to have to check.
00:46:42.080 I don't remember seeing personal liability.
00:46:44.340 Provision in that way.
00:46:45.060 But let me stand you by.
00:46:45.920 I'll do a quick break.
00:46:46.960 And then I want to talk to you about this.
00:46:48.180 This is an interesting strategy, and what serves the community and what doesn't.
00:46:52.780 And then we've got to talk about Trump, because I think your observations on how
00:46:55.880 he was the first one to understand the Tower of Babel had fallen and how to work that
00:47:01.040 is very, very fascinating.
00:47:03.420 OK, stand by.
00:47:04.400 John's staying with us.
00:47:05.280 And don't forget, folks, you can find The Megyn Kelly Show live on Sirius XM Triumph Channel
00:47:08.940 111 every weekday at noon east, the full video show and clips by subscribing to
00:47:12.440 our YouTube channel, YouTube dot com slash Megyn Kelly.
00:47:15.220 So, yeah, this has been a debate on the right in particular for a long time now,
00:47:22.100 since cancel culture really sort of took over: how to fight back, you know, what to do.
00:47:25.860 And I think for a long time that the effort was to say, stop it, stop it.
00:47:30.280 And on the right, too.
00:47:31.560 That's what the Harper's letter was about, right, saying just stop it.
00:47:34.800 That was mostly liberals saying, yeah, we hate Trump.
00:47:38.440 Yeah.
00:47:38.840 We don't like Trump.
00:47:40.080 This isn't about loving Trump.
00:47:41.540 But this is about being truly liberal in the genuine sense of that word and allowing
00:47:46.580 free speech, a principle you've been fighting for your entire academic career.
00:47:50.280 And it's not working.
00:47:51.780 So I think that that has led a lot of folks in this battle, left or right, to say, all
00:47:58.660 right, what would work?
00:48:00.120 And I've come around to the belief of cancel them, cancel them all, cancel Chrissy Teigen,
00:48:06.060 cancel all of them.
00:48:07.860 Now, I realize, depending on the institution that they're at, that may not be realistic.
00:48:12.380 Joy Reid should have been canceled a long, long time ago, given her views.
00:48:15.820 But she won't be.
00:48:17.160 But at least you can get some skin in the game.
00:48:19.620 I guarantee you this Princeton president has got something in his past he doesn't want
00:48:23.920 coming out.
00:48:24.500 And the only thing preventing it from coming out is journalists and people who are
00:48:30.660 anti-cancel-culture taking out their own microscopes and magnifying glasses and taking a hard look.
00:48:37.060 So the more you punish people for being this cruel, the less likely they are to dip a toe
00:48:45.660 into those waters.
00:48:46.520 OK, so during the break, during that nine-minute break, I thought about what it is.
00:48:52.360 I appreciate you pushing back on me.
00:48:54.400 But I was saying, wait, you know, this is just going to escalate things.
00:48:57.240 And I realized what my concern is.
00:49:00.140 So I've been studying polarization since about 2004.
00:49:03.860 And the path we're on is towards catastrophic failure as a country.
00:49:07.740 If we don't change what we're doing, the American experiment set up by our founding fathers, who
00:49:13.500 understood our liability to faction, to factionalism, to fighting, it wasn't clear that this experiment
00:49:19.180 would work.
00:49:19.960 Now, it's worked very well on and off, extremely well.
00:49:23.480 And now it's not working.
00:49:25.020 We are headed.
00:49:26.500 If we keep going this way, we are headed towards catastrophic failure.
00:49:29.480 And so if you think about it, this is like a fight taking place on a ship, and the ship
00:49:36.080 has been damaged.
00:49:37.260 There's water coming into the hull.
00:49:39.060 The crew is divided into the red team and the blue team.
00:49:42.120 And the red team and blue team are fighting.
00:49:44.260 And what you're saying is, look, the blue team has been really unfair.
00:49:47.600 It's time that the red team fought back using blue team's methods.
00:49:50.720 And there's a certain wisdom in that.
00:49:52.420 But we can see where this is going to go.
00:49:54.120 There's no way either side can win.
00:49:55.420 So we just keep doing this until in five or 10 years, the ship sinks.
00:49:59.300 That's what I'm trying to avoid.
00:50:00.920 And that's why the last quarter of my article was not about how one side can win, because
00:50:05.420 neither side can win.
00:50:06.440 The last quarter was, what are the structural changes?
00:50:09.620 What are the changes in Congress, changes to how we do elections, changes to social media?
00:50:15.180 What are the changes we could do that would return us just to levels of hatred we had back
00:50:19.260 in maybe 2008?
00:50:20.760 If we could go back to those levels of hatred, then I think we can make it as a country.
00:50:24.380 But on our current path, we're in huge trouble.
00:50:28.880 So what I'm saying is, we need to think about structural reforms so that we can actually
00:50:33.320 sometimes, in some places, talk to each other and even work together sometimes.
00:50:37.560 Fair enough.
00:50:38.220 I get that.
00:50:38.780 But I see it more like this, and I understand your point, that it's holes in the boat either
00:50:43.360 way you do it.
00:50:43.940 And that means the boat sinks.
00:50:45.220 I see it more like there's generals up on the left and there's generals up on the right
00:50:49.000 at the front of the boat, admirals, I guess, sailing it.
00:50:52.200 And the left keeps taking out only one side's admirals.
00:50:57.000 They just keep continuing to take down admiral after admiral after admiral who they think
00:51:01.940 are on the opposite side.
00:51:02.920 And the red team keeps looking at them saying, please stop doing that.
00:51:06.180 That's not good for the boat.
00:51:07.480 We're not going to make it if you get rid of all this brain trust up at the front.
00:51:11.180 And the left just has the middle finger up.
00:51:14.180 And finally, the right says, we're going to hurt some of your generals or your admirals.
00:51:17.560 They're going, too.
00:51:18.660 And finally, the hope is that the offensive side will realize, well, this is stupid.
00:51:23.740 We can't survive without our leaders and stop the firing.
00:51:26.080 And you think that's going to happen?
00:51:27.640 Well, it's a shift in tactics.
00:51:29.500 It's worth trying.
00:51:30.760 Reaching out across the aisle and saying, please don't.
00:51:33.780 I mean, look, what difference did the Harper's letter make?
00:51:36.380 Nothing.
00:51:36.940 They were mocked.
00:51:37.540 Everybody on that was like, and I love Thomas Chatterton Williams, and I thought
00:51:41.120 it was a noble idea.
00:51:42.480 It's just you can't reason with these people.
00:51:46.060 OK, but there's no point at which the other side, whichever side it is, is going
00:51:52.700 to realize, oops, we're wrong.
00:51:54.660 We need to be more realistic.
00:51:55.860 Like, that's just not going to happen.
00:51:58.340 Why can't we destroy them?
00:51:59.920 Why can't we make them afraid?
00:52:01.340 Why can't we make them have skin in the game?
00:52:03.340 Like, no, no, I just mean, think about it.
00:52:06.560 This president of Princeton, if he really thought that I was going to devote the resources
00:52:10.000 of my entire team over the next month to digging up dirt on him, which I 100 percent could and
00:52:14.440 maybe will do, I think he'd be a little scared.
00:52:17.780 I think he would.
00:52:18.780 And I think if I found something, if I found some young woman he slept with, this is made
00:52:22.460 up.
00:52:22.660 I don't have any information that happened.
00:52:25.200 He'd be scared shitless that I was going to turn around.
00:52:27.400 I was going to do it to him.
00:52:28.180 And then maybe the next university president would hesitate a little before they decided
00:52:32.640 to use a human failing for which someone has already been held accountable against him
00:52:37.900 to punish him for his divergent viewpoint on a separate matter.
00:52:42.280 Oh, OK.
00:52:43.240 You're talking about a kind of counterforce which would have an effect.
00:52:46.700 Here's a better way to do counterforce.
00:52:48.360 In fact, there's an organization co-founded originally by some Princeton professors called
00:52:53.720 the Academic Freedom Alliance, AFA.
00:52:57.820 And what they do is in cases where, in fact, Jonathan Katz and Robbie George are founding
00:53:05.260 members of this.
00:53:06.000 I'm one of the founding members as well.
00:53:08.160 What the Academic Freedom Alliance does is it sends a letter to universities saying, if
00:53:12.480 you go ahead with this, we've got lawyers.
00:53:14.400 And FIRE, the Foundation for Individual Rights in Education, also does similar work.
00:53:18.740 So that kind of pressure, I think, is a positive way of doing something which
00:53:23.800 is within professional norms.
00:53:27.000 I think the politics of personal destruction, I do understand if they do it, then why can't
00:53:31.820 we do it?
00:53:32.460 I understand that.
00:53:34.220 But I think that way lies just continual escalation and the death of our country.
00:53:38.960 So I'm doing everything I can to think of how do we change the venue?
00:53:42.460 How do we change the incentives?
00:53:43.600 How do we change things so that people don't do this on either side?
00:53:46.660 That's the challenge, I think, that we have as a country for the next 10 years.
00:53:50.120 Well, and you do have real proposals.
00:53:52.760 I mean, I should say that to the audience.
00:53:54.040 You actually have, very condensed, a three-point plan that might at least help.
00:53:59.920 I guess I am feeling right now, just given being in the news and covering this so much,
00:54:04.100 it's just the constant indignity of what they do to people.
00:54:07.020 Like one of those Braveheart warriors, you know, with the face paint and like no underpants
00:54:11.460 and running and killing.
00:54:16.620 Yeah.
00:54:17.400 No, that's right.
00:54:18.160 Look, when you're repeatedly attacked and viciously attacked, yeah, you're not gonna, you know,
00:54:23.540 the strategy is not, oh, you know, let's make peace.
00:54:26.440 No.
00:54:26.860 But I'm trying to break us out of the binary.
00:54:27.780 Look at it this way.
00:54:31.660 Like, we used to talk in various venues.
00:54:35.640 We used to communicate in various ways.
00:54:37.840 Members of Congress would talk to each other.
00:54:39.860 Now, if you change the venue, you put in cameras.
00:54:42.800 Now, they're not talking to each other.
00:54:44.340 They're talking at the camera.
00:54:45.940 And in the same way, people used to talk to each other.
00:54:49.160 They could call on the telephone.
00:54:50.540 They could talk at work.
00:54:52.320 But with social media, what it's done is it said, here, why don't you guys fight it out
00:54:55.940 in the comments under a tweet or something?
00:54:58.000 And it's almost as if they said, hey, we're a venue, we're a platform for people to talk
00:55:04.320 to each other in the middle of the Roman Colosseum.
00:55:06.920 All conversations are going to take place with an audience that wants blood.
00:55:10.920 The audience is cheering for blood.
00:55:13.180 And so when Facebook developed threaded comments, this was 2013, they said, it's not enough that,
00:55:20.120 you know, President Obama posts something and people can yell and scream at him in the
00:55:24.700 comments.
00:55:25.500 That's not enough.
00:55:26.240 We want people to yell and scream at each other in the comments.
00:55:30.200 And so anybody types anything, you can now respond to them and people can respond to you.
00:55:34.700 And it's endless fighting.
00:55:36.480 Why?
00:55:37.140 Why do we have this?
00:55:38.400 Well, I understand Facebook wanted to increase engagement and it worked.
00:55:42.660 What I'm saying is as long as our entire environment pushes us to fight with each other and be mad
00:55:47.940 at each other and be drowning in outrage stories, there is no way out of this.
00:55:52.260 We have to find a way to break this dynamic to get out of the Roman Colosseum.
00:55:56.240 Mm hmm.
00:55:57.720 No, I mean, it makes perfect sense.
00:55:59.600 Do you stand by that understanding, though? Because you're in academia,
00:56:03.700 so your world has been completely saturated with this. I mean, it's up to the gills.
00:56:08.920 Not every industry is quite that bad.
00:56:11.040 And you make the great case that sort of the loudest members
00:56:16.320 of cancel culture have a disproportionate voice.
00:56:19.720 But I think the vast majority, as you also say, of the country is not with them.
00:56:23.720 They don't really put political differences aside.
00:56:26.400 I think the people who are pro cancel culture are a small minority with a very big voice.
00:56:32.660 So why can't we destroy them?
00:56:34.280 Why can't the society survive if we just destroy them?
00:56:38.140 Because the rest of us, the vast majority of us aren't for that stuff.
00:56:42.440 I've said this so many times on my show.
00:56:44.440 And then, once we destroy them, then we can go.
00:56:47.540 We can argue about abortion and the Florida law and all.
00:56:50.580 We can do all the old-fashioned arguing we used to do over politics, but this is where
00:56:55.860 metaphors can either illuminate or lead us astray.
00:56:59.180 And as the linguist George Lakoff said long ago in a brilliant book called Metaphors We
00:57:03.420 Live By, we think about argument using the metaphor of war.
00:57:06.600 And you just did that.
00:57:07.680 Why can't we destroy them?
00:57:09.220 Now, if we literally mean, why can't we destroy them?
00:57:11.860 What you mean is either kill them or lock them up or cut out their tongues or something so
00:57:16.940 that they can't talk.
00:57:18.040 I mean, hurt them so they will stop doing this.
00:57:21.120 Wait a second.
00:57:21.800 If you hurt people, are they going to stop doing it?
00:57:24.540 There's no, again, there's no way to win a culture war.
00:57:26.880 Look at Disney.
00:57:28.620 Well, yes, you can have victories.
00:57:32.540 You can have Pyrrhic victories.
00:57:34.060 And this is something that the left has a lot of.
00:57:36.360 So a Pyrrhic victory is from a story in ancient Greece where a general,
00:57:40.800 I believe, you know, wins the battle, but he loses so many men that he ultimately
00:57:44.920 loses the war.
00:57:45.700 And so I think what we're seeing is, you know, my argument is that while the
00:57:52.900 Republican Party has in many ways gone off the deep end more than the Democratic Party,
00:57:57.200 the cultural left has gone off the deep end much more than the cultural right.
00:58:00.660 That's the asymmetry I wrote about.
00:58:01.900 And that's what you're talking about is you've got all these institutions where the left is
00:58:05.800 doing all these things.
00:58:06.680 And what I'm arguing is that you can't beat something with nothing, and you
00:58:14.080 can't shut people up by hurting them.
00:58:16.220 What we need, I think, is a much clearer notion.
00:58:18.920 We've got to all start talking about professional responsibility, a sense of duty, a sense of what
00:58:25.160 are you here for?
00:58:25.800 What is your job?
00:58:26.520 And if you're at a university, as a faculty member, your job is to do research and
00:58:32.360 find the truth.
00:58:33.780 And as a teacher, it's to educate and bring up students.
00:58:37.300 If you're a journalist, again, it's to find the truth, but using very different methods.
00:58:41.020 So each institution has a telos, the Greek word for end or purpose.
00:58:45.920 I think we're not going to end the culture war just by silencing our opponents.
00:58:50.360 We've got to build something positive instead.
00:58:52.480 We have to develop, the middle 80% of us, if we can develop a notion of basically, do
00:58:56.640 your job.
00:58:57.600 We live in a diverse world.
00:58:58.960 We live in a world with people who have different views.
00:59:01.180 Yeah, well, I had a comedian on the show not long ago.
00:59:03.520 I think it was Ryan Long who said, if everybody could just do their job instead of feeling the
00:59:09.540 need to cross lanes and judge and comment on everybody else's job, we'd be a lot better
00:59:14.460 off.
00:59:14.700 But let me turn the camera back on you because you've been fighting for these principles
00:59:20.560 for a long, long time.
00:59:22.280 You've been living them.
00:59:23.280 I mean, I was so impressed at the number of organizations that you're a part of that I
00:59:26.720 love, you know, Heterodox Academy and FIRE and all, we'll get to the Let Grow project
00:59:32.600 about children and so on, but it's not working.
00:59:35.720 I mean, academia has been lost.
00:59:38.320 And so what makes you think there's hope for it?
00:59:41.120 So you're right that it has not been working so far.
00:59:45.740 It is true that the concerns that Greg Lukianoff and I had in 2014, 2015 have spread far beyond
00:59:52.420 the university.
00:59:53.580 The universities have bought into a certain mindset that has brought them away from their
00:59:58.140 core mission.
00:59:59.420 So if we were to just say, okay, this is it.
01:00:01.920 This is, let's evaluate where we are now.
01:00:03.240 I'd have to say the trends have been against us.
01:00:06.780 But here's a reason for hope.
01:00:09.760 When I wrote the Atlantic article that came out six weeks ago today, actually, I was expecting
01:00:16.040 to get attacked from the left and the right and nobody attacked me at all.
01:00:19.340 In fact, hundreds of people wrote me, just regular people just wrote me thank you notes
01:00:23.500 saying, thank you.
01:00:24.920 I'm completely exhausted.
01:00:26.480 What is happening to our country?
01:00:27.760 I think what we're seeing is we had sort of mounting insanity throughout the 2010s.
01:00:33.140 The pendulum kept swinging and swinging and swinging, and there was no sign it was going
01:00:36.040 to swing back.
01:00:37.200 And then, of course, after George Floyd and that year of COVID, things went even further
01:00:41.080 and a lot of schools implemented, like, Ibram X. Kendi-style programs.
01:00:44.520 They had very bad results, generally.
01:00:46.240 A lot of things backfired.
01:00:47.880 And I think what we're seeing now is that most people are recognizing this is crazy.
01:00:51.820 This is just completely crazy what's happening to us.
01:00:53.740 Um, so I am perceiving, like, look at, for example, the New York Times. The New York
01:00:58.440 Times dared to publish an editorial praising free speech.
01:01:02.980 Now they were attacked widely for it.
01:01:05.080 Uh, you know, isn't that just speech for racists?
01:01:07.220 Um, but they did it.
01:01:08.480 And I, we're seeing this more and more that the leaders of organizations who are generally
01:01:12.820 true liberals, that is they're on the left and they believe in free speech and freedom
01:01:16.880 of association.
01:01:17.920 Um, they've been intimidated, uh, and pushed around.
01:01:22.180 But I think we're beginning to see more of them stand up.
01:01:24.620 Uh, we're seeing corporations like Netflix announcing, you know what, if you can't work
01:01:28.360 on a project that doesn't share your values, maybe you shouldn't work here.
01:01:31.860 I think we're going to see in the next few months, a lot of companies, a lot of companies
01:01:35.180 taking that line.
01:01:36.380 Yeah.
01:01:36.560 That was great to see.
01:01:37.640 I felt so heartened by that.
01:01:39.020 And you know what, frankly, it's what Sirius XM, for example, has already
01:01:42.940 been living.
01:01:43.540 You know, Sirius has got lefties on its lineup.
01:01:46.520 It's got righties.
01:01:47.380 It's got people who are somewhere in between.
01:01:49.160 Um, that's the principle of the organization, you know, let, let more conversations happen.
01:01:55.800 There's a huge marketplace for ideas here and you can go to the ones that you agree with.
01:01:59.740 You can go to the ones you disagree with, but that's the American way.
01:02:03.020 They never lost sight of that.
01:02:05.080 Netflix did.
01:02:06.060 And I think it took the Dave Chappelle crisis to remind them of their core mission and of
01:02:11.380 what American people want.
01:02:12.860 They don't want Netflix to be just this woke corporation
01:02:15.780 that's shoving social messages down our throats that all align with one worldview.
01:02:20.100 That's not a winning business model.
01:02:21.860 I think they're starting to get that.
01:02:25.240 That's right.
01:02:25.960 The American people ultimately have a good sense.
01:02:28.080 And while it seems as though everything has been moving in one direction since around 2014,
01:02:32.600 um, I do think that now, and, and here it's incumbent on people to stand up for principles,
01:02:38.040 to stand up for the professional responsibilities, but to do it in a way that doesn't just trigger
01:02:42.380 more outrage.
01:02:43.120 This is my fear, that the dynamics of polarization and of culture war are, hey, I'm so mad at you.
01:02:48.380 I'm going to hit you hard.
01:02:49.080 And then you get mad.
01:02:49.760 You hit me hard.
01:02:50.840 Um, breaking out of the cycle, carrying ourselves with more dignity, more civility, still standing
01:02:56.460 up for principles.
01:02:57.440 I think in the long run, um, I think this is the way to go.
01:03:01.920 I'm not ceding that point.
01:03:03.300 I still think I could be wrong.
01:03:05.280 I've got the Braveheart face paint on when it comes to those cancel culture warriors and
01:03:09.160 woke university presidents who are casting judgment on everybody.
01:03:11.760 But I'm open-minded, as always, to the possibility that I may be the wrong one.
01:03:16.360 Can we spend one minute on Trump?
01:03:17.960 Cause I want to get into solutions and what you actually think might help.
01:03:20.600 And there's an interesting law on the books.
01:03:22.620 Well, not on the books, but being proposed in California that might help with the kid
01:03:26.960 internet stuff that we can talk about too.
01:03:29.100 Um, it's the first time I've ever seen a law in California that I think I might get behind.
01:03:32.660 Um, but I do think Trump is interesting because there's a line, um, from your article
01:03:37.360 where you talk about how you date, sort of, the crisis peaking to the years
01:03:42.820 between 2011 and 2015, years marked by the great awakening on the left and the ascendancy
01:03:46.740 of Donald Trump on the right.
01:03:47.960 Then you write, Trump did not destroy the tower, meaning the tower of Babel, as we've
01:03:52.180 discussed, he merely exploited its fall.
01:03:54.320 Then you add, he was the first politician to master the new dynamics of the post-Babel era
01:04:00.080 in which outrage is the key to virality.
01:04:03.660 So good.
01:04:04.400 Uh, stage performance crushes competence.
01:04:08.100 Uh, I really related to that.
01:04:10.540 And then you say, and in which Twitter can overpower all the newspapers in the country
01:04:13.700 and so on.
01:04:14.700 That's it.
01:04:15.420 That was like, as a politician, forget his policies as a politician.
01:04:19.820 That was the thing he got before anybody else got it.
01:04:24.820 That's right.
01:04:25.900 Yes.
01:04:26.380 Um, because in the mass media age, where there was some sort of professionalism
01:04:31.920 in politics and journalism, you can question how good it was, but you know, if someone said
01:04:35.880 something atrocious, that could be the end of their campaign.
01:04:38.100 There were certain principles and rules and processes that we understood from what you might
01:04:42.140 call the pre-Babel era, when it was possible for a narrative to emerge about Jimmy Carter
01:04:47.600 or Paul Tsongas or whatever candidate, you know, um, there was a shared narrative that
01:04:53.440 could emerge, but Trump wasn't paying attention to any of that.
01:04:57.760 He was there tweeting.
01:04:59.260 Um, he was there saying outrageous things.
01:05:01.380 And I think if he had run four or eight years previously, I don't think he could possibly
01:05:05.440 have gotten the nomination.
01:05:07.040 Um, he just happened to come in at this time when everything was shredded.
01:05:11.740 There is no overarching narrative.
01:05:13.900 Um, there's wide distrust in institutions and we have such high, it's called negative partisanship.
01:05:19.780 That is, Americans since the early 2000s, we don't vote for the
01:05:24.600 candidate we want.
01:05:25.560 We vote against the candidate we hate.
01:05:27.900 And so Trump was perfect for that dynamic.
01:05:30.300 Um, Trump benefited from it.
01:05:32.240 Now, the fact is that similar things happened in Canada and the UK, certainly
01:05:36.100 the universities are identical in Canada and UK, the teen mental health crisis is identical
01:05:40.220 and they didn't have Trump.
01:05:42.000 Uh, now I do think that Trump made our politics much more coarse.
01:05:44.860 I think he greatly amplified, uh, polarization.
01:05:47.840 I think he certainly drove people on the left insane, making them say and do things that
01:05:51.640 then, uh, you know, they attack people on the right with extra passion.
01:05:54.920 And that's how our culture war got so much more heated.
01:05:57.320 We're much more polarized than any Western democracy.
01:05:59.540 And that's partly why we're in such trouble now.
01:06:02.140 Hmm.
01:06:03.440 Uh, I don't, I mean, I can't disagree with any of that.
01:06:05.380 I think he, um, he saw the seam in the story and got himself in there and then totally
01:06:09.380 exploited it.
01:06:10.460 Uh, and it's why nothing could touch him.
01:06:12.400 You know, he called it the Fifth Avenue rule, that he could shoot somebody on Fifth Avenue and
01:06:16.100 wouldn't lose any supporters. But you saw it happen time and time again with the
01:06:19.640 crazy things that emerged about him or from him during the campaign that didn't
01:06:23.540 touch him.
01:06:24.340 Uh, but I think that's interesting.
01:06:25.560 I do.
01:06:25.960 It concerns me that all the future politicians are going to think they have to do all the
01:06:29.940 same stuff that I would argue.
01:06:31.780 Yes.
01:06:32.120 Maybe it got Trump elected.
01:06:33.240 Yes.
01:06:33.460 It telegraphed to the base that he didn't care what the old party thought or did, but if
01:06:39.540 it continues, it scares me about what we're going to get on an ongoing basis in
01:06:44.160 the White House, cause at least Trump did wind up having some good policies, at least
01:06:47.580 from my standpoint.
01:06:48.820 Um, I don't know whether the next guy will, or whether he'll just be skilled at dividing
01:06:52.600 us, fighting back, flipping the middle finger and, you know, getting himself in the office.
01:06:59.700 Yeah.
01:07:00.280 Yeah.
01:07:00.680 I agree with that.
01:07:02.520 Um, and that's what's appealing about somebody like a Glenn Youngkin, right?
01:07:05.320 He seems like his little fleece sweater vest seems unthreatening though.
01:07:08.480 We'll see.
01:07:09.560 You never know.
01:07:10.560 I can't judge the book by its cover.
01:07:12.460 Okay.
01:07:12.900 So let's go back to, um, social media because one thing we didn't talk about was Instagram
01:07:18.500 and how in particular, this is a pernicious force.
01:07:22.480 I mean, Twitter we know, Facebook we know, Instagram has been outed by many, including
01:07:26.920 the whistleblower, and you've written a long piece on this too, about just how
01:07:32.400 bad it's gotten.
01:07:33.040 Cause it's not just, wow, they're really divisive.
01:07:35.600 Wow.
01:07:36.040 They're really undermining institutions and the faith of Americans in each other
01:07:42.320 and in their country. They're actually seriously causing mental health problems that
01:07:47.840 may be fatal.
01:07:48.780 That actually may be fatal in a lot of cases with young girls in particular.
01:07:53.240 It's not to put it all on Instagram.
01:07:54.840 You know, I like to believe I'm raising healthy children, that with Instagram there's only so much
01:07:58.820 damage you could do, but, um, I'm sure that's what most of the families believe.
01:08:04.080 So can you spend a minute on them?
01:08:06.320 Sure.
01:08:06.900 Um, so first I'd encourage listeners and viewers to go to, uh, I just created a page where I
01:08:11.860 put all of my top resources for social media.
01:08:13.880 So if you go to jonathanhaidt.com slash social media, all one word, um, I've put there my
01:08:20.140 Atlantic articles.
01:08:20.980 I gave testimony in front of a Senate committee two weeks ago, where I created a document
01:08:25.180 that laid out:
01:08:25.680 What exactly does the data say?
01:08:27.540 What's the evidence that social media is a cause of this problem?
01:08:31.020 And so what the evidence shows clearly is that rates of anxiety, depression, self-harm
01:08:36.080 and suicide were relatively flat, um, in the early two thousands.
01:08:40.460 And then around 2010 to 2012, there are very sharp upturns in all of those graphs, especially
01:08:46.080 for girls.
01:08:47.140 Um, suicide is up for both, certainly, but self-harm is primarily for girls.
01:08:50.860 And it starts very suddenly around 2012.
01:08:53.280 Um, and so that certainly points to social media as the cause, but the question is, correlation
01:08:58.680 doesn't prove causation.
01:09:00.140 There've been a lot of previous moral panics over television and video games that turned
01:09:04.200 out not to have really been correct.
01:09:06.760 Um, so I've been focusing on gathering all the academic research together to get a sense
01:09:11.220 of what's the evidence.
01:09:12.300 And it turns out the evidence is a lot of correlational studies that kids who use it more, especially
01:09:16.320 heavy users, are two to three times more likely to develop depression or anxiety
01:09:20.760 disorders.
01:09:21.380 So there's correlational evidence.
01:09:22.740 There's experimental evidence.
01:09:24.080 When you randomly assign people to use social media either more or less,
01:09:28.320 you generally see either, uh, you know, a downturn or an upturn in their mental
01:09:32.760 health.
01:09:33.400 And there's eyewitness testimony.
01:09:35.040 Ask any group of girls, and studies have done this:
01:09:37.380 why do you think that, uh, depression is rising?
01:09:40.440 They'll say it's social media.
01:09:41.580 So if you have all these sources of evidence, um, I think it is pretty clear that social media
01:09:47.180 and particularly Instagram is bad for girls' health.
01:09:49.620 The thing to really keep in mind is it's not just being on a screen.
01:09:53.260 It's especially, I believe I can't prove this part, but I think the most active ingredient
01:09:56.640 is when a girl puts a photo of herself up and waits for strangers or even friends, just
01:10:02.360 for people to judge her and comment on her.
01:10:05.000 There's new evidence that when girls do this during puberty, when you're going through
01:10:09.520 puberty, 11 to 13, that's when there's maximum damage.
01:10:12.460 So what I'm proposing in my article is, the age of internet adulthood was set to 13,
01:10:17.940 crazily, back in like 1997,
01:10:19.660 I think it was. That's way too low.
01:10:21.980 Um, it needs to be 16, and it needs to be enforced, 16 or 18, but we can't have kids, especially
01:10:26.240 girls going through puberty, self-conscious, so uncomfortable in their bodies, putting
01:10:32.160 photos out there waiting for validation.
01:10:33.700 And then what if someone else gets more validation?
01:10:35.480 What if someone, your friend is more beautiful than you because of filters or whatever?
01:10:39.360 So, uh, I think the evidence that these visual media, especially Instagram, are harmful for
01:10:43.440 girls' mental health is now pretty compelling.
01:10:46.240 So how would that be put into practice?
01:10:49.880 Like I mentioned, California has got a law being proposed right now, uh, that would crack down on
01:10:53.640 the social media companies
01:10:54.300 with respect to
01:10:58.420 children and would make it tougher for them to do to them what they do to us in terms of
01:11:02.920 the addictive nature, uh, tracking them everywhere.
01:11:06.660 Um, things like autoplay where the next video just comes up and, you know, makes you want to
01:11:11.100 click on it.
01:11:11.940 Uh, notifications past a certain time of night.
01:11:14.740 I mean, those all make sense, but how would it work?
01:11:17.600 So when a 13 year old gets an iPad, you as the parent would have to program in
01:11:22.420 that this device belongs to a 13 year old,
01:11:26.220 hello, not a grownup, and just that information? Or you'd have to, as a parent, like type in
01:11:31.980 the restrictions you want?
01:11:32.980 Cause we already have some restrictions we can put on.
01:11:34.940 Yeah, no, it has to be that the default is that kids can't get on until 16.
01:11:39.380 And there are a lot of schemes to do this.
01:11:42.060 So, um, for example, you know, suppose anybody could go to Twitter or
01:11:47.960 Facebook or any of these sites,
01:11:52.320 and suppose you could open an account, um, but you have to get verified to show
01:11:59.480 that you're old enough to be using the platform, especially if you want to post. That's the most
01:12:02.520 damaging thing.
01:12:03.160 And 10 years ago, it was like, well, how are we going to know?
01:12:06.220 Like, how can you possibly know that the kid is, you know, 16?
01:12:09.180 Like, are they going to have to show the driver's license?
01:12:10.960 But now there's all kinds of companies that figured out how to do this.
01:12:14.300 So the banking industry, the gambling industry, there's all kinds of companies that figured
01:12:17.860 out how to verify identity,
01:12:20.160 how to verify age, in ways that don't involve taking your driver's license and giving it to
01:12:24.980 Facebook.
01:12:25.960 So the industry is very creative.
01:12:27.020 There are lots of ways to do this.
01:12:28.700 Um, and here's the thing. I don't know if your kids are on
01:12:33.820 yet, your 12-year-old, but both of my kids, when they entered sixth grade, said, Daddy,
01:12:38.020 can I have an Instagram account?
01:12:39.140 Or at least my son did. Uh, you know, everyone has an Instagram account.
01:12:43.840 Um, and the only reason everyone has one is because everyone said to their parents, mom,
01:12:48.500 can I have an Instagram account?
01:12:49.240 Cause everyone has one.
01:12:50.040 We're all caught in a trap.
01:12:50.980 And that's the central idea of that documentary, The Social Dilemma.
01:12:55.240 So we've got to break the trap.
01:12:57.020 Um, and so as long as we can keep most kids off until 16, even if a few are able to sneak
01:13:01.460 on, that doesn't matter because that won't put pressure on everyone to be on.
01:13:05.220 We've got to break that social trap.
01:13:06.980 Yeah, no, I took the road less traveled on this one and I said no. I mean, my kids
01:13:13.620 are still young, but my 12-year-old, he has a phone and just a phone, and there's
01:13:18.880 no social media for him, and there won't be for any of my children.
01:13:21.820 And you know, I hadn't even considered 16 as an opener.
01:13:25.840 I was thinking, enjoy college and good luck, but certainly no time before then, because
01:13:31.720 it's just too damaging.
01:13:33.660 I just don't see the upside.
01:13:34.740 Now, if we get to the point where every single kid in the class is on some social media app,
01:13:38.920 I guess we'd have to reassess it.
01:13:40.560 I mean, I had one guest come on and say the thing that you really want
01:13:43.900 to avoid is do not let Snapchat or Facebook or Twitter or one of these become the main
01:13:50.160 place where they text, because these apps are not dumb.
01:13:54.280 So Snapchat has the ability now to create group chats and group texts so that they go
01:13:58.600 through the app to do all their party planning and so on.
01:14:01.700 And that's the absolute worst thing you could do.
01:14:03.640 So that would be a hard line.
01:14:04.820 But right now, John, I'm kind of in a good place because I'm a public figure and I basically
01:14:08.340 told them, you know, do you like eating?
01:14:10.480 Because you won't be able to if you go on those websites and post something stupid.
01:14:15.900 And, you know, that's, they kind of accepted that.
01:14:19.500 Yeah.
01:14:19.860 So I can offer some advice to all the parents out there.
01:14:23.760 So the first is please go to letgrow.org.
01:14:26.980 It's an organization that I co-founded with Lenore Skenazy, a wonderful woman who wrote this
01:14:31.480 brilliant book, Free Range Kids, about how to give your kids a childhood where they'll
01:14:35.240 develop autonomy.
01:14:36.220 They'll learn how to take care of themselves and how to have conflict and cooperation.
01:14:40.480 So at letgrow.org, we've got lots of ideas, lots of suggestions.
01:14:44.320 What I can add as a social psychologist is we can each put controls on our own kid, but
01:14:49.320 our kids, when they're teenagers, what matters most to them, of course, is their friends,
01:14:53.860 what other kids think of them in their grade.
01:14:56.020 That's what matters most to them.
01:14:57.860 And so if your kid is the only one who's not on, that will be painful.
01:15:01.320 That kid will be excluded.
01:15:02.520 Now in the long run, maybe that's good, but it will certainly be painful along the way.
01:15:05.340 Far better is if you can really make an effort to find some other
01:15:10.460 parents that share your idea, especially parents close enough that your kids can walk back and
01:15:15.680 forth,
01:15:15.960 walk to each other's homes.
01:15:17.960 So there's all kinds of ideas in Let Grow.
01:15:20.160 There's all kinds of ideas on my website, jonathanhaidt.com slash social media.
01:15:24.780 This is a social dilemma and we have to work together to break it.
01:15:28.080 It's hard for us as individual parents to keep our kids away from these platforms.
01:15:33.720 We have to try.
01:15:35.680 It's so hard, you know, I mean, I can definitely see a situation where somebody says, you know,
01:15:40.680 they're all drinking or they're all smoking pot, you know, or they're all vaping.
01:15:46.760 And, you know, if your kid's the only one, he's not going to get invited, to which I'd be
01:15:50.520 like, too bad, you know?
01:15:52.680 So social media is different though, because it's basic communication.
01:15:56.060 It's the way you set up a party.
01:15:57.540 It's the way you set up a play date.
01:15:59.240 I mean, it really can be exclusionary if they're all on one app and your
01:16:03.980 kid's not on it.
01:16:05.300 But I also plan on being like a little Inspector Clouseau if they ever get on. I mean, I
01:16:09.900 am going to spy on everything and get ahead of problems, because I don't really
01:16:16.100 believe in trust when it comes to your teenage kids and social media.
01:16:19.260 I believe in spying on them.
01:16:21.280 Am I wrong, John?
01:16:22.200 Well, yeah.
01:16:23.460 But this is the difficulty: at what point do they learn to moderate
01:16:27.620 themselves? Now, these platforms are so powerful,
01:16:30.780 the law of reinforcement is so powerful, that if you don't do any
01:16:35.840 monitoring, your kids are likely to lie.
01:16:37.880 Look, they all learn to lie.
01:16:39.640 Whenever you open the account, you just lie about your age.
01:16:41.520 They all learn that.
01:16:43.000 So what is the lesson we're teaching them?
01:16:45.180 I agree with you.
01:16:46.360 You have to monitor it, especially early on, especially when they're going through
01:16:49.500 puberty. You know, at 11, 12, 13, 14, kids must not be on social media, especially
01:16:54.320 on Instagram.
01:16:54.920 Um, but at a certain point before they go to college, you have to give them more autonomy.
01:16:59.880 Um, my son, uh, knew that he couldn't have an Instagram account in middle school, but
01:17:03.760 he's very responsible, very conscientious.
01:17:06.340 Uh, when he joined the track team in 10th grade, uh, now he's, he's with a group of friends.
01:17:11.040 He just went ahead and opened an Instagram account by himself.
01:17:13.100 Didn't ask me for permission, but that was appropriate in my family.
01:17:18.020 Uh, cause he's really earned it.
01:17:18.020 Um, and so I don't spy on my son.
01:17:20.220 Um, I, you know, I do trust him and, uh, you know, maybe he'll betray that trust, but
01:17:24.520 of course, you have to go with what your kid is like and what the situation
01:17:28.600 is.
01:17:28.880 But, you know, the job of a parent is to work him or herself out of a job.
01:17:33.680 Um, that's something Greg and I say in our book, and it's hard with social
01:17:37.880 media, but we have to try.
01:17:39.380 I know, my God, I'm in denial about that fact.
01:17:41.700 Mine are still relatively young and it is painful to think about, but, uh, I know you
01:17:46.660 are right.
01:17:47.020 I want to talk a little bit about the approach that you
01:17:52.320 were speaking of in the Let Grow project, because I'm a big believer
01:17:55.840 in it, and the total lack of autonomy for children today is a massive problem, and it's feeding
01:17:59.780 into this.
01:18:00.420 Uh, we'll pick it up there with Jonathan Haidt after this quick break.
01:18:03.280 Don't miss a moment.
01:18:04.740 Don't go away.
01:18:09.840 John, the, the three proposals that you have among, among others, but the three to sort
01:18:14.140 of address some of these issues, the, the catastrophic fall of the tower of Babel include
01:18:20.620 harden our democratic institutions, reform social media and prepare the next generation.
01:18:29.180 So let's go through those.
01:18:31.060 Um, what do you mean by harden our democratic institutions?
01:18:33.740 Um, so the key to a healthy democracy is having good institutions.
01:18:39.320 This is what distinguishes stable democracies.
01:18:41.220 If you go around the world, those places settled by Great Britain tend to have more
01:18:44.820 stable democracies than those places settled by Spain, for example.
01:18:48.680 Um, and so, especially now that we're going through rapidly rising political polarization
01:18:54.500 and cross-party hatred, and we're seeing some beginnings of political violence, which is
01:18:58.300 very frightening.
01:18:58.760 Um, we have to make sure that our democratic institutions are trusted and trustworthy and
01:19:04.560 that they can function even if things get a lot worse in terms of cross-partisan hatred.
01:19:08.960 And so, for example, I'm so disheartened by what has happened with the Supreme Court
01:19:13.940 in terms of, as I see it, and I'm a nonpartisan centrist, what Mitch McConnell did
01:19:19.800 in denying Obama a Supreme Court nomination. I think it was a hardball move
01:19:25.920 that damaged the legitimacy and the respect of the institution.
01:19:29.060 That was very bad for the Supreme Court, for the country.
01:19:31.020 Um, and now where we are is, uh, you know, many people on the left are not going to trust
01:19:35.780 the Supreme Court.
01:19:36.420 They don't think the current makeup is what it should be.
01:19:38.700 Anyway, whatever you think about it, my point is just that it should not be as much of a
01:19:43.800 partisan game of timing,
01:19:45.100 the fact that we're picking judges based on, you know, how old they are.
01:19:49.600 No, just an 18-year term.
01:19:51.600 A lot of people have talked about this.
01:19:52.880 Everyone should have an 18-year term.
01:19:54.540 Every president gets an appointment every two years, things like that.
01:19:58.060 If we do that, that just regularizes the process.
01:20:01.280 Now, of course, there's always politics in the process, but we're
01:20:05.640 not fighting to the death because so much is at stake over each appointment.
01:20:09.440 And gerrymandering of electoral districts, there's just a lot of things we
01:20:13.620 can do. You know, you want the Yankees and the Red Sox to have a good baseball game.
01:20:17.460 You don't want, say, the Yankees, or
01:20:21.180 the Red Sox, to get to control all of the rules. That'd be insane.
01:20:22.420 We've got to fix the game and then we can have the two teams play, play ball.
01:20:26.540 Yeah, it's hard to find nonpartisans to oversee something like elections.
01:20:31.500 I know that's one of your things, like make sure we have somebody in a position
01:20:34.460 of trust to oversee the fairness of elections.
01:20:36.640 It's hard to find nonpartisans right now.
01:20:38.480 It's like the secretary of state.
01:20:39.660 That's always a partisan person and they tend to, you know, push it for whatever side they're
01:20:45.120 aligned with.
01:20:45.980 But I don't know, the people who are truly nonpartisan don't get
01:20:50.700 involved in government.
01:20:52.320 True, but it doesn't have to be that every person is nonpartisan.
01:20:55.820 Suppose you had a commission to draw electoral districts in your state and the rule is you
01:21:01.720 go for generally compact, you know, you can't have long stringy districts, generally compact.
01:21:07.140 Now you have some people on the left, some people on the right, but they're not far left
01:21:10.380 or right.
01:21:11.120 And of course they're partisans, but they also live in the same town.
01:21:15.320 Maybe, you know, they have a lot in common.
01:21:18.400 They can work it out just as the jury works things out.
01:21:21.060 So there are ways to do this.
01:21:22.480 And our present system is a mess.
01:21:25.260 I'll add it just as an asterisk.
01:21:26.860 I think that as much as I agreed as a legal matter with Citizens United, I do think it
01:21:31.080 opened up such a floodgate of corporate cash into campaigns that it made individual politicians
01:21:36.820 beholden to like one donor instead of feeling any need to work across the aisle.
01:21:41.880 And so I don't know exactly the reform that's going to solve that.
01:21:45.220 But just because it's constitutional doesn't mean it's good.
01:21:49.440 And it's something we might take a hard look at.
01:21:52.480 That's the right way to think about this.
01:21:54.180 Think about it like if you love America, if you think that America is and has been and
01:21:58.620 should be a beacon to the world about self-governance, that we can govern ourselves.
01:22:02.920 We don't, you know, authoritarians, they can do certain big things well, but in the long
01:22:07.180 run, they fail.
01:22:08.200 We have to succeed.
01:22:09.320 If you want the American experiment to succeed, you've got to think about the rules
01:22:12.680 of the game and whatever it takes.
01:22:15.480 So, you know, we want people running for office to be responsive to their constituents.
01:22:21.120 We want them to have a long-term view.
01:22:23.080 The more they're incentivized to pay attention just to a few rich donors or just to their
01:22:27.220 partisan extremes, the worse our system of government works, and the more China ultimately wins.
01:22:33.360 But I mean, as I say that, and, you know, I said how I feel, but I hear the left saying
01:22:38.380 all the time, you know, hate speech isn't free speech.
01:22:40.720 They said that literally. Our pal Michael Knowles just caused a controversy
01:22:45.160 on some college campus recently.
01:22:47.460 They said he's anti-LGBTQ.
01:22:49.420 He's not; he just doesn't believe in affirming gender pronouns and all that.
01:22:53.940 And they said, oh, no, we believe in free speech, but we just, but hate speech is not
01:22:57.420 free speech, which of course it is.
01:22:59.100 It's literally free speech.
01:23:00.640 And it's the reason the first amendment was created and all that.
01:23:03.520 So they want to burn down the first amendment in the constitution.
01:23:05.740 I don't want to do that.
01:23:06.860 You know, I can just see the consequences, which is why we protect free speech even when
01:23:11.060 it may not be absolutely perfect for the union.
01:23:14.140 Right.
01:23:14.720 Okay.
01:23:15.180 So let's move on to the second bucket of reforms, which is reform social media to make it less
01:23:19.120 toxic.
01:23:19.820 Now, whenever you say reform social media or regulate, people think what we're talking
01:23:24.720 about is the government's going to decide who gets to speak.
01:23:27.600 The government's going to decide what content is legal.
01:23:30.300 No, no, no.
01:23:32.040 Content moderation is an issue.
01:23:34.000 And that's what almost everyone talks about, but what got us into this mess isn't that
01:23:39.240 some people can post crazy conspiracy theories.
01:23:41.640 They could always do that, back before the internet.
01:23:44.660 What got us into this mess is that now, since 2009, the more outrageous something
01:23:49.800 is, the more likely it is to spread.
01:23:51.580 It's a change in the dynamics of the platforms.
01:23:54.160 That's what really knocked over the Tower of Babel.
01:23:56.920 That's what's doing us in.
01:23:57.940 So in this bucket of reforms to social media, um, it's things like
01:24:06.360 making it so that content doesn't go viral so quickly.
01:24:09.760 So one of the most important things we could do is actually verify identity.
01:24:15.280 It doesn't mean you have to post with your real name.
01:24:16.960 You can still post anonymously, but if you want to post content, well, banks have know-your-customer
01:24:24.200 laws.
01:24:24.480 You can't just open an account with any bank and give a fake name.
01:24:26.740 You have to show who you are.
01:24:27.860 And I think it should be the same on at least the large platforms, the ones that really
01:24:31.980 have an effect on our democracy.
01:24:33.920 Um, you can open an account.
01:24:35.420 You can, uh, see what's going on.
01:24:37.500 No problem.
01:24:37.940 But if you want to reap the advantages of these viral dynamics on a platform that has a special
01:24:43.420 protection from Section 230, the platform has a minimum obligation to verify that you're
01:24:48.060 a human being and not a Russian agent or a Russian bot, and that you're old enough to be
01:24:51.400 using the platform.
01:24:52.180 And actually Elon Musk tweeted this.
01:24:53.940 He said, authenticate all humans.
01:24:55.160 So whether we're just authenticating that you're a human, or authenticating that
01:24:58.600 you're a human and you're old enough, a few things like this would knock out
01:25:01.380 almost all the bots, and it would reduce some of the really nasty behavior.
01:25:05.040 Now, who wants to be in a place where you say something and you're just attacked by,
01:25:09.240 you know, thousands of accounts?
01:25:11.940 We need to make these platforms,
01:25:13.740 if they're going to be important to our democracy, places
01:25:16.720 where we aren't afraid to speak up.
01:25:18.340 So that's the second bucket of reforms.
01:25:20.800 Yeah.
01:25:20.920 And there's a second layer of what do we do to protect our children online.
01:25:23.180 We talked about that.
01:25:25.900 Um, then the third is very interesting and I, I love it and it's a cause near and dear to
01:25:30.140 my heart.
01:25:30.860 Prepare the next generation.
01:25:32.000 And this relates to the Let Grow project.
01:25:34.920 Um, your position is, and I share it entirely:
01:25:39.040 Treating kids as fragile makes them so.
01:25:42.040 That's right.
01:25:43.580 That's right.
01:25:44.100 So we are antifragile.
01:25:46.040 This is a wonderful notion from Nassim Taleb. Uh, you know, glass is fragile.
01:25:50.300 If you drop it, it breaks.
01:25:52.020 Plastic is resilient, but there are certain things where if you drop them, they get stronger
01:25:56.820 and kids are like that.
01:25:58.820 Obviously I'm not saying physically drop your kids, but the point is if you protect your
01:26:02.460 kids, like if you protect their immune systems, they don't encounter bacteria, you're not helping
01:26:06.520 them.
01:26:06.900 You're actually crippling the development of their immune system.
01:26:09.720 And if you protect your kids so that nobody ever teases them, nobody ever insults them,
01:26:13.060 they have no conflicts,
01:26:14.160 You're crippling their emotional development.
01:26:16.480 So, um, we have to prepare them for a world in which a lot of people don't share their
01:26:20.300 opinions and sometimes they'll criticize them.
01:26:23.180 And that used to happen on the playground.
01:26:24.820 Now we got concerned about bullying, and of course bullying is a real thing; especially
01:26:29.860 when it goes on for multiple days, it ruins a kid's life.
01:26:32.420 So it's a fine line between preventing bullying and preventing conflict.
01:26:36.480 We have to allow unsupervised conflict.
01:26:38.660 We have to, kids have to have a lot of unsupervised experience.
01:26:41.040 Um, and you know, what we did in 2010 or so, when they're supposed to have a lot of
01:26:45.780 experience, is we put them on experience blockers.
01:26:47.460 So this here, this is an experience blocker.
01:26:49.300 Once you have it, um, once your kid is
01:26:54.060 on an experience blocker, they're not going to have the normal sorts of conflicts.
01:26:57.580 Everything's going to be mediated through
01:27:00.460 the phone.
01:27:01.420 So, um, so we have to attend to child development.
01:27:03.920 We have to give them a lot more unsupervised experience.
01:27:05.680 That's what Let Grow is all about.
01:27:08.100 I love that.
01:27:08.940 I'm writing that down: in 2010,
01:27:10.260 we put them on experience blockers.
01:27:11.840 That's exactly right.
01:27:12.640 Forget puberty blockers.
01:27:14.020 We have our kids on experience blockers and it's the phone that you have right in your
01:27:17.280 hand that you let your kid use or the one you gave him or her.
01:27:20.280 Um, this is so important to me.
01:27:22.280 Okay.
01:27:22.580 So both of these two and three in terms of your reforms are getting at one of my questions
01:27:29.340 here, which is, and it relates to the entire discussion we've had over these two hours.
01:27:32.480 The, the means to kill somebody socially can very much be found in social media.
01:27:41.220 You know, the Twitter mob piles on with the retweets and so on.
01:27:45.140 And they take somebody down, they cancel somebody, they ruin somebody's life.
01:27:48.200 The means to kill is very much embedded in social media, but, but the desire to kill,
01:27:54.120 is it new?
01:27:56.380 Was that always there?
01:27:57.680 Was it latent and sitting there?
01:27:59.720 And we just finally found the means to, you know, express it or is the desire to kill
01:28:05.980 amongst this younger set in particular related to the things we're talking about?
01:28:11.880 If we get our kids, you know, quote prepared, and if we don't treat them as fragile and if
01:28:17.420 we expose them to different ideas and if we do all the things, are they going to be less
01:28:22.400 likely to want to use the social media for evil and, and cruelty in this way?
01:28:27.260 Yeah.
01:28:28.260 Well, I think if they're mentally healthy, um, uh, I think they will be stronger and kinder.
01:28:35.160 And I think if you are, uh, anxious, insecure, and fragile, you're more likely to seek solace
01:28:40.560 and comfort in a, in a mob, in a movement, in a group.
01:28:43.500 Um, and when that group engages in something, you're going to want to fit in with that group.
01:28:47.480 You're not going to have the guts to stand up against it.
01:28:50.060 Um, so I think for so many reasons, look, our kids' mental health is plummeting, and
01:28:54.480 this is a humanitarian crisis.
01:28:56.060 This is a national crisis.
01:28:57.540 The surgeon general recently put out an advisory basically saying we have a mental health
01:29:02.260 epidemic in this country for teenagers.
01:29:05.020 Um, so I can say that if we give kids normal childhoods and we let them have conflicts and
01:29:10.300 experiences on the playground, let them make teams, let them enforce rules, you know, that's
01:29:14.280 certainly going to be good for their mental health and their development.
01:29:16.400 But I can't say that's going to keep them from being nasty on social media.
01:29:19.920 In fact, look, you know, you and I know a lot of the people attacking us, almost all
01:29:23.300 of them are adults.
01:29:24.020 They had normal childhoods.
01:29:25.580 Um, so preparing them for adulthood alone isn't going to stop the viral dynamics.
01:29:30.600 That's why I keep focusing on the architecture.
01:29:32.960 It's not about content moderation.
01:29:34.600 It's not about saying, oh, you can't say that.
01:29:36.100 You're not allowed to say that.
01:29:37.120 That's, that's a dead end.
01:29:38.280 We're never going to agree on that.
01:29:39.440 It's very difficult to find truth,
01:29:41.520 what is true in a tweet or a post.
01:29:43.900 Um, but we can change the dynamics, because at present, the nastier you are,
01:29:51.500 the more outrageous you are, the more successful you are.
01:29:54.580 Right, I get it.
01:29:56.020 It's the same way you wouldn't go to a kid who's mentally struggling and,
01:30:02.180 with sort of a layout in front of him, say, here are all the ways available to ending
01:30:07.200 your life.
01:30:07.660 No sane human would ever present that to a kid who is struggling.
01:30:11.560 In the same way, we shouldn't have social media companies presenting to them
01:30:15.680 the panoply of ways that their mental fragility can be exploited and used against others,
01:30:23.240 sort of used for nefarious purposes.
01:30:25.780 I want to say this. This is from the Let Grow project,
01:30:29.400 as far as your mission: we reject the idea that kids are in constant physical, emotional,
01:30:34.680 or psychological danger from creeps, kidnapping, germs, grades, flashers, frustration, failure,
01:30:39.100 baby snatchers, bugs, bullies, men, disappointing playdates, and/or the perils of a non-organic
01:30:44.180 grape.
01:30:45.960 Somehow our culture has become obsessed with kids' fragility and it's lost sight of their
01:30:50.920 innate resilience.
01:30:53.060 Let Grow believes today's kids are smarter and stronger than our culture gives them credit
01:30:57.380 for. Um, I love this.
01:31:00.020 I completely agree with this.
01:31:01.320 And I think, what is it?
01:31:02.920 How does it manifest?
01:31:03.920 What should people do today?
01:31:05.220 Cause there's one thing later where you talk about, tell your kid, I've got a homework
01:31:08.760 assignment for you.
01:31:09.880 You go home and you do something new on your own, climb a tree, run an errand, make a meal.
01:31:16.720 Uh, but realistically, you know, how does the parent do it? Cause I think most parents who
01:31:20.300 are like you or like I am, they don't need to be told this, but the people
01:31:23.860 who are hovering a little, who think maybe I am the helicopter parent,
01:31:26.580 maybe I'm not preparing my kid,
01:31:28.360 what are realistic steps they can take to sort of reel it back?
01:31:34.120 So once you recognize that your kid has to learn how to do things on her own, uh, that's
01:31:38.780 your job as a parent.
01:31:40.100 Now you can say, okay, well, let's talk about the things that you could do on your
01:31:43.820 own.
01:31:44.000 And, you know, if you're six or seven and you've never walked the
01:31:46.380 dog on your own, would you like to?
01:31:48.440 Um, so if you sit down with your kid and you say, you know, what are some things that
01:31:52.100 you think you can do?
01:31:53.020 Do you think, you know, you can go to the store and get milk for
01:31:55.680 us?
01:31:56.340 Um, you know, you'll find that the kid actually wants to do things.
01:31:59.340 Now, what we suggest at Let Grow is do this in elementary school and get your elementary
01:32:04.260 school to do it,
01:32:04.920 so all the kids are doing it.
01:32:06.840 Um, and if all the kids are, uh, coming up with something to do on their own, you know,
01:32:11.060 at home, an errand, make dinner for us, whatever it is.
01:32:14.360 It's, it's an amazing thing that happens when the kid does this.
01:32:17.780 Um, they are just bursting with pride, and then once they do it,
01:32:21.880 They want to do it again.
01:32:23.020 Uh, so when my daughter was six, we had her bring me lunch here in New York
01:32:28.820 City.
01:32:29.020 She had to cross a somewhat busy street.
01:32:31.300 Um, and, uh, I was terrified and you know, my wife sent her off and I was waiting at the
01:32:36.380 office and I actually kind of like snuck around the corner to see.
01:32:39.600 But the point is, when she got to, when she got to my office, she was just bursting and
01:32:44.520 it was the most beautiful, beautiful thing.
01:32:46.260 And a few experiences like that.
01:32:47.940 And kids realize, you know what?
01:32:48.980 I can do things.
01:32:50.340 Whereas the way we're raising kids is to believe you can't do anything.
01:32:53.860 Everything's too hard.
01:32:54.620 Everything's too dangerous.
01:32:55.800 So I'll do it for you.
01:32:57.240 And that's the way to raise a kid who becomes depressed, anxious, fragile, and even suicidal.
01:33:01.760 And as you point out, let them have free play with kids of all ages, where there's
01:33:05.760 a social system where you get clipped, you know, before you get too out of
01:33:09.580 line, usually at proportionate levels. And you learn, you learn by taking little risks
01:33:14.580 and having them either rewarded or punished appropriately, or sometimes inappropriately,
01:33:17.840 but like, you learn.
01:33:18.920 And I love the distinction you drew between that experience and chronic bullying, which
01:33:23.260 is much different,
01:33:24.040 and you do need to step in on. Completely agree with all of that.
01:33:27.540 My God, this has been a great, great discussion.
01:33:29.560 I'm thrilled to meet you.
01:33:30.840 I absolutely loved your book and I can't wait.
01:33:32.620 The next one's coming out in '23, which is too long.
01:33:35.480 What's it called?
01:33:38.180 Because it's all about the Tower of Babel.
01:33:38.180 Yes.
01:33:38.840 Yeah.
01:33:39.260 The title is Life After Babel: Adapting to a World We Can No Longer Share.
01:33:44.760 It's about how we live in a world in which there are no shared narratives, where this
01:33:48.480 kind of chaos is going to be with us,
01:33:50.260 as far as I know, for the rest of our lives.
01:33:52.500 So how do we make the best of it?
01:33:53.480 And I think we can.
01:33:54.500 Well, I will.
01:33:54.920 I a hundred percent would love to have you on, uh, when you release it to help promote
01:33:58.060 it.
01:33:58.280 And in between then and now I'm going to be working on taking down the president of Princeton.
01:34:02.660 Just kidding.
01:34:03.340 Just kidding.
01:34:03.840 John, what a pleasure.
01:34:06.780 Thanks for sharing your wit and intellect with us.
01:34:09.340 We appreciate it.
01:34:09.840 Thank you, Megyn.
01:34:10.320 What a pleasure to be talking with you.
01:34:12.140 That was an in-depth discussion.
01:34:13.520 Really enjoyed it.
01:34:13.960 We're all talking about what, are we secretly helicopter parents?
01:34:16.340 Would we let our kids go get us lunch in New York?
01:34:18.440 Anyway, don't miss it tomorrow.
01:34:19.680 We've got the guys from the fifth column and we'll talk to you then.
01:34:23.440 Thanks for listening to The Megyn Kelly Show.
01:34:25.280 No BS, no agenda, and no fear.