A Culture of Fear, Social Media Toxicity, and America's Descent Into Stupidity, with Jonathan Haidt | Ep. 327
Episode Stats
Length
1 hour and 34 minutes
Words per Minute
203
Summary
Jonathan Haidt is the author of The Righteous Mind: Why Good People Are Divided by Politics and Religion, co-author (with Greg Lukianoff) of The Coddling of the American Mind, and author of the Atlantic essay Why the Past 10 Years of American Life Have Been Uniquely Stupid.
Transcript
00:00:00.500
Welcome to The Megyn Kelly Show, your home for open, honest, and provocative conversations.
00:00:11.700
Hey everyone, I'm Megyn Kelly. Welcome to The Megyn Kelly Show.
00:00:15.140
Here's the question for you on a Monday morning.
00:00:17.460
Are we living through a uniquely stupid time in American history?
00:00:22.720
Have you woken up recently and said to yourself, how the hell did we get like this?
00:00:32.080
What's going on? Why can't we talk to each other?
00:00:38.140
Well, we've got some answers for you today, and they're great.
00:00:40.980
They're so insightful. I've really enjoyed preparing for today's interview.
00:00:44.060
Here with us today to explain our societal lapse in intelligence, among other problems,
00:00:48.780
is social psychologist at the New York University Stern School of Business, Jonathan Haidt.
00:00:55.000
Jonathan is also the author of the New York Times bestsellers, The Righteous Mind,
00:00:59.520
Why Good People Are Divided by Politics and Religion,
00:01:01.980
and the absolutely brilliant and just game-changing, The Coddling of the American Mind,
00:01:08.800
which he co-wrote with Greg Lukianoff, who's also been on the show earlier,
00:01:14.340
talking about what's happening at universities and his work to document it and fight for free speech.
00:01:19.480
Now, Jonathan's latest Atlantic column from April is called
00:01:23.580
Why the Past 10 Years of American Life Have Been Uniquely Stupid,
00:01:27.120
and man, did it get everyone talking because of its brilliant insights.
00:01:30.380
His mission is to use research on moral psychology to help people understand each other
00:01:35.180
and to help important social institutions work better so you can sense his frustration
00:01:40.580
because those things are not going so well right now.
00:01:49.480
Thanks so much, Megyn. What a pleasure to be talking with you.
00:01:54.520
Cannot wait to tap into your wealth of intellectual resources.
00:01:58.360
So let's start with The Coddling of the American Mind, because I love that book.
00:02:02.380
It just had such great insights, and it covered a lot of stuff that I'd been covering on the news.
00:02:08.980
And when I think about your latest round of research, I think about the birth of my children.
00:02:14.940
I had a daughter in 2011, and then I had a son in 2013.
00:02:19.500
And so that's the time frame during which we lost our collective ever-loving minds as a country, right?
00:02:25.960
So the mark of their arrival corresponds with the time when our society just went nuts.
00:02:36.080
I know the listeners to this show and the viewers have felt it, but perhaps not diagnosed it.
00:02:43.940
All the stuff that, you know, Greg's been working against on college campuses and that led you guys to write The Coddling,
00:02:49.020
all of that, it's so far beyond what it was when you looked at it.
00:03:01.160
And I know we all know that, but why and how can it be stopped?
00:03:05.480
Because you can't arrest it unless you understand it.
00:03:07.500
That's where your latest round of research and writing and your next book come in.
00:03:12.420
So let's start there on, weirdly, the Tower of Babel.
00:03:17.640
Okay, explain that, what that is, and why that's your focus.
00:03:22.600
So I've been a professor since 1995, and I love being a professor.
00:03:29.320
And it just seemed like all of a sudden, in 2014, something changed.
00:03:33.320
Something changed, like, in the fabric of space-time.
00:03:43.820
We'd met through a mutual friend in May of 2014.
00:03:50.140
Well, it turns out our theory was partially wrong.
00:03:53.400
We thought universities were causing this to happen.
00:03:56.700
And now we know, no, this was a much bigger thing happening that affected Gen Z.
00:04:03.460
So your kids, my kids, my daughter is 12, my son is 15.
00:04:07.560
So I've been trying since 2014 to figure out what on earth happened.
00:04:12.420
Why did so many things change in such weird and strange ways?
00:04:17.420
I think we need metaphors to understand anything complicated, anything we don't directly grasp.
00:04:23.720
And it was when I went back and reread the Babel story.
00:04:26.860
It's, you know, it's a short little story in Genesis.
00:04:29.020
And it's that the descendants of Noah are spreading out across the plain of Shinar, and they decide
00:04:36.140
to build a city with a tower to reach unto heaven.
00:04:41.460
Well, there are a variety of theories I've heard as to why God didn't like this.
00:04:46.720
But in any case, God, he doesn't actually physically destroy the tower.
00:04:50.220
What the text says, and this is the key line, he says, let us go down and confuse their language
00:05:00.320
I found that story again a couple of years ago, and I thought, oh my God, that's it.
00:05:04.660
Because it's not just a story about tribalism, like left versus right.
00:05:09.900
It's a story about how everything has come apart.
00:05:12.740
And if you have a group of people that are entirely on the left or entirely on the right,
00:05:17.500
they're going to fight and find ways to fragment internally.
00:05:21.980
It feels like everything is crumbling since the early 2010s.
00:05:26.020
And in the metaphor, is the new God social media?
00:05:32.420
Or the devil or the whatever you want to call it.
00:05:39.340
And so what I, you know, I think, look, we've all seen dozens of articles about how
00:05:44.480
social media is destroying everything and destroying kids.
00:05:48.100
And I think why my article is a little different is that I'm a social psychologist.
00:05:52.120
And I wasn't just saying, you know, it's bad and here's why.
00:05:55.440
I was really trying to dig into what exactly is it that it did to social relations?
00:06:02.880
And so the story, I think the best way in is to put it in a narrative form, where once
00:06:08.700
upon a time, we had this incredible time of optimism in the 1990s.
00:06:13.560
Those of us old enough to remember know: the 1990s began with the end of the Cold War.
00:06:21.940
And America even had a budget surplus for the first time in a long time.
00:06:26.440
It was this amazing time of techno-democratic optimism.
00:06:33.780
It's going to be liberal democracy from here on in.
00:06:35.840
And then when social media comes in, in the early days, 2004 or so, people think this is wonderful.
00:06:48.160
And 2011, that amazing year, begins with the Arab Spring, where Facebook
00:06:54.240
and a few other platforms helped the people of Egypt and Tunisia.
00:06:57.820
It helped them to bring down those Arab dictators.
00:07:00.580
And we thought democracy is going to break out in the Arab world.
00:07:05.420
And once again, a populist movement, though this one more of a left-wing populist movement.
00:07:10.620
So it was a time of enormous optimism about the power of this technology to help democracy.
00:07:19.300
And now democracy is on the back foot, as they say in Britain, and authoritarians, I know
00:07:26.880
Tristan points out that China is using this technology.
00:07:30.140
To make themselves better authoritarians, as it were.
00:07:33.520
And it's making us worse democrats, worse at democracy.
00:07:37.720
So what I was trying to show is that the central problem with social media is that it doesn't connect us.
00:07:43.480
We can communicate with text and Zoom and phone calls.
00:07:46.900
There's all kinds of ways to really talk to someone authentically.
00:07:49.680
But when you put something out on social media, you're performing.
00:07:52.460
And then you wait to see what everyone says about it.
00:07:54.780
And so it's this incentive that makes us want to perform at others to impress them.
00:08:04.080
And that's what encourages us to just fight among ourselves constantly.
00:08:09.040
And we can't have a democracy if we're just fighting among ourselves all the time.
00:08:14.260
I remember all the good press about social media back in 2011 during the Arab Spring,
00:08:19.760
which also had a different ending than the one that we were hoping for when that broke.
00:08:26.460
But to me, it's almost like, you know, you go out on the date with the person.
00:08:29.820
And on the first date, the person sweeps you off your feet and they're utterly charming.
00:08:36.780
And you're thinking, oh, my God, this is wonderful.
00:08:40.680
This is our relationship with social media as well.
00:08:51.220
Like this is this is the 30,000 foot zoom out on what's happened between us and social media.
00:08:57.220
And by us, I mean our society, our world that goes beyond America, as you do a good
00:09:02.140
job of pointing out every time you get the chance.
00:09:04.680
Is it just American girls that are deeply depressed?
00:09:08.360
If you look at other societies, all these things are spiking in sort of the Western world
00:09:13.020
and the onset of it is always linked back to right when social media arrived. It's
00:09:19.940
not even just the creation of the iPhone in 2007.
00:09:21.940
It's the birth and explosion of social media, which was the game changer.
00:09:31.520
Very suddenly we are disoriented, unable to speak the same language or recognize the same truth, to go back to the Babel metaphor.
00:09:38.680
We are cut off from one another and from the past.
00:09:41.780
So I was feeling this just last week, you know, in the wake of the shooting.
00:09:50.320
You must have been watching this, John. Half the country went to: this is all to be blamed on the right.
00:09:57.220
I mean, there was literally an article in Rolling Stone saying that the shooting was
00:10:00.240
a mainstream Republican type of ideology at work.
00:10:03.700
And then, you know, the right looking at it and seeing something very different and
00:10:07.680
and being, I think, outraged that anybody tried to blame it on a political party as
00:10:13.240
opposed to radicalization of a guy who was drawn to the Internet and wasn't doing well.
00:10:18.120
But it really is at the point where it's just two totally different truths.
00:10:21.380
You know, like if you watched MSNBC last week, you would have thought you were an evil person.
00:10:27.440
You are hashtag part of the problem of massive white supremacy and mass shootings in America.
00:10:31.980
And you would have felt something very different had you tuned into Fox News or any other right-leaning outlet.
00:10:39.040
And you could do that on any given week of the year.
00:10:45.140
Well, so the first thing to keep in mind as we go through this is that most Americans are not like this.
00:10:51.720
Most Americans don't want to attack anyone or destroy anyone's reputation.
00:10:57.460
Most Americans are sick and tired of what's going on.
00:11:00.400
Part of what social media did is it changed who has voice, as it were.
00:11:07.660
It's always the case that the people on the far right and far left are going to be more passionate.
00:11:11.160
Whoever's more passionate is going to talk more.
00:11:13.240
So they're always going to have more voice, more representation than the people in the middle.
00:11:16.880
But when social media becomes very widely used, and this is really around 2011, 2012, is when
00:11:22.540
most people now have a smartphone and they can be on it every day, 10, 20, 50 times a day.
00:11:29.220
When social media becomes widely used and it becomes much more viralized,
00:11:34.420
which we'll talk about in a moment, I hope, the extremes are going to have much more voice.
00:11:38.680
And the middle 80% of the country, we just keep our head down.
00:11:44.520
And so it looks as though we all hate each other.
00:11:47.440
It looks as though all there is out there is extremists.
00:11:50.560
That's part of the hall of mirrors that social media does to us.
00:11:53.920
Our minds evolved to care deeply about what public opinion is.
00:12:00.980
And in a small community, you can actually tell what people are thinking.
00:12:08.360
But in the social media world, we have no idea what people are thinking.
00:12:11.120
All we know is what people are tweeting or posting on Instagram, whatever it is.
00:12:14.180
And that's never representative of public opinion.
00:12:20.100
So let's go through it, because you take on the three major forces that bind successful democracies together.
00:12:25.560
And you make the case that social media has undermined all three in America and beyond.
00:12:31.320
Number one, social capital, which you describe as extensive social networks with high levels of trust.
00:12:39.500
Those get much more interesting as we get into the details.
00:12:42.640
So the first one, social capital, extensive social networks with high levels of trust.
00:12:49.020
So social capital is one of the most common terms in the social sciences, and it refers
00:12:54.240
to the fact that if you have two companies and one has a lot of financial capital, you
00:12:59.440
know, money that they can invest, then if everything else is equal, that company is going to outperform the other.
00:13:06.760
Similarly, if you have two companies or sports teams or towns or nations, identical in all
00:13:12.940
respects, except that in one, people really trust each other.
00:13:17.220
Like if we're having an election, I don't think that you're going to cheat and steal.
00:13:22.360
That country is going to be much more successful than one with low trust.
00:13:26.340
And we've seen this throughout the 20th century.
00:13:29.660
In the communist countries, everyone knew everyone was lying and there was no trust.
00:13:33.840
So America used to have very high trust, up until the 60s or early 70s, equivalent to many of the highest-trust countries.
00:13:44.220
And so if you lose trust, if you lose social capital, now that brings us to the next issue,
00:13:50.580
which is strong or shared institutions, institutions that we trust.
00:13:55.700
A dictatorship is based on the strength of the ruler and the army and his ability to intimidate
00:14:02.540
A democracy such as ours, or a republic, or whatever you want to call it, is different.
00:14:05.660
The founding fathers didn't want a king or a monarch; they believed in government of the people.
00:14:12.500
And so we were lucky to inherit good British institutions, and they then improved on them.
00:14:21.120
Obviously, they've performed badly at certain points.
00:14:23.400
But on a world historical scale, American institutions work very, very well.
00:14:29.980
We trust them less, in part because they are less trustworthy, but also in part because we
00:14:34.360
are just saturated with stories about their failures.
00:14:38.340
Some of those stories are true, some are false.
00:14:40.840
So if we don't trust each other, if we don't trust our institutions, including even our courts,
00:14:46.720
our legislatures, our public schools, if we don't trust them, it's going to be very
00:14:51.860
hard to have a country. We could actually fail as a country.
00:14:56.480
I know. The third is the shared stories.
00:15:02.860
So, the secret to binding people together: you know, in my writing, I'm very interested in cooperation.
00:15:12.020
I look at how other species get cooperation, and it's almost always because they're siblings.
00:15:18.240
You can have, you know, millions of bees or ants cooperating because they're all sisters.
00:15:22.320
Um, humans can cooperate at the level of millions too, but we're not siblings.
00:15:30.480
We have a common understanding of what we're doing.
00:15:40.700
Then we circle around and we worship something together, or we hold it as sacred together.
00:15:47.280
And America got a big boost from World War II and the Cold War.
00:15:50.420
We had a real story of who we were, why we were fighting for good.
00:15:53.780
We had really good, clear, evil enemies in the 20th century.
00:16:02.420
Everyone has their own little fragment of a story.
00:16:04.780
It's almost impossible to knit it together into a common story.
00:16:07.720
And identity politics has given people a new thing to latch onto at the expense of
00:16:14.740
the American story and our history and pride in country, which is problematic.
00:16:21.180
And I know you've been raising the flag on that for a long time.
00:16:23.420
So you write about this; back to the performance issue, because I think it's important what
00:16:27.280
happens on social media and why it's been so pernicious: it's no longer about
00:16:32.680
coming up through middle school and high school and making a verbal error that your friends soon forget.
00:16:41.320
It's about complete fear about what's going to happen to you on social media or actually
00:16:46.400
making a misstep and being absolutely ruined at a young age.
00:16:49.200
And at the same time, it's about posting the perfect selfie on Instagram, as opposed to
00:16:56.800
just spending time with your friends and laughing and swimming in the pool and riding your bike
00:17:02.060
around the neighborhood and all that stuff that actually formed true human bonding.
00:17:07.760
And so I think what we have to do is think here about what a normal, healthy childhood is.
00:17:13.120
In human societies around the world, by the age of around seven, kids are quite independent.
00:17:21.000
They're not being supervised closely by adults.
00:17:23.300
They can bring the cattle down to the river or whatever it is.
00:17:26.240
They can certainly walk to the store and buy a quart of milk.
00:17:28.980
Um, and that was true all the way up into the 1990s.
00:17:32.060
Kids had sort of normal human childhoods, but in the 1990s, America in particular, we freaked out.
00:17:40.200
And this, I think, is partly the saturation of cable TV and full-time, 24-hour news
00:17:45.040
stations, and they focus on scary stories. For whatever reason, in the 1990s, just as crime was plummeting,
00:17:54.100
There'd never been a safer time to raise kids or let them out outside.
00:17:58.060
Just at that time, we decided it's too dangerous, and we say, no, you can't go outside.
00:18:02.760
Uh, we also say after school, no, don't go out and play with your friends.
00:18:06.600
You have soccer practice or guitar practice or whatever it is.
00:18:12.240
And when kids don't get to practice those skills, as you were saying,
00:18:16.480
you say something, you make a mistake, you learn. Kids have to make thousands
00:18:21.040
of mistakes and the consequences need to be very small so that they learn.
00:18:25.360
Imagine if you were trying to teach kids how to do the balance beam, and they go out on the
00:18:29.580
balance beam, but every time they fall, they're going to fall 30 feet into a pit of alligators.
00:18:34.100
Well, they're going to get really, really hurt pretty quickly.
00:18:39.380
So social media has made the consequences of a mistake so high, and you never know if something will blow up.
00:18:48.940
You know, we see clips of kids doing or saying stupid things.
00:18:52.480
So I think it wasn't social media alone that took away childhood; we had already taken it away ourselves.
00:18:58.380
And then social media comes in when kids are heavily supervised.
00:19:01.600
The only place they can get away from adults is actually on, well, video games, which are not
00:19:05.480
as harmful, uh, but also social media platforms.
00:19:08.240
And I think it completely distorts normal childhood learning and it deprives them of
00:19:12.700
the repeated experiences of trying something and failing or succeeding that they need to develop.
00:19:19.060
Yeah, no, truly. I mean, I see it when I have play dates at my house with my kids' friends.
00:19:27.980
And my kids aren't on social media, nor will they be until they're much older.
00:19:33.620
But I just find it so dangerous. And, you know, I understand now it's not the phone per se.
00:19:42.260
And I think if more parents understood that, they'd relax a little. You can still reach your kid.
00:19:48.240
Cause what I hear from all my friends is I need to pick them up.
00:19:51.380
You know, the other day he missed his ride and he called me, and I'm not willing to give that up.
00:19:55.120
And you don't have to. Johnny can have the cell phone.
00:19:58.240
Johnny can play video games and he can text with his friends.
00:20:01.040
You know, there'll just be limits, I think, in terms of the time that
00:20:04.280
that phone is open and available, but Johnny does not need to be on Facebook and Instagram
00:20:09.580
and Snapchat and TikTok and YouTube and all these other forums that are really potentially
00:20:15.840
dangerous to him, A, but also, B, have really pernicious societal effects.
00:20:21.480
I mean, and we'll get to the suicide rates and the depression rates and so on.
00:20:24.920
So it's like, this is not in Johnny's best interest and not in society's best interest.
00:20:29.580
So, but the performance aspect, can you speak to that?
00:20:31.820
Because that's something I think we see a lot as grownups. Or, adults.
00:20:35.680
You can tell I have kids: whenever you hear adults called grownups, it means you have kids.
00:20:38.700
We see that as adults, but it's very present in the face of teens.
00:20:49.180
So if you wanted to train a seal to balance a ball on its nose, you would not explain the trick to it.
00:20:57.240
You would not wait until the seal does it to give the seal a fish.
00:21:01.040
You reward any progress towards the behavior you want.
00:21:05.380
Um, and this is called operant conditioning in behaviorism and psychology.
00:21:10.000
Um, operant conditioning is incredibly powerful.
00:21:12.380
And if you could reinforce your kids within three seconds of them doing a behavior, you
00:21:17.920
could get your kids making their beds every morning within a week.
00:21:21.320
Um, so operant conditioning is very, very powerful.
00:21:23.860
Now, what happened all of a sudden when kids got phones with touchscreens,
00:21:29.340
around 2009 to 2012? You know, the iPhone comes out in 2007, but it's expensive.
00:21:35.620
That's when kids' lives switched from mostly not being on social media every
00:21:40.800
day to having a phone and having most experience come through the phone.
00:21:45.680
Um, and that phone is the most powerful operant conditioning machine ever invented.
00:21:50.960
What's the first word you're saying before conditioning?
00:21:57.060
Pavlovian conditioning, which is like, gets your autonomic nervous
00:22:02.060
system going, like Pavlov's dogs would salivate when they heard a bell. But the way you train behavior is operant conditioning.
00:22:12.280
Now, all of us who have kids, we try to get our kids to sit up at the table.
00:22:21.220
It's very hard to influence our kids, because we don't have a little shock box, like a shock collar.
00:22:27.360
We can't do operant conditioning to train them like a circus animal, but Facebook
00:22:32.480
and Instagram and Twitter and all these other platforms do.
00:22:35.260
And so what we've essentially done is we've given our kids an incredibly powerful operant
00:22:40.340
conditioning machine and the people giving them rewards are total strangers, total strangers.
00:22:45.240
We've given over the training of our children to total strangers.
00:22:47.960
And this, I think is a big reason why Gen Z is in terrible shape.
00:22:53.260
It's not a gradual change from the millennials to Gen Z.
00:22:55.860
It's very, very sudden, right around birth year 1996. Let's say kids
00:23:00.920
born in 1998 are much worse off, much more fragile, much more depressed and anxious than kids born just a few years earlier.
00:23:11.040
And so part of it is that we've taken away childhood.
00:23:15.800
In its place, we've given them this training machine, which is sick, which is inhuman.
00:23:21.400
And there's so much to delve into when it comes to the children and how they
00:23:24.800
use these platforms and what it does to them.
00:23:27.820
But just to take a quick step back, as you were talking about tribalism, really, because
00:23:33.960
you were talking about who really has the microphone.
00:23:36.300
When we go online, you know, and it affects our children and it affects us, who are we listening to?
00:23:41.040
And we've heard, you know, I'm more center-right, but we talk about how the
00:23:47.620
left wing controls media and the left wing certainly controls most social media.
00:23:51.140
Um, and we'll see what happens with Elon, but at Twitter right now, it's a very leftist
00:23:55.460
site, and it's dominated for the most part by leftists, but really just by partisans.
00:24:02.080
And that's indicative of most of social media, which people fail to factor in.
00:24:05.840
I always get a great reminder of this, John, when I go to visit my two close friends in
00:24:09.420
the Midwest, one lives in Chicago, one lives outside of Detroit, um, two girlfriends I met
00:24:14.580
many years ago and you go to the Midwest and sometimes you just get reminded, even though
00:24:18.440
I know it's happening there too, of like normal people, not like crazy New York hard partisans
00:24:27.380
And you write about it; what's the name of the study?
00:24:31.540
Oh, the Hidden Tribes study, that really actually proved this and proved who it
00:24:38.080
is we are listening to whether we realize it or not when we go on these platforms.
00:24:45.740
So the Hidden Tribes study was done by an outfit from the UK.
00:24:52.340
They interviewed, I forget how many thousands of people, in 2017.
00:24:56.780
Eight thousand Americans, 2017. This is from your article.
00:25:03.280
And they found seven clusters of people who gave similar kinds of
00:25:09.440
And it turns out that the one on the far right, they called the devoted conservatives.
00:25:14.040
That's where you'd find Trump's hardcore support.
00:25:18.940
And they're very different psychologically from the traditional
00:25:23.220
conservatives, who are more like the conservative
00:25:27.260
intellectual tradition, like Edmund Burke, Thomas Sowell.
00:25:30.260
They're cautious, they're prudent.
00:25:32.260
They believe in the importance of structures and the far right group is more radical.
00:25:37.160
So there's a kind of radicalism there; they're not conservative.
00:25:43.640
Now there's a corresponding group on the far left, uh, which is also not liberal.
00:25:48.500
Those are called the progressive activists.
00:25:51.520
And those are the ones who have taken equality, as in equality of outcome, as their central value.
00:25:58.740
And it's easier to achieve equality by tearing down the top than it is by pulling up the bottom.
00:26:03.800
So they tend to focus on tearing down the top, and this is true across eras of radical egalitarian movements.
00:26:10.880
So these two groups, in my opinion, of course there are reasons
00:26:15.660
for them having the views that they have, but the net effect on democracy, if they're dominant, is corrosive.
00:26:26.860
I'm all about viewpoint diversity, but what social media did was it took, let's say,
00:26:30.800
the Republican coalition that Ronald Reagan had built, in which you
00:26:34.580
had the business conservatives, you have the Christian conservatives.
00:26:37.140
And there were always people who were more prone, psychologically more prone, to the extremes.
00:26:41.860
They were all part of a group, but the far right group didn't have as much influence.
00:26:45.740
And all of a sudden, with social media, they have much more influence and they're able to intimidate
00:26:49.740
and basically push out a lot of the moderates.
00:26:52.780
As you say, the media is on the left. What I argued in the
00:26:58.060
Atlantic essay is that the Democratic Party still has a healthy debate between the far left and the moderates.
00:27:04.040
The party itself has not lost that ability to debate.
00:27:07.560
The problem on the left, I believe, is that the left largely controls the epistemic institutions.
00:27:13.880
That is, the institutions that generate knowledge.
00:27:16.120
So universities, uh, journalism, media, the museums and the arts, a lot of areas.
00:27:22.260
And when you get homogeneity plus social media, something happens, which is that the extremes now have much more power.
00:27:30.580
And so what we're left with is rather than just having a country where most of us are
00:27:34.340
fairly moderate, reasonable, and we have these extremes.
00:27:37.620
Now the extremes are so powerful that the rest of us just really go quiet.
00:27:41.460
And that's why, when you visit your friends in the Midwest, they're not out there tweeting.
00:27:48.200
I mean, there was an interesting David Brooks piece in the Times.
00:27:50.380
I don't know if you saw this recently talking about how he thinks there really is still
00:27:54.980
room for liberals who aren't pro-cancel-culture and pro-demonization of the other side.
00:28:01.620
And, you know, he often wonders whether there's still room, but he thinks there is.
00:28:08.520
But I mean, I know this firsthand from being somebody who's, you know, more right-
00:28:12.920
leaning, but has immersed herself for the past 20, 30 years in very left-leaning communities.
00:28:21.140
They're my friends and my neighbors, and they're lovely.
00:28:23.180
They're not hard partisans for the most part, and they don't want to mess with anybody else's
00:28:27.460
life, you know, but you go online and you see a very different version and it leads to
00:28:32.200
hate and more tribalism and intolerance on your own part, you know, and it's
00:28:37.760
something you have to actively work to fight against.
00:28:41.160
And I do think if I lived in a more red community, I'd be more subject to that narrative because
00:28:48.120
I wouldn't be surrounded all the time by people who are liberal and who are absolutely
00:28:53.080
lovely and don't subscribe to any of this nonsense.
00:28:58.780
I'm very cautious about using the word liberal, because in America we use the word liberal to mean the left.
00:29:04.680
We can talk about progressives and conservatives.
00:29:06.400
We can talk about the far left and the far right.
00:29:09.240
I reserve the word liberal to a person who believes in the liberal tradition.
00:29:12.300
That is freedom of speech, freedom of association, freedom of religion, economic freedom.
00:29:17.360
We don't want to be telling people how to speak, how to dress, how to live their lives.
00:29:23.600
And in Europe, they speak about right liberals and left liberals.
00:29:26.600
So the problem, I think, is that our left is no longer liberal.
00:29:30.640
Our right is no longer liberal or conservative.
00:29:34.280
But to your point: depending on where you are, if you're in a partisan community,
00:29:38.620
you are just deluged with evidence that the other side is horrible.
00:29:42.760
And you'll see videos of people saying horrible, horrible things.
00:29:46.320
And people on your side can list all the sins of the other side, but you tend to have no
00:29:51.120
idea that an equally strong case is being made on the other side where they are deluged
00:29:56.660
by horrible things that your co-partisans have said.
00:29:59.500
So, you know, why do you complain about the speck in your neighbor's eye?
00:30:04.960
And it continues, Jesus says, first, take the plank out of your own eye.
00:30:14.960
And that's one of the things Tristan Harris was talking about: how the social media companies shape what you see.
00:30:21.780
If you're relying on them, on your Facebook feed, on your Twitter feed for actual information
00:30:27.240
about what's happening in America, good luck, because your feed has been totally manipulated
00:30:30.680
to feed you only the things that will outrage you.
00:30:33.400
For the most part, you write about this as well.
00:30:38.520
And they are not in the business of delivering to you the truth, what is actually happening,
00:30:50.320
Let me pause it there, John, and squeeze in a quick break.
00:30:57.260
It's not news now that they're trying to upset us on social media.
00:31:05.960
I think most people have heard that at least once or twice.
00:31:09.440
But you sort of link it to the development of the ability to like something
00:31:16.180
on Facebook, retweet something on Twitter, and then share something on Facebook.
00:31:21.860
And all of this sort of builds to the place where you write the newly tweaked platforms
00:31:26.880
were almost perfectly designed to bring out our most moralistic and least reflective selves.
00:31:40.720
So if we go back to when these platforms were new, there was MySpace and Friendster and Facebook
00:31:48.940
They were just like, you know, bulletin boards.
00:31:58.440
The early technology was actually, and I hope people will remember this, like in the 90s,
00:32:05.840
And, you know, there were pockets of toxicity, but there was a real positivity around a lot of it.
00:32:13.460
So from 2003 or 2004 to about 2009, social media is not particularly toxic.
00:32:21.640
And then what happens in 2009 is Facebook innovates.
00:32:26.360
They say, you know, because they're all about maximizing engagement.
00:32:31.660
So they give you a like button, a button you can click to say, I like this.
00:32:35.220
And that way you're generating a lot of data for them about what you like.
00:32:38.640
And then they develop algorithms that can maximize the degree to which you're going to engage.
00:32:43.520
Now there's not necessarily anything nefarious about this.
00:32:46.600
What's wrong with giving you more of what you like?
00:32:48.260
But the net effect is that it's much more addictive because now you're getting reinforced
00:32:52.500
And you're strategically liking things, which means you're kind of performing.
00:33:01.200
Even more damaging, I'd say, is the retweet button that Twitter develops.
00:33:12.060
You click on something, not just to say, I like it, but you can now forward it to, let's
00:33:19.000
say, your 500 followers. And if it's really outrageous, they might forward it to each of their 500 followers.
00:33:23.220
And very quickly, you can get to millions and millions of people.
00:33:26.040
So 2009 is the year that social media changes radically.
00:33:30.220
Before 2009, you couldn't go viral very easily.
00:33:35.280
And now the game is on, who can win, who can go viral?
00:33:44.460
How should I say it to maximally increase my clicks and my likes, my retweets, my follower count?
00:33:51.260
So it's really then that it becomes much more hyper-viralized.
00:33:56.960
We all understand from COVID what happens when a virus is much more transmissible.
00:34:03.040
So that's what I argue in the piece, those small architectural changes.
00:34:07.660
That's why everything went haywire in the 2010s.
00:34:14.660
There were still much greater possibilities of bipartisanship, of people working together across the aisle.
00:34:20.420
We didn't feel like we were all attacking each other all the time.
00:34:24.900
So you say really between 2011 and 2015 was the apex of the problem.
00:34:30.700
And I think that's interesting because I think a lot of people, when they look at the collapse
00:34:33.520
of, or the decline, I don't want to say our society has collapsed, but it certainly isn't in good shape.
00:34:39.800
And I think a lot of people, especially on the left, look back and they say, it was Trump.
00:34:45.600
And I did a long documentary with PBS a couple of years ago where they actually took an open
00:34:53.240
And I made the case that there were a lot of very divisive things that happened during
00:34:56.660
the Obama presidency that set us on the road to, you know, this divisiveness we're feeling today.
00:35:08.720
There are a lot of people in the country who are not political at all, who are on their
00:35:12.500
phones, as you point out, 50 to a hundred times a day.
00:35:18.560
And you make the case that it was between 2011 and 2015.
00:35:22.240
So pre-Trump, that things really reached the height of awfulness or the pernicious effects
00:35:29.120
really sort of reached, I don't know, their apex, and then they kind of stayed there.
00:35:33.020
But explain why you picked those dates and what you mean by picking those four years.
00:35:38.520
So on campus, it was 2014 when this stuff first started.
00:35:42.140
Anyone who graduated from college in 2012 didn't see the speech-is-violence rhetoric, the cancel culture.
00:35:48.500
And in 2015, Greg and I wrote our article, The Coddling of the American Mind, which comes out.
00:35:55.000
And then at Halloween 2015, with the Christakis affair at Yale and the students protesting
00:36:01.440
So things blow up on campus in 2015, really, 2014, 2015.
00:36:05.840
And only recently have I discovered how many other things were happening around 2014, because
00:36:10.600
it's only then that we can get global cancellations.
00:36:13.300
So many listeners will remember the story of Justine Sacco, a woman who was flying to South Africa.
00:36:19.500
She tweeted a joke in somewhat poor taste, but it wasn't racist.
00:36:23.520
So she tweets this joke, gets on her plane, lands in South Africa, and there's a global
00:36:28.480
outrage around her, and she's fired the next day.
00:36:35.180
There was no way to have a global mob around this woman in 2008.
00:36:43.720
In early 2014, Brendan Eich is promoted to CEO of Mozilla.
00:36:49.060
And someone discovers that he gave $1,000 to a group in California that was opposed to same-sex marriage.
00:37:00.820
This sort of global mob, the ability to create a Twitter mob, wasn't there before the retweet button.
00:37:06.340
And so we get a lot of things happening 2014, 2015.
00:37:12.400
And so the argument that I made in the paper, in the Atlantic essay, is that with these
00:37:17.720
hyper-viralized platforms, it's as though they gave out a billion dart guns.
00:37:22.860
Everyone in the world gets a dart gun with unlimited darts, and you get to shoot anyone you want.
00:37:28.060
It's not going to kill them, but it'll shame them.
00:37:34.060
And the net effect of everyone having a dart gun is most people don't want to shoot anyone.
00:37:41.320
But the people on the extremes are psyched about being able to shoot not just their enemies
00:37:47.120
across the aisle, but the moderates on their own side.
00:37:50.960
And so we now have a communication environment, which is much more characterized by walking on eggshells.
00:38:00.980
And if you're walking on eggshells, you can't trust the people around you.
00:38:07.940
There used to be a lot of humor in the academy.
00:38:13.200
But all of that goes away when we're all afraid that if anyone takes offense, they've got a dart gun.
00:38:20.640
So that's why I argue that this isn't just about polarization left, right.
00:38:25.680
This is about we're afraid of the person next to us, or the unknown person online.
00:38:42.460
I think you retweeted a Jonathan Turley article.
00:38:44.840
But I tweeted about this just today as well, about what's happening to this Princeton
00:38:53.500
professor. There's a Princeton professor, Joshua Katz, who they're now pushing to fire.
00:38:59.140
He's a classics professor, which you're already not allowed to be because you study a bunch of dead white men.
00:39:06.840
And now the president of Princeton is calling on the university to fire this tenured professor.
00:39:19.160
He had a consensual affair with a student for which he's already been punished.
00:39:24.000
They already turfed him off campus; he was suspended for a year.
00:39:31.060
You know, I mean, and now in a court of law, double jeopardy would have attached.
00:39:38.420
They're trying to get back into it and open back up the case.
00:39:42.240
Because he wrote an article critical of the reforms some of the black faculty members
00:39:48.160
are asking for at Princeton, like more sabbatical than their white colleagues.
00:39:58.940
But I mean, it was obviously favoritism based on race.
00:40:02.920
He said he objected to faculty of color receiving special course relief and summer salary and
00:40:09.060
an extra semester of sabbatical and criticized, quote, extra perks for no reason other than race.
00:40:16.060
And he also criticized something called the Black Justice League, which was active for
00:40:21.900
two years on campus as he called it a local terrorist organization that made life miserable
00:40:26.700
for many, including many black students who didn't agree with its members demands.
00:40:30.560
And because of that letter, it seems very clear, they're now trying to do this to him.
00:40:37.000
We covered what happened to Roland Fryer, something very similar at Harvard, which is still ongoing.
00:40:49.480
So, I mean, you know, the case is complicated and I don't know the details.
00:40:53.280
But I think what is clear is that whatever behaviors he did on campus, he was
00:40:58.320
investigated for, he did have affairs with students over time.
00:41:05.100
But what was clear to me was that the investigation was restarted and pushed to the point of firing
00:41:13.420
him only because he wrote an article in Quillette that offended many on campus.
00:41:19.100
And I think we see the same dynamics here that we saw in the Dorian Abbott case.
00:41:23.420
So Dorian Abbott, a physicist at University of Chicago, he wrote an essay in Newsweek.
00:41:28.720
Criticizing race-based affirmative action, criticizing certain aspects of DEI, agreeing with certain
00:41:34.180
goals, but being critical of the main thrust of DEI programs.
00:41:42.180
That's certainly something that a professor should be able to write about.
00:41:45.460
And he was invited to give a very prestigious lecture at MIT.
00:41:50.040
And some, I don't know if it was students or administrators, were upset that we're going
00:41:53.780
to give an honor to this guy who wrote this article that offends us.
00:41:58.460
And so they put pressure on the MIT administration to uninvite him because of an article he wrote in Newsweek.
00:42:05.660
And so I think it's the same sort of thing here.
00:42:07.600
It's not that Princeton wants to do this, as far as I know.
00:42:09.680
Again, I don't know the details of a complicated case.
00:42:11.720
But I'm pretty confident that pressure is being put on the administration by students, I suppose,
00:42:19.620
Pressure is being put on, and we've seen this over and over again: when pressure is put on administrators, they tend to cave.
00:42:28.880
So that's why I think the principle here is, if you say something that offends, if you say
00:42:34.920
something that violates the sacred values of a powerful group on campus, then if they can't
00:42:40.080
get you for what you said, they do what's called grievance archaeology.
00:42:51.760
That's part of due process: there's got to be a process to adjudicate bad behavior.
00:42:57.580
If you're found guilty, you receive your punishment, and then that's it.
00:43:02.280
And so, again, I don't know the details, but it seems as though Princeton is behaving this
00:43:08.840
way because of internal pressure groups rather than, I mean, I suppose if they could go back
00:43:13.500
and investigate every single professor, they would find a lot of malfeasance.
00:43:16.980
I hope somebody goes back right now and investigates the president of the university.
00:43:22.100
Christopher Eisgruber, who is the one calling on the university board to fire Professor Katz.
00:43:31.460
I guarantee you he's had some affair with some student or he's crossed some ethical line.
00:43:55.520
If these are the rules we have to play by, then let's play by them.
00:43:58.480
Because these people like woke CEOs and woke college presidents need to be taught a lesson
00:44:03.600
that if you come for me, I will sic the anti-woke mob on you.
00:44:10.400
It's the only terms they're going to understand.
00:44:12.140
I can certainly understand the need for counterforce.
00:44:16.840
I can certainly understand turning the tables, hoist with your own petard, all of that.
00:44:20.740
But I've been studying the culture war for a long time.
00:44:25.260
And, you know, in a military war, you can apply such force that you literally kill your enemy.
00:44:34.320
In a culture war, the harder you attack your enemy, the stronger he gets.
00:44:37.740
The only thing you can do is give them more ammunition by
00:44:42.280
providing more anecdotes, more terrible things that people on your side have said.
00:44:49.680
I'm not sure that this is a strategy with Disney and DeSantis, right?
00:44:52.920
Disney's lost billions of dollars in stock value.
00:44:56.000
All the corporations have gone quiet now; they're not coming out and commenting on abortion suddenly.
00:45:00.120
And Disney's gone quiet on its woke agenda for the past few weeks.
00:45:09.280
And they got in a war that didn't really serve them well down in Florida.
00:45:15.400
You've studied this way more than I have, but I've lived it and I've been through it myself.
00:45:21.480
OK, let's talk about it, because I really am not sure I'm right here.
00:45:24.220
In fact, I think what I have here is just an insight, not an overall strategy.
00:45:30.660
And you're right that Disney was under pressure, let's say, from left-wing employees.
00:45:35.540
And so they moved to the left, and then it's only if there's counterpressure that they'll move back.
00:45:41.380
I think it was a quote from Margaret Thatcher during the Balkan War.
00:45:44.580
I can't find it, but I remember hearing it when I was young.
00:45:46.960
The problem is not whether we should use force in Bosnia.
00:45:50.560
The problem is that force is being used by one side overwhelmingly every day.
00:45:56.820
That is the question of arming the Bosnians against the Serbs.
00:45:59.280
So by analogy here, I do understand the need for pushback.
00:46:03.460
Each time we have an escalation in the culture war, a new strategy is brought in, which is
00:46:09.900
And so what we're now seeing, which really alarms me, is things like the Texas abortion law.
00:46:13.960
And I think the Florida, you know, the law we're talking about here, I believe they have
00:46:18.860
provisions by which anyone can bring a lawsuit against a citizen, a teacher, a doctor.
00:46:27.320
This means that now people doing their job have to worry that they could suddenly be sued
00:46:33.060
by a bunch of activists on the other side, even if they've done nothing wrong.
00:46:37.200
That's certainly true in the case of the Texas law.
00:46:48.180
This is an interesting strategy question: what serves the community and what doesn't.
00:46:52.780
And then we've got to talk about Trump, because I think your observations on how
00:46:55.880
he was the first one to understand the Tower of Babel had fallen and how to work that to his advantage.
00:47:05.280
And don't forget, folks, you can find The Megyn Kelly Show live on Sirius XM Triumph Channel
00:47:08.940
111 every weekday at noon east, the full video show and clips by subscribing to
00:47:12.440
our YouTube channel, YouTube dot com slash Megyn Kelly.
00:47:15.220
So, yeah, this has been a debate on the right in particular for a long time now,
00:47:22.100
since cancel culture really took over: how to fight back, you know, what to do.
00:47:25.860
And I think for a long time that the effort was to say, stop it, stop it.
00:47:31.560
That's what the Harper's letter was about, right, saying just just stop it.
00:47:34.800
That was mostly liberals saying, yeah, we hate Trump.
00:47:41.540
But this is about being truly liberal in the genuine sense of that word and allowing
00:47:46.580
free speech, a principle you've been fighting for your entire academic career.
00:47:51.780
So I think that has led a lot of folks in this battle, left or right, to say, all right, enough.
00:48:00.120
And I've come around to the belief of cancel them, cancel them all, cancel Chrissy Teigen,
00:48:07.860
Now, I realize, depending on the institution that they're at, that may not be realistic.
00:48:12.380
Joy Reid should have been canceled a long, long time ago, given her views.
00:48:17.160
But at least you can get some skin in the game.
00:48:19.620
I guarantee you this Princeton president has got something in his past he doesn't want exposed.
00:48:24.500
And the only thing preventing it from coming out is journalists and people who are anti
00:48:30.660
cancel culture from taking out their own microscopes and magnifying glasses and taking a hard look.
00:48:37.060
So the more you punish people for being this cruel, the less likely they are to dip a toe in those waters.
00:48:46.520
OK, so during the break, during that nine-minute break, I thought about what it is I wanted to say.
00:48:54.400
But I was saying, wait, you know, this is just going to escalate things.
00:49:00.140
So I've been studying polarization since about 2004.
00:49:03.860
And the path we're on is towards catastrophic failure as a country.
00:49:07.740
If we don't change what we're doing, the American experiment set up by our founding fathers who
00:49:13.500
understood our liability to faction, factionalism, fighting, it wasn't clear that this experiment would work.
00:49:19.960
Now, it's worked very well on and off, extremely well.
00:49:26.500
If we keep going this way, we are headed towards catastrophic failure.
00:49:29.480
And so think about this like a fight taking place on a ship, and the ship is sinking.
00:49:39.060
The crew is divided into the red team and the blue team.
00:49:44.260
And what you're saying is, look, the blue team has been really unfair.
00:49:47.600
It's time that the red team fought back using blue team's methods.
00:49:55.420
So we just keep doing this until in five or 10 years, the ship sinks.
00:50:00.920
And that's why the last quarter of my article was not about how one side can win, because neither side can win.
00:50:06.440
The last quarter was, what are the structural changes?
00:50:09.620
What are the changes in Congress, changes to how we do elections, changes to social media?
00:50:15.180
What are the changes we could make that would return us just to the levels of hatred we had back then?
00:50:20.760
If we could go back to those levels of hatred, then I think we can make it as a country.
00:50:24.380
But on our current path, we're in huge trouble.
00:50:28.880
So what I'm saying is, we need to think about structural reforms so that we can actually
00:50:33.320
sometimes, in some places, talk to each other and even work together sometimes.
00:50:38.780
But I see it more like, and I understand your point that it's holes in the boat either way.
00:50:45.220
I see it more like there's generals up on the left and there's generals up on the right
00:50:49.000
at the front of the boat, admirals, I guess, sailing it.
00:50:52.200
And the left keeps taking out only one side's admirals.
00:50:57.000
They just keep continuing to take down admiral after admiral after admiral who they think deserve it.
00:51:02.920
And the red team keeps looking at them saying, please stop doing that.
00:51:07.480
We're not going to make it if you get rid of all this brain trust up at the front.
00:51:14.180
And finally, the right says, we're going to hurt some of your generals or your admirals.
00:51:18.660
And finally, the hope is that the offending side will realize, well, this is stupid.
00:51:23.740
We can't survive without our leaders and stop the firing.
00:51:30.760
Reaching out across the aisle and saying, please don't.
00:51:33.780
I mean, look, what difference did the Harper's letter make?
00:51:37.540
Everybody on that was like, and I love Thomas Chatterton Williams, and I thought
00:51:46.060
OK, but there's no point at which the other side, whichever side it is, is going to stop on its own.
00:52:03.340
Like, no, no, I just mean, think about it.
00:52:06.560
This president of Princeton, if he really thought that I was going to devote the resources
00:52:10.000
of my entire team over the next month to digging up dirt on him, which I 100 percent could and
00:52:14.440
maybe will do, I think he'd be a little scared.
00:52:18.780
And I think if I found something, if I found some young woman he slept with, this is made
00:52:25.200
He'd be scared shitless that I was going to turn around.
00:52:28.180
And then maybe the next university president would hesitate a little before they decided
00:52:32.640
to use a human failing for which someone has already been held accountable against him
00:52:37.900
to punish him for his divergent viewpoint on a separate matter.
00:52:43.240
You're talking about a kind of counterforce which would have an effect.
00:52:48.360
In fact, there's an organization co-founded originally by some Princeton professors called the Academic Freedom Alliance.
00:52:57.820
In fact, Joshua Katz and Robbie George are founding members.
00:53:08.160
What the Academic Freedom Alliance does is it sends a letter to universities saying, if
00:53:14.400
And FIRE, the Foundation for Individual Rights in Education, also does similar work.
00:53:18.740
So that kind of pressure, I think, is a positive way of doing something about this.
00:53:27.000
I think the politics of personal destruction, I do understand: if they do it, then why can't we?
00:53:34.220
But I think that way lies just continual escalation and the death of our country.
00:53:38.960
So I'm doing everything I can to think of how do we change the venue?
00:53:43.600
How do we change things so that people don't do this on either side?
00:53:46.660
That's the challenge, I think, that we have as a country for the next 10 years.
00:53:54.040
You actually have, very condensed, a three-point plan that might at least help.
00:53:59.920
I guess I am feeling right now, just given being in the news and covering this so much,
00:54:04.100
it's just the constant indignity of what they do to people.
00:54:07.020
Like one of those Braveheart warriors, you know, with the face paint and like no underpants
00:54:18.160
Look, when you're repeatedly attacked and viciously attacked, yeah, you're not gonna, you know,
00:54:23.540
the strategy is not, oh, you know, let's make peace.
00:54:27.780
I'm trying to break us out of the binary. Look at it this way.
00:54:39.860
Now, if you change the venue, you put in cameras.
00:54:45.940
And in the same way, people used to talk to each other.
00:54:52.320
But with social media, what it's done is it said, here, why don't you guys fight it out
00:54:58.000
And it's almost as if they said, hey, we're a venue, we're a platform for people to talk
00:55:04.320
to each other in the middle of the Roman Colosseum.
00:55:06.920
All conversations are going to take place with an audience that wants blood.
00:55:13.180
And so when Facebook developed threaded comments, this was 2013, they said, it's not enough that,
00:55:20.120
you know, President Obama posts something and people can yell and scream at him in the comments.
00:55:26.240
We want people to yell and scream at each other in the comments.
00:55:30.200
And so anybody types anything, you can now respond to them and people can respond to you.
00:55:38.400
Well, I understand Facebook wanted to increase engagement and it worked.
00:55:42.660
What I'm saying is as long as our entire environment pushes us to fight with each other and be mad
00:55:47.940
at each other and be drowning in outrage stories, there is no way out of this.
00:55:52.260
We have to find a way to break this dynamic to get out of the Roman Colosseum.
00:55:59.600
Do you stand by that understanding, though? Because you're in academia.
00:56:03.700
So your world has been completely saturated with this, I mean, it's up to the gills.
00:56:11.040
And you make the great case that the loudest members
00:56:16.320
of cancel culture have a disproportionate voice.
00:56:19.720
But I think the vast majority, as you also say, of the country is not with them.
00:56:23.720
They don't really put political differences aside.
00:56:26.400
I think the people who are pro cancel culture are a small minority with a very big voice.
00:56:34.280
Why can't the society survive if we just destroy them?
00:56:38.140
Because the rest of us, the vast majority of us aren't for that stuff.
00:56:44.440
And then once we destroy them, we can go back.
00:56:47.540
We can argue about abortion and the Florida law and all.
00:56:50.580
We can do all the old fashioned arguing we used to do over politics, but this is where
00:56:55.860
metaphors can either illuminate or lead us astray.
00:56:59.180
And as the linguist George Lakoff said long ago in a brilliant book called Metaphors We
00:57:03.420
Live By, we think about argument using the metaphor of war.
00:57:09.220
Now, if we literally mean, why can't we destroy them?
00:57:11.860
What you mean is either kill them or lock them up or cut out their tongues or sometimes
00:57:18.040
I mean, hurt them so they will stop doing this.
00:57:21.800
If you hurt people, are they going to stop doing it?
00:57:24.540
There's no, again, there's no way to win a culture war.
00:57:28.620
Well, yes, you can have Pyrrhic victories.
00:57:34.060
And this is something that the left has a lot of.
00:57:36.360
A Pyrrhic victory comes from the story of Pyrrhus in ancient Greece, a general who, I believe,
00:57:40.800
wins the battle but loses so many men that he ultimately loses the war.
00:57:45.700
And so I think what we're seeing is, you know, my argument is that while the
00:57:52.900
Republican Party has in many ways gone off the deep end more than the Democratic Party,
00:57:57.200
the cultural left has gone off the deep end much more than the cultural right.
00:58:01.900
And that's what you're talking about: you've got all these institutions where the left is dominant.
00:58:06.680
And what I'm arguing is that you can't beat something with nothing.
00:58:16.220
What we need, I think, is a much clearer notion.
00:58:18.920
We've got to all start talking about professional responsibility, a sense of duty, a sense of what
00:58:26.520
And if you're at a university, as a faculty member, your job is to do research and find the truth.
00:58:33.780
And as a teacher, it's to educate and bring up students.
00:58:37.300
If you're a journalist, again, it's to find the truth, but using very different methods.
00:58:41.020
So each institution has a telos, the Greek word for end or purpose.
00:58:45.920
I think we're not going to end the culture war just by silencing our opponents.
00:58:52.480
We have to develop, the middle 80% of us, a notion of basically: do your job.
00:58:58.960
We live in a world with people who have different views.
00:59:01.180
Yeah, well, I had a comedian on the show not long ago.
00:59:03.520
I think it was Ryan Long who said, if everybody could just do their job instead of feeling the
00:59:09.540
need to cross lanes and judge and comment on everybody else's job, we'd be a lot better off.
00:59:14.700
But let me turn the camera back on you because you've been fighting for these principles
00:59:23.280
I mean, I was so impressed at the number of organizations that you're a part of that I
00:59:26.720
love, you know, Heterodox Academy and FIRE and all, we'll get to the Let Grow project
00:59:32.600
about children and so on, but it's not working.
00:59:38.320
And so what makes you think there's hope for it?
00:59:41.120
So you're right that it has not been working so far.
00:59:45.740
It is true that the concerns that Greg Lukianoff and I had in 2014, 2015 have spread far beyond the universities.
00:59:53.580
The universities have bought into a certain mindset that has brought them away from their telos.
01:00:03.240
I'd have to say the trends have been against us.
01:00:09.760
When I wrote the Atlantic article that came out six weeks ago today, actually, I was expecting
01:00:16.040
to get attacked from the left and the right and nobody attacked me at all.
01:00:19.340
In fact, hundreds of people wrote me, just regular people just wrote me thank you notes
01:00:27.760
I think what we're seeing is we had sort of mounting insanity throughout the 2010s.
01:00:33.140
The pendulum kept swinging and swinging and swinging, and there was no sign it was going
01:00:37.200
And then, of course, after George Floyd and that year of COVID, things went even further
01:00:41.080
and a lot of schools implemented Ibram X. Kendi-style programs.
01:00:47.880
And I think what we're seeing now is that most people are recognizing this is crazy.
01:00:51.820
This is just completely crazy what's happening to us.
01:00:53.740
So I am perceiving a shift. Look at, for example, the New York
01:00:58.440
Times, which dared to publish an editorial praising free speech.
01:01:05.080
And people attacked it: you know, isn't that just speech for racists?
01:01:08.480
And we're seeing this more and more: the leaders of organizations who are generally
01:01:12.820
true liberals, that is, they're on the left and they believe in free speech and freedom
01:01:17.920
of inquiry. They've been intimidated and pushed around.
01:01:22.180
But I think we're beginning to see more of them stand up.
01:01:24.620
We're seeing corporations like Netflix announcing, you know what, if you can't work
01:01:28.360
on a project that doesn't share your values, maybe you shouldn't work here.
01:01:31.860
I think we're going to see, in the next few months, a lot of companies pull back from this.
01:01:39.020
And you know what, frankly, it's just like what Sirius XM, for example, has already
01:01:43.540
done. You know, Sirius has got lefties on its lineup.
01:01:49.160
That's the principle of the organization, you know: let more conversations happen.
01:01:55.800
There's a huge marketplace for ideas here and you can go to the ones that you agree with.
01:01:59.740
You can go to the ones you disagree with, but that's the American way.
01:02:06.060
And I think it took the Dave Chappelle crisis to remind them of their core mission and of
01:02:12.860
They don't want Netflix to be just this woke corporation
01:02:15.780
that's shoving social messages down our throats that all align with one worldview.
01:02:25.960
The American people ultimately have a good sense.
01:02:28.080
And while it seems as though everything has been moving in one direction since around 2014,
01:02:32.600
I do think that now, and here it's incumbent on people to stand up for principles,
01:02:38.040
to stand up for professional responsibilities, but to do it in a way that doesn't just trigger the other side.
01:02:43.120
This is my fear, that the dynamics of polarization and of the culture war are: hey, I'm so mad at you.
01:02:50.840
Breaking out of the cycle, carrying ourselves with more dignity, more civility, still standing up for our principles.
01:02:57.440
I think, in the long run, this is the way to go.
01:03:05.280
I've got the Braveheart face paint on when it comes to those cancel culture warriors and
01:03:09.160
woke university presidents who are casting judgment on everybody.
01:03:11.760
But I'm open-minded, as always, to the possibility that I may be the one who's wrong.
01:03:17.960
Cause I want to get into solutions and what you actually think might help.
01:03:20.600
And there's an interesting law on the books.
01:03:22.620
Well, not on the books, but being proposed in California, that might help with the kids.
01:03:29.100
Um, it's the first time I've ever seen a law in California that I think I might get behind.
01:03:32.660
Um, but I do think Trump is interesting because there's a line, um, from your article that
01:03:37.360
says you talking about how you date sort of the, the crisis, uh, peaking to the years
01:03:42.820
between 2011 and 2015, years marked by the Great Awokening on the left and the ascendancy
01:03:47.960
Then you write, Trump did not destroy the tower, meaning the tower of Babel, as we've discussed.
01:03:54.320
Then you add, he was the first politician to master the new dynamics of the post-Babel era
01:04:10.540
And then you say, and in which Twitter can overpower all the newspapers in the country
01:04:15.420
That was like, as a politician, forget his policies as a politician.
01:04:19.820
That was the thing he got before anybody else got it.
01:04:26.380
Um, because in the, in the mass media age where there, there was some sort of professionalism
01:04:31.920
in politics and journalism, you can question how good it was, but you know, if someone said
01:04:35.880
something atrocious, that could be the end of their campaign.
01:04:38.100
There were certain principles and rules and processes that we understood from what you might
01:04:42.140
call the pre-Babel era when it was possible for a narrative to emerge about Jimmy Carter
01:04:47.600
or Paul Tsongas or whatever candidate, you know, um, um, there was a shared narrative that
01:04:53.440
could emerge, but Trump wasn't paying attention to any of that.
01:05:01.380
And I think if he had run four or eight years previously, I don't think he could possibly have won.
01:05:07.040
Um, he just happened to come in at this time when everything was shredded.
01:05:13.900
Um, there's wide distrust in institutions and we have such high, it's called negative partisanship.
01:05:19.780
That is, Americans since the early 2000s, we don't vote for the
01:05:32.240
Now, the fact is that some similar things happen in Canada and the UK; certainly
01:05:36.100
the universities are identical in Canada and UK, the teen mental health crisis is identical
01:05:42.000
Uh, now I do think that Trump made our politics much more coarse.
01:05:44.860
I think he greatly amplified, uh, uh, polarization.
01:05:47.840
I think he certainly drove people on the left insane, making them say and do things that
01:05:51.640
then, uh, you know, they attack people on the right with extra passion.
01:05:54.920
And that's how our culture war got so much more heated.
01:05:57.320
We're much more polarized than any other Western democracy.
01:05:59.540
And that's partly why we're in such trouble now.
01:06:03.440
Uh, I don't, I mean, I can't disagree with any of that.
01:06:05.380
I think he, um, he saw the seam in the story and got himself in there and then totally
01:06:12.400
You know, he called it the Fifth Avenue rule, that he could shoot somebody on Fifth Avenue and
01:06:16.100
wouldn't lose any supporters, but you saw it happen time and time again with his, his, the
01:06:19.640
crazy things that emerged about him or from him during the campaign that didn't hurt him.
01:06:25.960
It, it concerns me that all the future politicians are going to think they have to do all the same things.
01:06:33.460
It telegraphed to the base that he didn't care what the old party thought or did, but if
01:06:39.540
it continues, it scares me about what we're going to get on an ongoing basis in
01:06:44.160
the White House, cause at least Trump did wind up having some good policies, at least
01:06:48.820
Um, I don't know whether the next guy will, or whether he'll just be skilled at dividing
01:06:52.600
us, fighting back, flipping the middle finger and, you know, getting himself in the office.
01:07:02.520
Um, that, and that's, what's appealing about somebody like a Glenn Youngkin, right?
01:07:05.320
He seems, like, his little fleece sweater vest seems unthreatening, though.
01:07:12.900
So let's go back to, um, social media because one thing we didn't talk about was Instagram
01:07:18.500
and how in particular, this is a pernicious force.
01:07:22.480
I mean, we, Twitter, we know Facebook, we know Instagram has been outed by many, including
01:07:26.920
the, the whistleblower and, and you've written a long piece on this too, about how, just how
01:07:33.040
Cause it's not just, wow, they're really divisive.
01:07:36.040
They're really undermining institutions and the faith of Americans in each other.
01:07:42.320
And in their country. They're actually seriously causing mental health problems
01:07:48.780
That actually may be fatal in a lot of cases with young girls in particular.
01:07:54.840
You know, I like to believe I'm raising healthy children, that with Instagram there's only so much
01:07:58.820
damage it could do, but, um, I'm sure that's what most families believe.
01:08:06.900
Um, so first I'd encourage listeners and viewers to go to, uh, I just created a page where I
01:08:13.880
So if you go to jonathanhaidt.com slash social media, all one word, um, I've put there my
01:08:20.980
I gave testimony in front of a Senate committee two weeks ago where I created a document
01:08:27.540
What's the evidence that social media is a cause of this problem?
01:08:31.020
And so what the evidence shows clearly is that rates of anxiety, depression, self-harm
01:08:36.080
and suicide were relatively flat, um, in the early two thousands.
01:08:40.460
And then around 2010 to 2012, there are very sharp upturns in all of those graphs, especially for girls.
01:08:47.140
Um, uh, suicide is certainly for both, but self-harm is primarily for girls.
01:08:53.280
Um, and, and so that certainly points to social media as the cause, but the question is correlation versus causation.
01:09:00.140
There've been a lot of previous moral panics over television and video games that turned out to be unfounded.
01:09:06.760
Um, so I've been focusing on gathering all the academic research together to get a sense
01:09:12.300
And it turns out the evidence is a lot of correlational studies showing that kids who use it more, especially
01:09:16.320
heavy users are two to three times more likely to develop depression or anxiety
01:09:24.080
When you randomly assign people to either use social media more or less,
01:09:28.320
you generally see either, uh, you know, a downturn or an upturn in their mental health.
01:09:35.040
Ask any group of girls, and there are studies that have done this:
01:09:37.380
Why do you think that, uh, depression is rising?
01:09:41.580
So if you have all these sources of evidence, um, I think it is pretty clear that social media
01:09:47.180
and particularly Instagram is bad for girls' health.
01:09:49.620
The thing to really keep in mind is it's not just being on a screen.
01:09:53.260
It's especially, I believe I can't prove this part, but I think the most active ingredient
01:09:56.640
is when a girl puts a photo of herself up and waits for strangers or even friends to judge her.
01:10:05.000
There's new evidence that, that when girls do this during puberty, when you're going through
01:10:09.520
puberty, 11 to 13, that's when there's maximum damage.
01:10:12.460
So what I'm proposing in my article is: the age of internet adulthood was set to 13
01:10:21.980
Um, it needs to be 16 and it needs to be enforced at 16 or 18, but we can't have kids, especially
01:10:26.240
girls going through puberty, self-conscious, so uncomfortable in their bodies, putting photos of themselves up for judgment.
01:10:33.700
And then what if someone else gets more validation?
01:10:35.480
What if someone, your friend is more beautiful than you because of filters or whatever?
01:10:39.360
So, uh, I think the evidence that these visual media, especially Instagram, are harmful for girls is strong.
01:10:49.880
Like I mentioned, California has got a law right now, uh, that would crack down on
01:10:54.300
It's being proposed that would crack down on the social media companies with respect to
01:10:58.420
children and would make it tougher for them to do to them what they do to us in terms of
01:11:02.920
the addictive nature, uh, tracking them everywhere.
01:11:06.660
Um, things like autoplay where the next video just comes up and, you know, makes you want to
01:11:11.940
Uh, notifications past a certain time of night.
01:11:14.740
I mean, those all make sense, but how would it work?
01:11:17.600
So when a 13-year-old gets an iPad, you as the parent would have to program in
01:11:22.420
that this device belongs to a 13-year-old.
01:11:26.220
Hello, not a grownup and just that information, or you'd have to, as a parent, like type in
01:11:32.980
Cause we already have some restrictions we can put on.
01:11:34.940
Yeah, no, it has to be that the default is that kids can't get on until 16.
01:11:39.380
And so the way it needs, there are a lot of schemes to do this.
01:11:42.060
So, um, so for example, you know, what if you could, if anybody could go to Twitter or
01:11:47.960
Facebook or one of these, any of these sites, and it, suppose you could open an account,
01:11:52.320
um, but you have to get verified, to show
01:11:59.480
that you're old enough to be using the platform, especially if you want to post, that's the most
01:12:03.160
And 10 years ago, it was like, well, how are we going to know?
01:12:06.220
Like, how can you possibly know that the kid is, is, you know, is 16?
01:12:09.180
Like, are they going to have to show the driver's license?
01:12:10.960
But now there's all kinds of companies that figured out how to do this.
01:12:14.300
So the banking industry, the gambling industry, there's all kinds of companies that have figured out:
01:12:20.160
How do we verify age in ways that don't involve taking your driver's license and giving it to
01:12:28.700
Um, and what I'm arguing is that the only reason I, I don't know if your kids are on
01:12:33.820
yet, your 12-year-old, but both of my kids, when they entered sixth grade, they said, Daddy,
01:12:39.140
Everyone, or at least my son did, uh, you know, everyone has an Instagram account.
01:12:43.840
Um, and the only reason everyone has one is because everyone said to their parents, mom, everyone has one.
01:12:50.980
And that's the central idea of the documentary, The Social Dilemma.
01:12:57.020
Um, and so as long as we can keep most kids off until 16, even if a few are able to sneak
01:13:01.460
on, that doesn't matter because that won't put pressure on everyone to be on.
01:13:06.980
Yeah, no, I, I took the road less traveled on this one and I said, no, I mean, my kids
01:13:13.620
are still young, but my 12 year old now he has a phone and just a phone and there's
01:13:18.880
no, there's no social media for him and there won't be for any of my children.
01:13:21.820
And you know, I hadn't even considered 16 as an opener.
01:13:25.840
I was thinking, enjoy college and good luck, but certainly no time before then, because
01:13:34.740
Now, if we get to the point where every single kid in the class is on some social media app,
01:13:40.560
I mean, I had one guest come on and say the thing that you really want
01:13:43.900
to avoid is do not let Snapchat or Facebook or Twitter or one of these become the main
01:13:50.160
place where they text because these apps are not dumb.
01:13:54.280
So Snapchat has the ability now to create group chats and group texts so that they go
01:13:58.600
through the app to do all their party planning and so on.
01:14:01.700
And that's the absolute worst thing you could do.
01:14:04.820
But right now, John, I'm kind of in a good place because I'm a public figure and I basically
01:14:10.480
Because you won't be able to if you go on those websites and post something stupid.
01:14:15.900
And, you know, that's, they kind of accepted that.
01:14:19.860
So I can offer some advice to all the parents out there.
01:14:26.980
It's an organization that I co-founded with Lenore Skenazy, a wonderful woman who wrote this
01:14:31.480
brilliant book, Free Range Kids, about how to give your kids a childhood where they'll
01:14:36.220
They'll learn how to take care of themselves and how to have conflict and cooperation.
01:14:40.480
So at letgrow.org, we've got lots of ideas, lots of suggestions.
01:14:44.320
What I can add as a social psychologist is we can each put controls on our own kid, but
01:14:49.320
our kids, when they're teenagers, what matters most to them, of course, is their friends,
01:14:57.860
And so if your kid is the only one who's not on, that will be painful.
01:15:02.520
Now in the long run, maybe that's good, but it will certainly be painful along the way.
01:15:05.340
Far better is if you can really make an effort to find some other parents, find some other
01:15:10.460
parents that share your idea, especially parents close enough where your kid can walk back and forth.
01:15:20.160
There's all kinds of ideas on my website, jonathanhaidt.com slash social media.
01:15:24.780
This is a social dilemma and we have to work together to break it.
01:15:28.080
It's hard for us as individual parents to keep our kids away from these platforms.
01:15:35.680
It's so hard, you know, I mean, I can definitely see a situation where somebody says, you know,
01:15:40.680
they're all, they're all drinking or they're all smoking pot, you know, or they're all vaping.
01:15:46.760
And, you know, if your kid's the only one, he's not going to get invited in which I'd be
01:15:52.680
So social media is different though, because it's basic communication.
01:15:59.240
I mean, it really can be exclusionary if they're all on one app and your kid isn't.
01:16:05.300
But I also plan on, on being like a little Inspector Clouseau if they ever, I mean, I
01:16:09.900
am going to spy on everything and get ahead of problems because I do think, I don't really
01:16:16.100
believe in trust when it comes to your teenage kids and social media.
01:16:23.460
But this is the difficulty: at what point do they learn to moderate
01:16:27.620
themselves? Now, these platforms are so powerful.
01:16:30.780
The law of reinforcement is so powerful. And yet, if you don't do any
01:16:39.640
Whenever you open the account, you just lie about your age.
01:16:46.360
You have to monitor it, especially, especially early on, especially when they're going through
01:16:49.500
puberty, you know, 11, 12, 13, 14, kids must not be on social media, especially
01:16:54.920
Um, but at a certain point before they go to college, you have to give them more autonomy.
01:16:59.880
Um, my son, uh, knew that he couldn't have an Instagram account in middle school, but
01:17:06.340
Uh, when he joined the track team in 10th grade, uh, now he's, he's with a group of friends.
01:17:11.040
He just went ahead and opened an Instagram account by himself.
01:17:13.100
Didn't ask me for permission, but that was appropriate in my family.
01:17:20.220
Um, I, you know, I do trust him and, uh, you know, maybe he'll betray that trust, but
01:17:24.520
you have to, of course, you have to go with what your kid is like and what the situation
01:17:28.880
But, you know, we have to, the job of a parent is to work him or herself out of a job.
01:17:33.680
Um, that's something Greg and I say in our book, and, uh, it's hard with social media.
01:17:41.700
Mine are still relatively young and it is painful to think about, but, uh, I know you
01:17:47.020
I want to talk a little bit about that, about the, um, sort of the, the approach that you,
01:17:52.320
that you were speaking of in the, in the Let Grow project, because I'm, I'm a big believer
01:17:55.840
in it and the total lack of autonomy for children today is a massive problem and it's feeding
01:18:00.420
Uh, we'll pick it up there with Jonathan Haidt after this quick break.
01:18:09.840
John, the, the three proposals that you have among, among others, but the three to sort
01:18:14.140
of address some of these issues, the, the catastrophic fall of the tower of Babel include
01:18:20.620
harden our democratic institutions, reform social media and prepare the next generation.
01:18:31.060
Um, what do you mean by harden our democratic institutions?
01:18:33.740
Um, so, um, the, the key to a healthy democracy is having good institutions.
01:18:41.220
And if you go around the world, those places settled by Great Britain tend to have more
01:18:44.820
stable democracy than those places settled by Spain, uh, for example.
01:18:48.680
Um, and so, especially now that we're going through rapidly rising political polarization
01:18:54.500
and cross-party hatred, and we're seeing some beginnings of political violence, which is
01:18:58.760
Um, we have to make sure that our democratic institutions are trusted and trustworthy and
01:19:04.560
that they can function even if things get a lot worse in terms of cross-partisan hatred.
01:19:08.960
And so, for example, I'm so disheartened by what has happened with the Supreme Court in
01:19:13.940
terms of, as I see it, I'm a, I'm a nonpartisan centrist as I see it, what Mitch McConnell did
01:19:19.800
in, in denying Obama a Supreme Court nomination, I think was, uh, it was a hardball baseball move
01:19:25.920
that I think damaged legitimacy and the, and the respect of the institution.
01:19:29.060
That was very bad for the Supreme Court and for the country.
01:19:31.020
Um, and now where we are is, uh, you know, many people on the left are not going to trust
01:19:36.420
They don't think the current makeup is what it should be.
01:19:38.700
Anyway, whatever you think about it, my point is just that it should not be as much of a
01:19:45.100
The fact that we're picking judges based on, you know, how old they are.
01:19:54.540
Every president gets an appointment every two years, things like that.
01:19:58.060
If we do that, that just regularizes the process.
01:20:01.280
Now the process, of course, there's always politics in the process, but it's not, we're
01:20:05.640
not fighting to the death because so much is at stake over each, over each appointment.
01:20:09.440
So it's, you know, and gerrymandering of electoral districts, there's just a lot of things we
01:20:13.620
can do so that, you know, you want the Yankees and the Red Sox to have a good baseball game.
01:20:17.460
You don't want, say, the Yankees to get to control all of the rules, or the Red Sox either.
01:20:22.420
We've got to fix the game and then we can have the two teams play, play ball.
01:20:26.540
Yeah, it's hard to find nonpartisans to oversee something like elections.
01:20:31.500
I know that's one of your things, like make sure we have somebody we have in a position
01:20:39.660
That's always a partisan person and they tend to, you know, push it for whatever side they're on.
01:20:45.980
But I don't know, like a few people, the people who are truly nonpartisan don't get
01:20:52.320
True, but it doesn't have to be, it doesn't have to be that every person is nonpartisan.
01:20:55.820
Suppose you had a commission to draw electoral districts in your state and the rule is you
01:21:01.720
go for generally compact, you know, you can't have long stringy districts, generally compact.
01:21:07.140
Now you have some people on the left, some people on the right, but they're not far left or far right.
01:21:11.120
And of course they're partisans, but they also, they live in the same town.
01:21:15.320
Maybe they're, you know, they have a lot in common.
01:21:18.400
They can work it out just as the jury works things out.
01:21:26.860
I think that as much as I agreed as a legal matter with Citizens United, I do think it
01:21:31.080
opened up such a floodgate of corporate cash into campaigns that it made individual politicians
01:21:36.820
beholden to like one donor instead of feeling any need to work across the aisle.
01:21:41.880
And so I don't know exactly the reform that's going to solve that.
01:21:45.220
But just because it's constitutional doesn't mean it's good.
01:21:49.440
And it's something we might take a hard look at.
01:21:54.180
Think about it like if you love America, if you think that America is and has been and
01:21:58.620
should be a beacon to the world about self-governance, that we can govern ourselves.
01:22:02.920
We don't, you know, authoritarians, they can do certain big things well, but in the long
01:22:09.320
If you want, if you want the American experiment to succeed, you've got to think about the rules
01:22:15.480
So, you know, we want people running for office to be responsive to their constituents.
01:22:23.080
The more they're incentivized to pay attention just to a few rich donors or just to their
01:22:27.220
partisan extremes, the worse our system of government, and the more China ultimately wins.
01:22:33.360
But I mean, as I say that, and I, you know, I said how I feel, but I hear the left saying,
01:22:38.380
you know, hate speech isn't free speech all the time.
01:22:40.720
You know, they said that literally our pal Michael Knowles just caused a controversy
01:22:49.420
He's not, he doesn't really believe in affirming gender pronouns and all that.
01:22:53.940
And they said, oh, no, we believe in free speech, but we just, but hate speech is not free speech.
01:23:00.640
And it's the reason the First Amendment was created and all that.
01:23:03.520
So they want to burn down the First Amendment in the Constitution.
01:23:06.860
You know, I just can see the consequence of why free speech laws are often ones that
01:23:11.060
may not be absolutely perfect for the, for the union.
01:23:15.180
So let's move on to the second bucket of reforms, which is reform social media to make it less toxic.
01:23:19.820
Now, whenever you say reform social media or regulate, people think what we're talking
01:23:24.720
about is the government's going to decide who gets to speak.
01:23:27.600
The government's going to decide what content is legal.
01:23:34.000
And that's what almost everyone talks about, but what got us into this mess, isn't that
01:23:39.240
some people can post crazy conspiracy theories.
01:23:41.640
They could always do that back before the internet.
01:23:44.660
What got us into this mess is that, is that now since 2009, the more outrageous something
01:23:51.580
It's a change in the dynamics of the platforms.
01:23:54.160
That's what has really, that's what knocked over the tower of Babel.
01:23:57.940
So in this bucket of reforms to social media, um, it's things like, uh, reforming
01:24:06.360
the dynamics, doing things so that things don't go viral so quickly.
01:24:09.760
So one of the most important things we could do is actually verify, um, verify identity.
01:24:15.280
It doesn't mean you have to post with your real name.
01:24:16.960
You can still post anonymously, but if you want to post content, banks have know-your-customer laws.
01:24:24.480
You can't just open an account with any bank and give a fake name.
01:24:27.860
And I think it should be the same on, on at least the large platforms, the ones that really matter.
01:24:37.940
But if you want to reap the advantages of these viral dynamics on a platform that has a special
01:24:43.420
protection from Section 230, the platform has a minimum obligation to verify that you're
01:24:48.060
a human being and not a Russian agent or a Russian bot, and that you're old enough to be on the platform.
01:24:55.160
So whether we're just authenticating that you're a human, or that
01:24:58.600
you're a human and old enough, a few things like this would knock out
01:25:01.380
almost all the bots and it would reduce some of the really nasty behavior.
01:25:05.040
Now, who wants to be in a place where you say something and you're just attacked by,
01:25:09.240
you know, thousands of accounts.
01:25:13.740
If this, if they're going to be important to our democracy, we need to make them places
01:25:20.920
And there's a second layer of what do we do to protect our children?
01:25:23.180
We talked about that, uh, on, in terms of online.
01:25:25.900
Um, then the third is very interesting and I, I love it, and it's a cause near and dear to my heart.
01:25:34.920
Um, your, your position is, and I share it entirely.
01:25:46.040
This is a wonderful notion from Nassim Taleb, uh, you know, glass is fragile.
01:25:52.020
Plastic is resilient, but there are certain things where if you drop them, they get stronger
01:25:58.820
Obviously I'm not saying physically drop your kids, but the point is if you protect your
01:26:02.460
kids, like if you protect their immune systems, they don't encounter bacteria, you're not helping
01:26:06.900
You're actually crippling the development of their immune system.
01:26:09.720
And if you protect your kids so that nobody ever teases them, nobody ever insults them.
01:26:16.480
So, um, we have to prepare them for a world in which a lot of people don't share their views.
01:26:24.820
Now we got concerned about bullying, and of course bullying is a real thing, especially
01:26:29.860
when it goes on for multiple days, it ruins a kid's life.
01:26:32.420
So we have, it's a fine line between preventing bullying and preventing conflict.
01:26:38.660
We have to, kids have to have a lot of unsupervised experience.
01:26:41.040
Um, and you know, what we did around 2010 or so is, they're supposed to have a lot of
01:26:45.780
experience, but we put them on experience blockers.
01:26:49.300
You know, once your kid
01:26:54.060
is on an experience blocker, they're not going to have the normal sorts of conflicts.
01:26:57.580
They're going to have, um, everything's going to be mediated through a screen.
01:27:01.420
So, um, so we have to attend to child development.
01:27:03.920
We have to give them a lot more unsupervised experience.
01:27:14.020
We have our kids on experience blockers and it's the phone that you have right in your
01:27:17.280
hand that you let your kid use or the one you gave him or her.
01:27:22.580
So both of these two and three in terms of your reforms are getting at one of my questions
01:27:29.340
here, which is, and it relates to the entire discussion we've had over these two hours.
01:27:32.480
The, the means to kill somebody socially can very much be found in social media.
01:27:41.220
You know, the Twitter mob piles on with the retweets and so on.
01:27:45.140
And they, they take somebody down, they cancel somebody that ruined somebody's life.
01:27:48.200
The means to kill is very much embedded in social media, but, but the desire to kill, was that always there?
01:27:59.720
And we just finally found the means to, you know, express it? Or is the desire to kill
01:28:05.980
amongst this younger set in particular related to the things we're talking about?
01:28:11.880
If we get our kids, you know, quote prepared, and if we don't treat them as fragile and if
01:28:17.420
we expose them to different ideas and if we do all the things, are they going to be less
01:28:22.400
likely to want to use the social media for evil and, and cruelty in this way?
01:28:28.260
Well, I think if they're mentally healthy, um, uh, I think they will be stronger and kinder.
01:28:35.160
And I think if you are, uh, anxious, insecure, and fragile, you're more likely to seek solace
01:28:40.560
and comfort in a, in a mob, in a movement, in a group.
01:28:43.500
Um, and when that group engages in something, you're going to want to fit in with that group.
01:28:47.480
You're not going to have the guts to stand up against it.
01:28:50.060
Um, so I think for so many reasons, look, our kids' mental health is plummeting and this
01:28:57.540
The surgeon general recently put out an advisory basically saying we have a mental health
01:29:02.260
epidemic in this, in this country for teenagers.
01:29:05.020
Um, so I can say that if we give kids normal childhoods and we let them have conflicts and
01:29:10.300
experiences on the playground, let them make teams, let them enforce rules, you know, that's
01:29:14.280
going to certainly be good for their mental health and their development.
01:29:16.400
And I can't say that's going to keep them from being nasty on social media.
01:29:19.920
In fact, look, you know, you and I know a lot of the people attacking us, almost all of them are adults.
01:29:25.580
Um, so preparing them for adulthood alone isn't going to stop the viral dynamics.
01:29:30.600
That's why I keep focusing on the architecture.
01:29:39.440
We can't, it's very difficult to find truth.
01:29:43.900
Um, but we can change the dynamics so that, so that, uh, at present, the nastier you are,
01:29:51.500
the more outrageous you are, the more successful you are.
01:29:56.020
It's the same way you wouldn't go to a kid who's mentally struggling and say, you know,
01:30:02.180
with sort of a layout in front of him, here are all the ways available to ending your life.
01:30:07.660
No, no sane human would ever present that to a kid who is struggling.
01:30:11.560
And the same way we shouldn't have social media companies sort of presenting to them
01:30:15.680
the panoply of ways that their mental fragility can be exploited and used against others and
01:30:25.780
I want to say this, this is from the Let Grow project.
01:30:29.400
Uh, as far as your mission: we reject the idea that kids are in constant physical, emotional,
01:30:34.680
or psychological danger from creeps, kidnapping, germs, grades, flashers, frustration, failure,
01:30:39.100
baby snatchers, bugs, bullies, men, disappointing playdates, and/or the perils of a non-organic grape.
01:30:45.960
Somehow our culture has become obsessed with kids' fragility and it's lost sight of their
01:30:53.060
Let Grow believes today's kids are smarter and stronger than our culture gives them credit for.
01:31:05.220
Cause there's one thing later where you talk about: tell your kid, I've got a homework assignment for you.
01:31:09.880
You go home and you do something new on your own, climb a tree, run an errand, make a meal.
01:31:16.720
Uh, but so realistically, you know, how does the parent, cause I think most parents who
01:31:20.300
are like you or like I am, they're not, they don't need to be told this, but the people
01:31:23.860
who are holding on a little, who think maybe I am the helicopter parent.
01:31:28.360
What, what are realistic steps they can take to sort of reel back?
01:31:34.120
So once you recognize that your kid has to learn how to do things on her own, uh, that's
01:31:40.100
Now you can say, okay, well, let's, let's talk about the things that you could do on your
01:31:44.000
And if you've never walked, you know, if you're six or seven, you've never walked the
01:31:48.440
Um, so if you sit down with your kid and you say, you know, what are some things that
01:31:53.020
Do you think, would you, you know, do you think you can go to the store and get milk for
01:31:56.340
Um, you know, you'll find that the kid actually wants to do things.
01:31:59.340
Now, what we suggest at Let Grow is do this in elementary school and get your elementary
01:32:06.840
Um, and if all the kids are, uh, coming up with something to do with their, you know,
01:32:11.060
at home, an errand, make dinner for us, whatever it is.
01:32:14.360
It's, it's an amazing thing that happens when the kid does this.
01:32:17.780
Um, they're bursting with pride when they do it.
01:32:23.020
Uh, so when my daughter was, uh, was six, uh, we had her bring me lunch here in New York
01:32:31.300
Um, and, uh, I was terrified and you know, my wife sent her off and I was waiting at the
01:32:36.380
office and I actually kind of like snuck around the corner to see.
01:32:39.600
But the point is, when she got to, when she got to my office, she was just bursting and
01:32:50.340
Whereas the way we're raising kids is to believe you can't do anything.
01:32:57.240
And that's the way to raise a kid who becomes depressed, anxious, fragile, and even suicidal.
01:33:01.760
And as you point out, let them play free play with kids of all ages, where there's a,
01:33:05.760
there's a social system where you get clipped, you know, before you get too out of
01:33:09.580
line, usually at proportionate levels, and you learn, you learn by taking little risks
01:33:14.580
and having them either rewarded or punished appropriately, or sometimes inappropriately,
01:33:18.920
And I love the distinction you drew between that experience and chronic bullying, which
01:33:24.040
And you do need to step in on that. Completely agree with all of that.
01:33:27.540
My God, this has been a great, great discussion.
01:33:32.620
What's the next book? The next one's coming out in '23, which is too long.
01:33:39.260
The title is Life After Babel: Adapting to a World We Can No Longer Share.
01:33:44.760
It's about how we live in a world in which there are no shared narratives where this,
01:33:48.480
this kind of chaos is going to be with us forever.
01:33:50.260
As far as I know, it's for the rest of our lives.
01:33:54.920
I a hundred percent would love to have you on, uh, when you release it, to help promote it.
01:33:58.280
And in between then and now I'm going to be working on taking down the president of Princeton.
01:34:06.780
Thanks for sharing your wit and intellect with us.
01:34:13.960
We're all talking about what, are we secretly helicopter parents?
01:34:19.680
We've got the guys from The Fifth Column and we'll talk to you then.