The Joe Rogan Experience - March 19, 2024


Joe Rogan Experience #2121 - Jonathan Haidt


Episode Stats

Length

2 hours and 1 minute

Words per Minute

180.9

Word Count

21,965

Sentence Count

1,685

Misogynist Sentences

17
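The listed stats can be cross-checked against each other. A minimal sketch, assuming the runtime is exactly 121 minutes (the rounded "2 hours and 1 minute" likely hides extra seconds, which would explain the slightly lower listed words-per-minute figure):

```python
# Cross-check the episode stats against each other.
word_count = 21_965
minutes = 121  # "2 hours and 1 minute", assumed exact

print(f"Words per minute at exactly 121 min: {word_count / minutes:.1f}")  # ~181.5

# Working backwards from the listed 180.9 wpm gives the implied runtime:
print(f"Implied runtime: {word_count / 180.9:.1f} min")  # ~121.4 min
```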


Summary

In this episode of the Joe Rogan Experience, social psychologist and author Jonathan Haidt joins Joe to talk about his new book, The Anxious Generation, and how the great rewiring of childhood around smartphones and social media is driving an epidemic of mental illness in America's youth. They discuss the sharp rise in teen depression since 2012 and why COVID was only a blip in that trend, what makes TikTok different and the national security case for forcing ByteDance to sell it, identity authentication as a remedy for bot-driven manipulation, and the collective action problem that keeps kids on these platforms, along with the norms Haidt proposes for parents and schools: no smartphones before high school, no social media until 16, and phone-free schools.


Transcript

00:00:01.000 Joe Rogan Podcast, check it out!
00:00:04.000 The Joe Rogan Experience.
00:00:06.000 Train by day, Joe Rogan Podcast by night, all day.
00:00:13.000 Good to see you, sir.
00:00:14.000 Good to see you again, Jonathan.
00:00:16.000 The same problems that you talked about when you were here last, that I've referenced many times since on the podcast, have only been exacerbated, unfortunately.
00:00:25.000 And that's why you wrote this, The Anxious Generation.
00:00:30.000 And it could not be more true how the great rewiring of childhood is causing an epidemic of mental illness.
00:00:38.000 I don't think anybody can dispute that.
00:00:40.000 Yeah.
00:00:41.000 When I was on last time, there was a dispute.
00:00:43.000 There were some psychologists who said, oh, this is just a moral panic.
00:00:46.000 They said this about video games and comic books and, you know, this is not a real thing, they said.
00:00:54.000 Now they don't.
00:00:55.000 Yeah.
00:00:55.000 I think it was pretty obvious.
00:00:57.000 I think it was only their preconceived notions that were keeping them from admitting it before or at least looking at it before.
00:01:03.000 Or maybe they don't have children.
00:01:06.000 You know, it could be that.
00:01:07.000 I think a lot of older people, particularly boomers, they're a little bit disconnected from it because they're not, unless they're addicted to Twitter, you know, they're not engaging in this stuff.
00:01:16.000 Yeah.
00:01:16.000 And they're often thinking, you know, when I was a kid, we watched too much TV and we turned out okay.
00:01:20.000 Yeah.
00:01:20.000 But part of the message of the book is that social media and the things kids are doing on screens are not really like TV. They're much, much worse for development.
00:01:28.000 Yeah, and even watching too much TV, I don't agree that they turned out okay.
00:01:33.000 I think it had a pervasive effect.
00:01:37.000 It did.
00:01:38.000 But nothing like this.
00:01:39.000 Well, that's right.
00:01:40.000 Because when we were watching TV, I'm a little older than you.
00:01:43.000 I was born in 1963. So I grew up watching a lot of TV. Maybe an hour or two a day, weekdays, and then two or three hours on the weekends.
00:01:50.000 But it was a bigger screen.
00:01:52.000 You're watching with your sisters or with your friends.
00:01:54.000 You're arguing about things.
00:01:56.000 You're eating.
00:01:56.000 So it's actually pretty social.
00:01:58.000 But now kids are spending...
00:02:01.000 The latest Gallup survey finds that it's about five hours a day just on social media.
00:02:07.000 Just social media, including TikTok and Instagram.
00:02:10.000 And when you add in all the other screen-based stuff, it's like nine hours a day.
00:02:14.000 And that's not social.
00:02:15.000 It's private on your little screen.
00:02:17.000 You're not communicating with others.
00:02:19.000 So in all these ways, the new way that kids are digital is really not like what we had when we were watching TV. It's also an extraordinary amount of wasted resources.
00:02:32.000 I'm always embarrassed when I look at my phone.
00:02:34.000 When I see my screen time, I'm like, four hours?
00:02:36.000 Like, that's four hours I could have done so many different things with.
00:02:40.000 That's right.
00:02:41.000 And so that's the concept of opportunity cost. It's this great term that economists have, which is, you know, if you invest an hour of your time and $100 to do something, how much does it cost?
00:02:53.000 Well, you know, $100, but you could use that $100 and that hour for something else.
00:02:58.000 So what are the things you gave up?
00:03:00.000 And when screen time goes up to, now it's about nine hours a day in the United States.
00:03:06.000 Nine hours a day, not counting school.
00:03:07.000 Average?
00:03:08.000 Average.
00:03:09.000 Average.
00:03:09.000 Is that for a certain age group?
00:03:11.000 Teenagers, yeah.
00:03:12.000 Not little kids.
00:03:13.000 But, you know, 13 to 15, 17, that range, that's when it's heaviest.
00:03:18.000 It's around nine hours a day, and so the opportunity cost is everything else.
00:03:24.000 Imagine if somebody said to you, Joe, you've got a full life.
00:03:27.000 Here, you have to do this additional thing for nine hours.
00:03:30.000 That's insane.
00:03:31.000 That would push out everything else, including sleep.
00:03:34.000 Yeah.
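A back-of-the-envelope sketch of the opportunity cost being described here; the nine-hours-a-day figure comes from the conversation, while the 16 waking hours per day is an assumption added purely for illustration:

```python
# Opportunity cost of ~9 hours/day of teen screen time (figure cited above).
# The 16 waking hours/day is an assumed value, purely illustrative.
screen_hours_per_day = 9
waking_hours_per_day = 16  # assumption

print(f"Screen hours per year: {screen_hours_per_day * 365}")  # 3285
print(f"Share of waking hours: {screen_hours_per_day / waking_hours_per_day:.0%}")  # 56%
```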
00:03:37.000 When you are now talking to people that agree that this is an issue, what changed?
00:03:46.000 So, you mean, what changed?
00:03:47.000 Like, why is there now more agreement?
00:03:48.000 Yes.
00:03:49.000 Yeah.
00:03:49.000 So, in 2019, when I was last here with you, my book, The Coddling of the American Mind, had just come out.
00:03:55.000 And back then, people were beginning to sense that, you know, this internet, the phones, the social media that we were all so amazed by, you know, there was a very positive feeling about all this stuff in the early part, you know, like in the 2000s.
00:04:09.000 Sentiment was beginning to turn, but there was a big academic debate because when you look at studies that look at how, you know, do kids who spend a lot of time on screens, do they come out more depressed?
00:04:20.000 The answer is yes, but the correlation is not very big.
00:04:22.000 So there was a big argument among researchers, and that's when I got into this around 2019, really getting into that debate.
00:04:29.000 And I think that Jean Twenge and I really had good data showing, you know, there is an issue here.
00:04:34.000 And then COVID came, and that confused everything.
00:04:37.000 Because, you know, basically when I was on with you last time, 2019, I was saying, you know, what kids most need is less time on their devices and more time outside playing unsupervised.
00:04:48.000 Let them be out unsupervised.
00:04:49.000 That's what we needed, 2019. COVID comes in, boom, exactly the opposite.
00:04:53.000 What do kids get?
00:04:54.000 No more time unsupervised.
00:04:56.000 You can't even go out.
00:04:57.000 I mean, in New York City, they locked up the playgrounds.
00:04:59.000 They locked up the tennis courts.
00:05:01.000 It was insane.
00:05:02.000 No time outside with your friends.
00:05:04.000 Oh, spend your whole day on screens.
00:05:05.000 So that made everything worse.
00:05:07.000 But people thought, oh, yeah, the kids are really messed up now from COVID. But they were wrong.
00:05:13.000 COVID was terrible for a lot of kids.
00:05:15.000 When you look at the mental health trends over the last 20 years, COVID was a blip.
00:05:21.000 Actually, you know what?
00:05:21.000 I've actually got some charts.
00:05:23.000 If you don't mind, I'd like to actually show these.
00:05:27.000 Did you send the data to Jamie so he could pull it up?
00:05:29.000 I haven't sent it yet, but I'll...
00:05:31.000 Oh, right, because you want...
00:05:32.000 Yeah.
00:05:33.000 Do you want to stop and do that?
00:05:34.000 Yeah, let's pause real quick so you can give out...
00:05:36.000 Jamie will give you the email address.
00:05:38.000 Okay, we're back.
00:05:40.000 All right.
00:05:41.000 What are those things?
00:05:43.000 Oh, so these are stickers for your kids.
00:05:46.000 So as part of the book, I'm trying to launch a movement called Free the Anxious Generation.
00:05:50.000 Here you go.
00:05:51.000 You have two younger kids.
00:05:54.000 And so I've teamed up with the artist who did the book cover, Dave Cicerelli, who's created these incredible artworks.
00:06:01.000 There's going to be billboards.
00:06:02.000 He's putting together a 12-foot-tall milk carton, which is going to be traveling around different cities with this...
00:06:12.000 Missing childhood.
00:06:14.000 Do they do that anymore?
00:06:15.000 The milk carton thing?
00:06:16.000 No, I don't think so.
00:06:18.000 Mmm.
00:06:18.000 Yeah.
00:06:19.000 So, I don't know what your kids think about social media and whether they think it's a good thing or a bad thing, but we are hopeful that members of Gen Z are going to start, and they are starting to advocate that, you know what, this is messing us up.
00:06:31.000 Mmm.
00:06:32.000 Okay, so here's the graph.
00:06:34.000 Okay.
00:06:35.000 So this is the graph that I showed last time I was on.
00:06:39.000 What it shows, because I know most of your listeners are probably just listening to the audio, is that from 2005 to 2010, the rate of depression in girls was about 12%. That is, about 12% of American girls had a major depressive episode in the last year.
00:06:51.000 And for boys, it was about 4% to 5%.
00:06:53.000 And it's flat.
00:06:54.000 There's no change.
00:06:54.000 Then all of a sudden, around 2012-2013, the numbers start rising, especially for girls.
00:07:01.000 And it goes all the way up to 20% for girls.
00:07:03.000 So that was a huge rise, and that's what I showed you last time.
00:07:07.000 What is the difference between boys and girls?
00:07:10.000 So girls suffer from more internalizing disorders.
00:07:13.000 That is, when girls have difficulties, they turn it inwards, they make themselves miserable.
00:07:19.000 So girls suffer from higher rates of anxiety and depression.
00:07:22.000 That's always been the case, especially once they hit puberty.
00:07:25.000 Boys, when they have psychological problems, they tend to turn it outwards.
00:07:29.000 They engage in more violent behavior, deviant behavior, substance use.
00:07:33.000 So boys, it's called externalizing disorders.
00:07:37.000 But you can see both boys and girls are getting more depressed.
00:07:39.000 It's just that the effect is bigger for girls.
00:07:42.000 So boys have gone up to about 7% and girls are way up to 20. That's right.
00:07:47.000 And that was 2019. So one out of five girls.
00:07:50.000 That's what it was, that's right.
00:07:52.000 Was?
00:07:53.000 Was, that's right.
00:07:54.000 And then COVID comes in, so if we can have the next slide.
00:07:58.000 So then COVID comes in.
00:07:59.000 And now this is the exact same data set, just this federal data, just got a few extra years of data.
00:08:06.000 And what you can see is that it goes way the hell up.
00:08:10.000 And if you look at the 2021 data point, you can see that little peak at the very top there.
00:08:15.000 That's because of COVID. That is, COVID did increase things.
00:08:19.000 It did make kids more depressed.
00:08:21.000 But as you can see, it's a blip.
00:08:23.000 COVID was just a tiny effect compared to this gigantic increase.
00:08:28.000 And so on the last slide, it was 20% of girls.
00:08:31.000 Now it's almost 30% of girls who had a major depressive episode in the last year.
00:08:37.000 And for boys, it's up to 12%, which is still quite a lot.
00:08:40.000 It's more than a doubling, although much less than for the girls.
00:08:44.000 It's still, even if you look at boys, or excuse me, if you look at girls from 2018 pre-COVID, that ramp is very steep, the upward ramp.
00:08:53.000 That's right.
00:08:54.000 And that might be TikTok.
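The chart being described can be roughly reconstructed from the figures given on air. Only the anchor points below come from the conversation; the lines between them are linear interpolation, so treat this as illustrative rather than the underlying federal data:

```python
# Approximate reconstruction of the depression chart described above.
# Anchor values are the percentages stated in the conversation;
# everything between them is interpolation, not real data.
import matplotlib.pyplot as plt

years = [2005, 2010, 2012, 2019, 2021]
girls = [12, 12, 12, 20, 30]    # % with a major depressive episode, past year
boys = [4.5, 4.5, 5, 7, 12]

plt.plot(years, girls, marker="o", label="Girls")
plt.plot(years, boys, marker="o", label="Boys")
plt.xlabel("Year")
plt.ylabel("% with major depressive episode, past year")
plt.title("U.S. teens, as described on air (approximate)")
plt.legend()
plt.show()
```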
00:08:57.000 So what happens is a lot of things change around 2011, 2012. 2010 is when you get the front-facing iPhone.
00:09:06.000 It's when Instagram is founded.
00:09:09.000 It's when kids are getting high-speed data plans.
00:09:12.000 So my argument in the book is that we had a complete rewiring of childhood between 2010 and 2015. In 2010, most of the kids had flip phones.
00:09:21.000 They didn't have Instagram.
00:09:22.000 They didn't have high-speed data.
00:09:24.000 So they would use their flip phones to get together with each other.
00:09:27.000 They'd communicate with each other.
00:09:28.000 By 2015, about 70% to 80% have a smartphone.
00:09:32.000 Most of them have high-speed data, unlimited plan, Instagram accounts.
00:09:37.000 And this really messes up the girls.
00:09:39.000 So that's what I think happened between 2010 and 2015. TikTok becomes popular only really around 2018, 2019, 2020. And it's so new, we don't have good data on just TikTok.
00:09:51.000 But I suspect that that sort of extra acceleration might be due to TikTok.
00:09:55.000 What specifically about TikTok?
00:09:58.000 So this is something I'm just really beginning to learn.
00:10:01.000 I don't even have much on it in the book.
00:10:05.000 So kids love stories, and stories are great.
00:10:09.000 All around the world, people tell children stories.
00:10:12.000 There are myths.
00:10:14.000 We see plays.
00:10:15.000 We see television shows.
00:10:17.000 And so I asked my undergrads at NYU, I said, how many of you use Netflix?
00:10:23.000 Almost everybody says yes.
00:10:24.000 How many of you wish Netflix was never invented?
00:10:27.000 Nobody.
00:10:28.000 Nobody.
00:10:29.000 Watching stories is not a bad thing.
00:10:31.000 TikTok is not stories.
00:10:33.000 It's little tiny, tiny bits of something.
00:10:37.000 And they're short.
00:10:39.000 They don't add up to anything.
00:10:40.000 They're incoherent.
00:10:41.000 They're often disturbing and disgusting.
00:10:43.000 I mean, people are being hit by cars, people are being punched in the face.
00:10:47.000 And it's much more addictive and with no nutritive value.
00:10:51.000 They're not really stories.
00:10:52.000 And so it seems to be much more addictive.
00:10:54.000 Kids really get hooked on it, much more so than Netflix or anything else.
00:10:58.000 And it depends on what you're watching, but I suspect that so many of them are consuming stuff about mental illness.
00:11:06.000 It has a variety of effects that we don't even understand yet.
00:11:10.000 Now, I know that there's some push right now currently to ban TikTok.
00:11:16.000 And there's a lot of people that are very torn on this because they don't want to give the government the ability to ban social media.
00:11:23.000 What is the argument about banning TikTok?
00:11:25.000 What specifically are they talking about?
00:11:28.000 The main thing they want to do is separate them from the company ByteDance that owns them and just make them an American company.
00:11:34.000 Yeah.
00:11:35.000 So they can still operate, I suppose.
00:11:36.000 So it's a data issue?
00:11:38.000 Well, it's a national security issue.
00:11:41.000 Yeah, right.
00:11:41.000 So thank you.
00:11:42.000 Let's separate the national security issue from the mental health issue.
00:11:46.000 I have a lot of libertarian friends.
00:11:48.000 I have a lot of libertarian sympathies.
00:11:49.000 I would be uncomfortable about the government banning a company or a product because it's harmful to children.
00:11:56.000 I personally think we should just have age verification.
00:11:59.000 We should not have kids on certain things.
00:12:01.000 But if it was just a question of, you know, this is really bad for children, let's ban it.
00:12:05.000 Like, no, I don't think I would support that.
00:12:07.000 But TikTok is different because it is a Chinese-owned company.
00:12:12.000 And as many of your listeners will know, China, it says in whatever, it doesn't have a constitution, I don't think.
00:12:18.000 But by law, every Chinese company must do what the Chinese Communist Party tells it to do.
00:12:24.000 And that's what's so scary.
00:12:28.000 Instagram Reels and YouTube Shorts, they might have similar effects to TikTok, but the Chinese government can literally tell ByteDance to change what kids are seeing.
00:12:41.000 And they do that in China.
00:12:43.000 They tell them in China, you have to have this kind of content and not that kind of content.
00:12:47.000 There was an incredible episode of—you had Tristan Harris on.
00:12:53.000 Tristan Harris has this amazing podcast episode where they go into the national security risks, and they show that the day that Russia invaded Ukraine, TikTok in Russia changed radically.
00:13:04.000 Like, the government was on—like, you know, TikTok was on it.
00:13:07.000 Like, yep, we're going to do what Putin wants us to do.
00:13:10.000 So the idea that the most influential platform on American children must do what the Communist Party tells it to do, at a time when we have mounting tension with China and the possibility of a war.
00:13:25.000 I mean, as Tristan says, imagine if in the 1960s, the Soviet Union owned and controlled PBS, ABC, NBC, and all the kids' programs.
00:13:36.000 We would never have allowed that.
00:13:38.000 So I hope, listeners, I really strongly support this bill.
00:13:41.000 I think Representative Mike Gallagher, I think, was one of the ones proposing it, or at least certainly advocating for this issue.
00:13:50.000 I hope people will not see it as a TikTok ban, but they'll see it as an urgent national security move to force ByteDance to sell to a non-Chinese owner.
00:14:02.000 And specifically, what are they pointing to when they say national security risk?
00:14:06.000 What specifically have they seen?
00:14:09.000 So a lot of it seems to have to do with the data question.
00:14:14.000 Facebook pioneered this model in which the person using the product is not really the customer.
00:14:20.000 They don't pay the money.
00:14:21.000 They're the product.
00:14:23.000 The user is the product, not the customer.
00:14:26.000 And they give them data.
00:14:27.000 And the data can be used for all sorts of purposes, especially marketing and advertising.
00:14:31.000 And so TikTok has enormous amounts of data, and they can get all psychological on it because they know exactly how long you hesitated, how much you liked certain kinds of videos.
00:14:41.000 You know, many people have written articles on how TikTok seems to have known they were gay before they did, that sort of thing.
00:14:46.000 So TikTok has extraordinary amounts of data on most Americans, certainly most young Americans.
00:14:53.000 And they say, oh, but, you know, we don't share, like, it's in a server over here in Singapore, I don't know where, but, you know, it's not in China.
00:15:00.000 You know, oh, come on, come on.
00:15:02.000 You know, there's no way it could possibly be the case that the data is really separated and not available to the Chinese Communist Party.
00:15:10.000 And what are they pointing to in terms of the danger of this data that makes them want to have it sold to an American company?
00:15:18.000 I don't know whether the motivation behind the bill is that the Chinese would have some access to data on American citizens. What most alarmed me when I heard the Tristan Harris podcast was the ease of influencing American kids to be pro this or pro that on any political issue.
00:15:40.000 You're seeing that with Palestine and Gaza.
00:15:44.000 Yeah, I think so.
00:15:45.000 You're definitely seeing that now.
00:15:47.000 It's very obvious.
00:15:49.000 Well, it's very obvious with many things with TikTok.
00:15:54.000 Trans stuff, and there's a lot of different things that they're encouraging.
00:15:59.000 And people that are opposed to that are being banned, which is also very odd.
00:16:07.000 Specifically, like, female athletes.
00:16:09.000 We had Riley Gaines, who was the female athlete that competed against Lia Thomas.
00:16:13.000 And she has said that biologically male athletes should not be able to compete with biologically female athletes because they have a significant advantage.
00:16:21.000 And she was banned from TikTok just for saying that.
00:16:24.000 Yeah, that's right.
00:16:25.000 So this relates to the larger issue that we talked about last time and that I hope we'll continue to talk about today.
00:16:31.000 Which is that social media has brought us into an environment in which anyone has the ability to really harm anyone else.
00:16:49.000 Greg Lukianoff and I saw this in universities.
00:16:51.000 Why don't the university president stand up to the protesters who are shouting down visiting speakers?
00:16:57.000 Isn't there a grown-up in the room?
00:16:58.000 And then we saw it in journalism, newspapers and editors who wouldn't stand up for journalistic principles.
00:17:04.000 And so I think what has happened here is that social media allows whoever is angriest and can mobilize most force to threaten, to harass, to surround, to mob anyone.
00:17:17.000 And when people are afraid to say something, that's when you get the crazy distortions that we saw on campus.
00:17:32.000 Yeah.
00:17:40.000 That it's when social media became super viral after 2009, 2010. You get the like button, the retweet button.
00:17:46.000 Social media wasn't really bad or harmful before.
00:17:49.000 It wasn't terribly harmful before then.
00:17:50.000 But by 2012, 2013, it had really become as though everyone had a dart gun.
00:17:54.000 Everybody could shoot everyone.
00:17:55.000 And that's when we began sort of like teaching on eggshells in universities because our students could really do a lot of damage if we said one word they didn't like.
00:18:02.000 And it's not just the students, which is really disturbing.
00:18:05.000 We've talked about this before.
00:18:07.000 There was an FBI security specialist who estimated that somewhere in the neighborhood of 80% of the Twitter accounts were bots, which is very strange because that means that they're mobilizing specifically to try to push different narratives.
00:18:25.000 Yeah, that's right.
00:18:26.000 So if you think of, you know, people say, well, you know, now Twitter is the public square or things like that.
00:18:31.000 It's not a public square.
00:18:33.000 It's more like the Roman Colosseum.
00:18:35.000 It's more like a place where people say things and the fans in the stands are hoping to see blood.
00:18:43.000 To move our discussions onto platforms like that, that can be manipulated, that anyone—it doesn't have to be a foreign intelligence service.
00:18:53.000 It could be anybody who wants to influence anything in this country or anywhere in the world— They can, you know, for very little money, they can hire someone to create thousands, millions of bots.
00:19:04.000 And so we're living in this sort of funhouse world where everything is weird mirrors and it's very hard to figure out what the hell is going on.
00:19:12.000 Have you ever sat down and tried to figure out a solution to this other than trying to encourage people not to use it?
00:19:18.000 Jamie, does something happen if the volume just dropped lower?
00:19:20.000 Okay, so what was I just saying?
00:19:24.000 We're talking about solutions other than asking kids to not use it, which is very hard to do.
00:19:31.000 Yeah, that's right.
00:19:32.000 So, when we're talking about the democracy problems and the, you know, manipulation of politics or anything else, those are really, really hard.
00:19:40.000 I have a few ideas of what would help and we're not going to do them because, you know, all of them are like the left likes and the right doesn't or vice versa.
00:19:46.000 Like what are those ideas though?
00:19:47.000 Oh, things like, you know, like identity authentication.
00:19:51.000 If large platforms had something like know-your-customer laws. That is, you know, if you want to open an account on Facebook or on X, you have to at least prove that you're a person.
00:20:02.000 And I think you should be able to have to prove that you're a person in a particular country, I think over a certain age.
00:20:09.000 You prove those to the platform, not directly, you go through a third party.
00:20:13.000 So even if it's hacked, they wouldn't know anything about you.
00:20:15.000 You establish that you're a real person and then you're cleared.
00:20:18.000 Go ahead.
00:20:18.000 You open your account.
00:20:19.000 You don't have to use your real name.
00:20:21.000 If we did that, that would eliminate the bots.
00:20:24.000 That would make it much harder to influence.
00:20:26.000 That would make us have much better platforms for democracy.
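A minimal sketch of the third-party flow being described: a verifier privately checks that you are a real person of a given age and country, then hands the platform a signed claim containing no identifying details. All names here are illustrative assumptions, and the shared-secret HMAC is used only for brevity; a real system would use public-key signatures or anonymous credentials.

```python
# Sketch of third-party identity attestation: the verifier learns who
# you are; the platform only learns that a trusted verifier vouched
# "real person, over 16, in country X". Shared-secret HMAC for brevity;
# a production system would use public-key signatures.
import hmac, hashlib, json, secrets

VERIFIER_KEY = secrets.token_bytes(32)  # held by the verifier

def issue_attestation(country: str, over_16: bool) -> dict:
    # The verifier checks documents privately, then signs a claim
    # carrying no name, address, or other identifying data.
    claim = {"is_person": True, "country": country, "over_16": over_16,
             "nonce": secrets.token_hex(8)}  # fresh per attestation
    payload = json.dumps(claim, sort_keys=True).encode()
    sig = hmac.new(VERIFIER_KEY, payload, hashlib.sha256).hexdigest()
    return {"claim": claim, "sig": sig}

def platform_accepts(att: dict) -> bool:
    # The platform verifies the signature and the claim's contents,
    # learning nothing about the user's identity.
    payload = json.dumps(att["claim"], sort_keys=True).encode()
    expected = hmac.new(VERIFIER_KEY, payload, hashlib.sha256).hexdigest()
    c = att["claim"]
    return hmac.compare_digest(expected, att["sig"]) and c["is_person"] and c["over_16"]

att = issue_attestation(country="US", over_16=True)
print(platform_accepts(att))  # True: account opens, pseudonym allowed
```

The point of the indirection is the one made above: even if the platform is hacked, there is nothing identifying to steal, because only the verifier ever saw the documents.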
00:20:29.000 Is that possible to do internationally?
00:20:32.000 Well, the platforms can certainly require whatever they want for membership.
00:20:37.000 Right now, they are legally required to ask you if you're over 13. If you're 13 or over, they ask it, and then they accept whatever you say, and that's it.
00:20:44.000 You're in.
00:20:46.000 So those rules could be changed, and they could be required to do more.
00:20:51.000 And they're based most in the United States, but their users are all around the world.
00:20:57.000 So yeah, that could be done.
00:20:58.000 So one of the things that people are nervous about when it comes to authentication is that if you could do that, then you could target individuals that wouldn't be allowed to be anonymous.
00:21:11.000 So you eliminate the possibility of whistleblowers.
00:21:14.000 No, no, no.
00:21:14.000 The point is that...
00:21:18.000 Right.
00:21:34.000 So I understand the concern, and there are values to having anonymity, but I think what we're seeing now is that the craziness, the way it's affecting, it's making it harder for democracies to be good, vibrant democracies, and it's making it easier for authoritarian countries like China to be powerful and effective authoritarian countries.
00:21:53.000 So I think we have to start weighing the pluses and minuses of the costs and benefits here.
00:21:58.000 Right, but how would you ramp that up?
00:22:00.000 How would you implement that internationally?
00:22:03.000 Say, if you're talking about people in Poland, just pick a country.
00:22:08.000 Well, the platforms can do whatever they want, but then, yes, if a company starts in Poland, then the US Congress would have no influence on that.
00:22:17.000 Right, like China could pretend and they could falsify the data that shows that these are individuals.
00:22:24.000 Oh, I see.
00:22:25.000 They wanted to empower a troll farm.
00:22:27.000 Oh, I see.
00:22:28.000 You're saying even if American companies did this, the Chinese could still get around it.
00:22:31.000 Yeah, that's true.
00:22:33.000 You're never going to have a perfect system.
00:22:35.000 But right now, it's just so easy and cheap and free to have massive influence on anything you want.
00:22:43.000 But the larger question here was, you asked me, what can we do?
00:22:45.000 And what I'm saying is, there are some things like identity authentication that I think would help, but yes, there are implementation problems.
00:22:52.000 There's all kinds of political questions.
00:22:53.000 So my basic point is, man, those problems, I don't know that we can solve, but we can do better.
00:22:58.000 Oh, and I should point out, a lot of these have to do with the basic architecture of the web.
00:23:02.000 When we moved from Web 1, which was putting up information (it's amazing, you can see things from everywhere),
00:23:07.000 to Web 2, which was directly interactive, now you can buy things, you can post stuff. And it's Web 2 that gave us these business models that have led to the exploitation of children and everyone else.
00:23:22.000 And I'm part of a group, Project Liberty, if you go to projectliberty.io, that's trying to have a better Web 3, where people will own their own data more clearly.
00:23:33.000 As the architecture changes, it opens us up to new possibilities and risks.
00:23:37.000 So there are some hopes for a better internet coming down the pike.
00:23:42.000 Actually, I just wanted to put all this stuff out there about democracy to say this is really hard, but when we talk about kids and mental health, this is actually amazingly doable.
00:23:52.000 We could do this in a year or two, and the trick, the key to solving this whole problem with kids is to understand what's called a collective action problem.
00:24:02.000 So there are certain things where, you know, like if you have a bunch of fishermen and they realized, oh, we're overfishing the lake, let's reduce our catch.
00:24:13.000 And if one person does that and no one else does, well, then he just loses money.
00:24:17.000 But if everyone does it, well then actually you can solve the problem and everyone can do fine.
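The overfishing story has the standard collective action payoff structure. A toy model, with all numbers assumed purely to make the incentives concrete:

```python
# Toy payoff model for the overfishing example. All numbers are
# assumed and only illustrate the collective action structure.
def yearly_catch_value(i_restrain: bool, others_restraining: float) -> float:
    stock = 0.2 + 0.8 * others_restraining  # healthier stock as more restrain
    effort = 0.6 if i_restrain else 1.0     # restraint means a smaller catch
    return 100 * stock * effort             # arbitrary units

print(yearly_catch_value(True, 0.0))   # restrain alone:         12 -> he just loses money
print(yearly_catch_value(False, 0.0))  # everyone overfishes:    20
print(yearly_catch_value(True, 1.0))   # everyone restrains:     60 -> everyone does fine
print(yearly_catch_value(False, 1.0))  # defect on restrainers: 100 -> the temptation
```

Swap "reduce our catch" for "stay off Instagram" and the same structure explains the pattern described next: no kid quits alone, but most would welcome everyone quitting together.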
00:24:22.000 With social media, what we see over and over again is kids are on it because everyone else is.
00:24:28.000 And parents are giving their kids a phone in sixth grade because the kid says everyone else has one and I'm left out.
00:24:34.000 And over and over again, you see this.
00:24:36.000 When you ask kids, you know, how would you feel if I took your Instagram or TikTok away?
00:24:43.000 Oh, I'd hate that.
00:24:43.000 I hate that.
00:24:44.000 But then you say, well, what if it was taken away from everyone?
00:24:47.000 What if no one had it?
00:24:48.000 And they almost always say, that would be great.
00:24:51.000 There's an academic article that showed this with college students.
00:24:55.000 I did it as a test with my students at NYU. And a review of The Anxious Generation in The Times of London, the UK Times.
00:25:03.000 The woman ended by asking her 16-year-old, would you have liked there to be a social media ban until you were 16?
00:25:10.000 I think the daughter was like 18 at the time.
00:25:12.000 This was last month.
00:25:14.000 And the daughter says, would everyone else be off it too?
00:25:17.000 And she says yes.
00:25:18.000 And then the daughter says...
00:25:19.000 Yeah, I would have rather liked that.
00:25:21.000 And so you have this consumer product that the people using it, they don't see value in it.
00:25:26.000 They're using it because everyone else is.
00:25:29.000 And there's evidence suggesting it's messing up their mental health.
00:25:32.000 So anyway, this is a solvable problem if we act together.
00:25:37.000 And that's really what the book is about.
00:25:38.000 How would you do that, though?
00:25:40.000 Would you get all the parents to do it?
00:25:43.000 Would you get the social media companies to do it?
00:25:44.000 Like, how would you do that?
00:25:45.000 Yeah.
00:25:45.000 I'm not counting on the social media companies or Congress.
00:25:48.000 I'm assuming we'll never get help from either one.
00:25:50.000 Now, I hope I'm wrong about Congress.
00:25:52.000 But as a social psychologist, I'm trying to point out, you know, we can actually solve this ourselves.
00:25:58.000 And so the simplest one is this.
00:26:00.000 So I propose four norms.
00:26:02.000 If we can enact these four norms ourselves as parents and working with schools, we can largely solve the problem.
00:26:08.000 We can certainly reduce rates of mental illness a lot.
00:26:11.000 The first norm is the simplest.
00:26:12.000 No smartphone before high school.
00:26:15.000 Now people say, oh my god, but my kid needs a phone.
00:26:17.000 Sure, give him a flip phone.
00:26:19.000 Millennials had flip phones, and they were fine.
00:26:21.000 Flip phones did not harm millennials' mental health.
00:26:23.000 They're good for communication.
00:26:25.000 You text, you call, that's it.
00:26:27.000 So the first rule is no smartphones before high school.
00:26:30.000 And as long as a third of the parents do this, well, then the rest of the parents are free to say when their kid says, Mom, you know, I need a smartphone.
00:26:37.000 You know, some other kids have one.
00:26:39.000 Then you can say, well, no, here's a flip phone.
00:26:41.000 You'll be with the kids who don't have one.
00:26:44.000 Oh, and by the way, you're also going to get a lot more freedom to hang out with the other kids.
00:26:48.000 So we don't need everybody, but we need to break the feeling that everyone has to have one because everyone else has one.
00:26:55.000 Yeah, that sounds great on paper.
00:26:58.000 I just can't imagine that most parents would agree to it because there's just so many parents that don't pay attention.
00:27:06.000 That's true.
00:27:07.000 Especially in families where two people are working.
00:27:11.000 Yeah.
00:27:11.000 No, you're right.
00:27:12.000 You're right.
00:27:12.000 When we look right now, parents who are married are trying harder to keep their kids off.
00:27:20.000 These things are good babysitting devices in the sense that the kids are off doing their thing.
00:27:24.000 You don't have to think about them.
00:27:26.000 So it is true that this would not be adopted universally at first.
00:27:32.000 But I think we could still develop a norm that it's just not appropriate for children to have a smartphone.
00:27:38.000 They should have flip phones.
00:27:40.000 And I think that any community that wants to do this, because what I find over and over again is that most parents are really concerned about this.
00:27:47.000 And this is across social classes.
00:27:49.000 Most parents are seeing the problems.
00:27:51.000 And so I don't have to convince parents to change their minds about something.
00:27:56.000 What I'm trying to do with the book is show them here are four norms that are pretty easy to do if others are doing them, and these are going to make your kids happier, less mentally ill.
00:28:09.000 Yeah, like I said, it sounds like a good suggestion.
00:28:12.000 I just don't imagine that with the momentum that social media has today and the ubiquitous use that kids are going to give it up.
00:28:20.000 They're not going to want to give it up.
00:28:21.000 I think there's a lot of kids that have had problems that if you talk to them alone and you say, wouldn't it be better if social media didn't exist, if they've been bullied or what have you, they'd say yes.
00:28:32.000 But the idea of getting...
00:28:35.000 A massive group of people to adopt this.
00:28:38.000 It's highly unlikely.
00:28:40.000 Well, you know, you may be right, but I'm encouraged because whenever I speak to Gen Z audiences, and, you know, I've spoken to middle schools, high schools, college audiences, I always ask, you know, do you think I got this wrong or do you think this is a correct description of what's happening?
00:28:55.000 They agree.
00:28:56.000 They're not in denial.
00:28:58.000 They see the phones are messing them up.
00:28:59.000 They see that social media is messing up the girls especially.
00:29:02.000 So, you know, even in middle school, certainly high school, the kids actually agree that this is a problem.
00:29:08.000 And so if it was offered to them, you know what, let's do the other three norms.
00:29:12.000 Let's get them all up for all the tape.
00:29:13.000 Okay, yeah, please.
00:29:14.000 All right, so the first is no smartphone before high school.
00:29:17.000 Second is no social media until 16. That one's going to be a little harder to do.
00:29:21.000 But the big platforms like Instagram, where you're posting and the whole world is seeing and strangers are contacting you, I think the age is currently 13 and it's not enforced.
00:29:31.000 I think that needs to go up to 16. Here, it would be nice if Congress would raise the age to 16 and make the companies enforce it.
00:29:39.000 But even if they don't, parents, as long as many other parents are doing it (me, I as a parent, you know, my kids are 14 and 17), as long as many other parents are saying 16 is the age, then it's very easy for me to say that also.
00:29:52.000 That's the second norm.
00:29:53.000 Yeah, again, if you could get them to say it.
00:29:56.000 And I think the kids would push back so hard because so many other kids are on it and that's how they interact with each other.
00:30:02.000 Right, but Joe, you're just reiterating the collective action problem.
00:30:06.000 You're just saying they react because all the other kids are on it.
00:30:08.000 Yes.
00:30:08.000 So it does require a big push.
00:30:10.000 But I think we're ready.
00:30:11.000 I don't think we were ready in 2019. It wasn't as clear.
00:30:13.000 But now that we're through COVID, now that the numbers are through the roof, I think we're ready.
00:30:19.000 And if it starts in some places and not others, that's okay with me.
00:30:22.000 That's the way it's going to be.
00:30:23.000 And then we'll see whether it spreads.
00:30:25.000 And then we'll see the data.
00:30:26.000 Yeah, because look at smoking.
00:30:27.000 Smoking is highly addictive.
00:30:29.000 It was very common up through the 1990s.
00:30:31.000 And now it's very rare in high school.
00:30:33.000 Very few high school kids smoke.
00:30:34.000 So it's possible to change norms.
00:30:35.000 And what was the third?
00:30:37.000 The third is phone-free schools.
00:30:39.000 And this one is happening.
00:30:41.000 This is already happening.
00:30:42.000 So I've published articles in The Atlantic and on my Substack, afterbabel.com, bringing together the research.
00:30:51.000 When kids have a phone in their pocket in school, they're going to be texting.
00:30:55.000 Because if anyone is texting during the day, during the school day, they all have to check because they don't want to be out of the loop.
00:31:01.000 They don't want to be the one who doesn't know.
00:31:03.000 So, when kids started bringing smartphones into school instead of flip phones, academic achievement actually went down.
00:31:10.000 Kids are stupider today than they were 15 years ago.
00:31:12.000 I mean stupider meaning measuring their academic progress.
00:31:15.000 After 50 years of improvement, it turns around after 2012. And this is true in the US and internationally.
00:31:21.000 So there's just no reason why kids should have the phone on them.
00:31:24.000 They should come in in the morning, put it in a phone locker or a Yondr pouch, go about their day, and guess what?
00:31:29.000 The schools that have tried it, after a week or two, everyone loves it.
00:31:33.000 The kids are like, oh, wow, we actually talk in between classes.
00:31:36.000 We have five minutes in the hallway, we actually talk.
00:31:38.000 And you hear laughter, whereas right now in a lot of schools, it's just zombies looking at their phones in between as they're walking from class to class.
00:31:45.000 Yeah.
00:31:46.000 So the assumption is that from 2012 kids are just much more distracted?
00:31:52.000 Oh my god.
00:31:52.000 I mean look.
00:31:54.000 Joe, I think I heard you say in one of, yeah, it was a conversation you had a few weeks ago with a comedian friend of yours.
00:32:00.000 And I think this was a direct quote from you.
00:32:02.000 My fucking phone runs my goddamn life.
00:32:05.000 Does that sound like you?
00:32:05.000 Yeah, it sounds like me.
00:32:06.000 Okay.
00:32:08.000 So, you know, as adults, you know, we have a fully formed prefrontal cortex.
00:32:13.000 You and I had a normal childhood.
00:32:15.000 Our brains developed.
00:32:16.000 We have the ability to stay on task.
00:32:17.000 And man, it is hard.
00:32:19.000 With notifications coming in, there's always so many interesting things you could do instead of what you need to do.
00:32:24.000 So it's hard enough for us as adults.
00:32:26.000 Imagine if you didn't have a normal childhood where you developed executive function, where you developed that ability as a teenager.
00:32:33.000 Because puberty is when the prefrontal cortex, the front part of the brain, that's when it rewires into the adult configuration.
00:32:40.000 So the fact that we're scrambling kids' attention at the time when they're supposed to be learning how to pay attention I think is terrible.
00:32:50.000 Where do you think this is going?
00:32:52.000 This is my concern: that this is just the beginning of this integration that we have with devices, and that the social media model has been immensely profitable and incredibly addictive, and there's a massive,
00:33:11.000 massive amount of capital that's invested in keeping us locked into these things.
00:33:16.000 Where do you think this goes from here?
00:33:18.000 Have you paid attention to the technology?
00:33:22.000 Like AI. Yeah.
00:33:23.000 Yes.
00:33:24.000 So let me just draw a very, very sharp, bright line between adults and children.
00:33:30.000 I'm very reluctant to tell adults what to do.
00:33:33.000 If adults want to spend their time on an addictive substance or device or gambling, I'm reluctant to tell them that they can't.
00:33:39.000 So when we're talking about adults, I think where this is going is, well, where it's gone so far is everything that you might want becomes available instantly and for free with no effort.
00:33:53.000 And so in some ways that's a life of convenience, but in other ways it's messing us up and it's making us weaker.
00:34:00.000 So, you know, you want sexual satisfaction?
00:34:03.000 Okay, here you go, free porn.
00:34:05.000 And it gets better and better and more and more intense.
00:34:08.000 You want a girlfriend or boyfriend who you can customize?
00:34:11.000 You have that already.
00:34:14.000 Advances in robotics are such that it's just a matter of time before AI girlfriends are put into these incredible female bodies that you can customize.
00:34:22.000 So I think the adult world, for young adults especially, is going to get really, really messed up.
00:34:28.000 And again, I'm not saying we need to ban it now.
00:34:31.000 But what I'm saying is, for God's sakes, don't let this be 11-year-old children's lives.
00:34:37.000 Let's at least keep children separate from all this craziness until their brains develop, and then they can jump into the whirlpool and the tornado.
00:34:46.000 But the fact that our 11-year-old girls are now shopping at Sephora for anti-wrinkle cream or, you know, all sorts of expensive skin treatments, this is complete insanity.
00:34:57.000 So let's at least protect the kids until they're through puberty.
00:35:01.000 Well, that would be nice.
00:35:03.000 That would be nice.
00:35:04.000 It's kind of essential, I think.
00:35:06.000 It's just the way I see adults being so hooked on these things.
00:35:11.000 There's so many adults that I know that are engrossed in this world of other people's opinions of everything they think and say.
00:35:18.000 And it just doesn't give you enough time to develop your own thoughts and opinions on things.
00:35:24.000 So many people are dependent upon other people's approval.
00:35:29.000 And there's just so many people that are addicted to interacting with people online and not interacting with exceptional people in the real world.
00:35:38.000 Yeah, that's right.
00:35:40.000 One way to think about this is let's look at junk food, which became very popular after the Second World War.
00:35:48.000 You know, the manufacturing of food became very good.
00:35:50.000 There were science labs.
00:35:52.000 At Frito-Lay, they studied the exact degree of tensile strength that a chip should have before it snaps.
00:35:58.000 And, you know, how do you make this?
00:35:59.000 What's the perfect crunch?
00:36:00.000 So they designed the foods to be as addictive as possible.
00:36:03.000 And in the 70s and 80s, Americans switched over to a lot of junk food and we became obese, like a huge increase in obesity.
00:36:10.000 And that kept going on for a few decades.
00:36:12.000 As I understand it, obesity has finally turned around a little bit.
00:36:16.000 And many people are still eating huge amounts of junk food, but at least some people are beginning to say, you know what, I'm going to resist that deep evolutionary programming for fat and sugar.
00:36:30.000 The companies played to that, they hijacked those desires, and they got us hooked on junk food.
00:36:35.000 But after 50 years, we're making some progress in pushing back and having healthier snacks and eating less.
00:36:42.000 What's the root of that progress?
00:36:44.000 I don't actually know the numbers.
00:36:45.000 I just know a few years ago, I saw something that for the first time, obesity actually went down in the United States.
00:36:50.000 I don't know that that's still true today, but this was like three or four years ago.
00:36:52.000 Before COVID, I saw something.
00:36:54.000 Do we know what caused it to go down?
00:36:57.000 I don't.
00:36:58.000 I'm just assuming that this is an issue that we dealt with as a society: we didn't know what we were doing at first and we got hooked, and then there were efforts to educate people and to develop healthier alternatives.
00:37:13.000 So again, I should have looked at the data before I came here.
00:37:18.000 But I'm just using this as an analogy.
00:37:20.000 I'm sure Jamie can find something that points to it or doesn't point to it.
00:37:24.000 I'm surprised.
00:37:25.000 Is obesity still rising in the United States or is it actually a little lower than it was 10 years ago?
00:37:29.000 That's the question.
00:37:29.000 I mean, I quickly found this study here, but I haven't even got a chance to look at it yet.
00:37:34.000 This is the second time I've done this.
00:37:36.000 Something about this is giving me anxiety.
00:37:38.000 I'm spilling this.
00:37:39.000 Update on the obesity epidemic.
00:37:41.000 After the sudden rise, is the upward trajectory beginning to flatten?
00:37:45.000 Okay, so it's a question.
00:37:46.000 In what year was this?
00:37:46.000 So do you think it's just people recognizing that they're developing health issues and they're taking steps to discipline themselves and mitigate some of these issues?
00:37:56.000 Or is there some sort of information push that's leading them in that direction?
00:38:02.000 Yeah, that I don't know because it's not my field.
00:38:03.000 But I would say that that is probably a necessary precondition: understanding the problem and developing in people a desire to change it.
00:38:13.000 And then it's hard to change.
00:38:14.000 You know, I love chips.
00:38:16.000 I love chocolate.
00:38:17.000 I love ice cream.
00:38:18.000 It's hard to change.
00:38:19.000 But over time, a society adapts.
00:38:21.000 And now the question is...
00:38:23.000 Will we adapt to social media?
00:38:25.000 Because the desire for sugar and fat and salt is very deep.
00:38:30.000 The desire for others to think well of us, to hold us in esteem, I would say is just as deep and much more pervasive.
00:38:39.000 It's much stronger, I would say.
00:38:41.000 And so because, you know, as adults, we're very concerned.
00:38:45.000 You know, like when I put out a tweet, you know, I know all this stuff.
00:38:50.000 I know how terrible this is for me to check.
00:38:51.000 I'm busy.
00:38:52.000 I've got things to do.
00:38:53.000 But I'll go back and I'll check how the tweet is doing 30 seconds later.
00:38:57.000 And then I'll check again five minutes later.
00:38:59.000 So it's hard for me to resist that.
00:39:02.000 What are people saying about the thing that I just said?
00:39:04.000 But the question is, will we adapt to it in some way so that we begin, as with junk food, we're still going to be consuming junk food, but maybe we'll keep a lid on it.
00:39:14.000 I don't know.
00:39:15.000 I don't know.
00:39:15.000 But what I can say with not confidence, but what I think is the case, is as long as our kids...
00:39:20.000 Are going through puberty on social media and video games, and they're not developing executive control, I do not think they will be able to keep a handle on this as adults.
00:39:30.000 I do not either.
00:39:31.000 Again, as you're saying, we are adults.
00:39:34.000 We grew up without the internet, and we grew up without all these problems, and it is hard.
00:39:37.000 I try to tell all my friends to use my strategy, which I call post and ghost.
00:39:42.000 I don't read anything.
00:39:44.000 I just post things.
00:39:45.000 I post things and I don't read comments.
00:39:47.000 That's really smart.
00:39:48.000 It's made me immensely more happy.
00:39:51.000 It's a massive difference.
00:39:53.000 I very rarely use Twitter, or X, whatever.
00:39:57.000 The only reason why I use it is to see information, to see things.
00:40:01.000 I don't read anything about myself, and I certainly don't.
00:40:03.000 I very rarely post at all.
00:40:05.000 And if I do post, I certainly don't read what...
00:40:08.000 Because first of all, I'm aware of this number, this FBI security specialist, the 80% of it, and I see it all the time.
00:40:15.000 There's so many times where I'll see any social issue, any political issue, anything that's in the zeitgeist.
00:40:23.000 When you see someone post about it, you'll see these people posting.
00:40:27.000 And I'll look at it.
00:40:28.000 It's like a couple of letters and a bunch of numbers.
00:40:31.000 And I'll go, okay, is that a real person?
00:40:32.000 And then I go to their page.
00:40:33.000 Nope, nope, not a real person.
00:40:35.000 How many of them are there?
00:40:35.000 Oh, I haven't done that.
00:40:36.000 There's a lot of them.
00:40:37.000 There's a lot of them.
00:40:38.000 Especially when it comes to things like Ukraine, Israel, Gaza.
00:40:42.000 Right, because those are areas where various actors are.
00:40:46.000 Parties and actors and countries are trying to manipulate us.
00:40:49.000 Yes.
00:40:50.000 And they're doing it.
00:40:51.000 They're doing a great job of it.
00:40:52.000 They're very focused.
00:40:53.000 It's really incredible.
00:40:54.000 It's incredible to see the impact that it has when you see 50 posts, 50 comments, and 35 of those seem to be not real people.
00:41:05.000 That's right.
00:41:05.000 I think your strategy is very wise and for this reason.
00:41:09.000 When social media began, you would put something up and then people could comment on it.
00:41:15.000 Okay, that goes until about 2013, 2014. I think it's 2013 when Facebook introduces threaded comments.
00:41:23.000 So now you put something up, someone says some horrific, nasty, you know, racist, whatever thing in your comment thread, and now everyone can reply to that comment.
00:41:33.000 And people can reply to that.
00:41:34.000 So you get basically everyone fighting with each other In the comment thread.
00:41:38.000 And what social media is good for is putting out information quickly.
00:41:42.000 I'm a professor.
00:41:44.000 I'm a researcher.
00:41:46.000 I am engaged in various academic disputes and debates.
00:41:49.000 And Twitter is amazing for finding current articles, for finding what people are talking about.
00:41:54.000 So the function of putting information out is great, but the function of putting something out and then watching everyone fight... Right.
00:42:10.000 My concern is that we are paddling upriver.
00:42:14.000 Yeah.
00:42:18.000 That's powering this whole thing that you cannot fight against and that we are moving in a direction as a society with the implementation of new, more sophisticated technology that's going to make it even more difficult unless you completely opt out.
00:42:36.000 And some people are going to opt out, but it's going to be like my 600-pound life.
00:42:41.000 People that are realizing, oh my god, what have I done?
00:42:45.000 I've ruined my body.
00:42:47.000 I've ruined my life.
00:42:48.000 How do I slowly get back to a normal state?
00:42:52.000 And it's going to take a tremendous amount of effort.
00:42:54.000 Think about the amount of effort.
00:42:57.000 The amount of focus that people have on comments and things, if you're addicted, if you're currently deep into it right now, where you're tweeting constantly.
00:43:07.000 There's people that I follow that I know they're tweeting 12 hours a day.
00:43:10.000 Yeah, that's right.
00:43:11.000 It's sad.
00:43:11.000 It's so sad.
00:43:12.000 Yeah, they're addicts.
00:43:14.000 My fear is that this is only going to get greater.
00:43:18.000 Yeah, I share that fear.
00:43:20.000 And if current trends continue, it's really not clear how we get out of this.
00:43:26.000 Something might break in a big way.
00:43:29.000 Humanity has faced many crises before.
00:43:32.000 That doesn't mean, as they say, past performance is no guarantee of future success.
00:43:37.000 So we've faced many crises and we've always come out stronger.
00:43:40.000 But we've never faced anything like this.
00:43:43.000 That's right.
00:43:44.000 This is a rewiring.
00:43:45.000 Exactly.
00:43:46.000 That's right.
00:43:47.000 So we face many external threats.
00:43:49.000 We face diseases.
00:43:50.000 We face wars.
00:43:51.000 Those have come and gone.
00:43:52.000 But this is a rewiring of the basic communication network of society in ways that link up with so many of our deepest motivations.
00:44:01.000 This is a challenge unlike any we've ever faced.
00:44:07.000 You know, what we really need, I'm speaking as a university professor, is we really need great social sciences.
00:44:12.000 We need great sociologists.
00:44:14.000 We need people really studying this.
00:44:18.000 But, you know, it's all happening so fast.
00:44:21.000 And then the problems in universities of, you know, political concerns sweeping in.
00:44:27.000 So I fear that we're sort of heading towards this.
00:44:29.000 Well, you said, like, going upstream to a waterfall.
00:44:31.000 I think it was, like, going, like, downstream.
00:44:33.000 We're about, you know, at the top of a waterfall, going to go over the edge.
00:44:36.000 Right.
00:44:37.000 That too, yeah.
00:44:38.000 Well, we're trying to paddle, but that's the direction that we're moving in.
00:44:41.000 Yeah, that's right.
00:44:42.000 That might be the case.
00:44:43.000 So, yeah, we live at a very interesting time in history when in the 90s the future looked so bright, and yeah, now it doesn't.
00:44:52.000 My fear is that we are no longer going to be human.
00:44:56.000 That we will no longer be human.
00:44:58.000 We'll be a different thing.
00:45:00.000 And I think the implementation of technology is what's going to facilitate that.
00:45:04.000 I think we're, you know, how many years away from Neuralink and something similar to it that's going to change how we interact completely.
00:45:14.000 And that it's not going to be a question of whether or not you opt out, whether you pick up your device.
00:45:20.000 Your device is going to be a part of you.
00:45:22.000 And there'll be incentives.
00:45:25.000 That whether it's performance incentives, whether it's you're going to have more bandwidth, whatever it is.
00:45:32.000 You have a competitive advantage.
00:45:33.000 Yeah, that's the real fear of something like Neuralink or whatever.
00:45:37.000 If they can figure out a Neuralink that doesn't require surgery, if they can figure out something that does that without surgery, the advantage of having that in a competitive sense, in terms of business and technology and industry.
00:45:50.000 It's going to be massive and it's going to be so difficult to get people to not do that, that it's going to be like phones.
00:45:58.000 I mean, I remember when I moved to Los Angeles in 1994, I bought a Motorola StarTAC and I was like, look at me.
00:46:06.000 I had a phone in 1989. Oh, wow.
00:46:09.000 One of the big ones that went to a satellite?
00:46:11.000 It was actually connected to my car in 1989. And it was very advantageous.
00:46:16.000 My friend Bill Blumenreich, who owns the Comedy Connection, he owns the Wilbur Theater now in Boston.
00:46:22.000 And I got a lot of gigs from him because he could call me when someone canceled.
00:46:28.000 Someone got sick and they said, hey, can you get to Dartmouth at 10 p.m.?
00:46:32.000 I'm like, fuck yeah.
00:46:33.000 And so I got gigs from that.
00:46:36.000 We joke about it to this day that I was like the first guy that he knew that had a cell phone.
00:46:39.000 It was a huge advantage.
00:46:42.000 And I remember when I had one in 94, I was like, this is great.
00:46:45.000 I can call my friends.
00:46:47.000 I don't even have to be home.
00:46:48.000 There were so many positives to it.
00:46:50.000 And it gave you an advantage.
00:46:53.000 It gave you an advantage.
00:46:54.000 You didn't have to be home.
00:46:55.000 If there was a business thing that I had to deal with, there was something going on with my career, I could deal with it on the phone at Starbucks or wherever I was.
00:47:04.000 My fear is that this is going to be that times a million.
00:47:07.000 It's going to be you have to have it in order to compete.
00:47:10.000 Just like you kind of have to have an email today.
00:47:13.000 You kind of have to have a cell phone today.
00:47:15.000 Yeah, that's right.
00:47:15.000 That's right.
00:47:16.000 Yes, we are certainly headed in that way.
00:47:19.000 And I think the word human is a very good word to put on the table here.
00:47:23.000 Some things seem human or inhuman.
00:47:27.000 And when you simply connect people, you know, Mark Zuckerberg sometimes says, how could it be wrong to give more people more voice?
00:47:33.000 If you're simply connecting people, making it easier for them to contact each other, you know, I think that's mostly going to have good effects.
00:47:40.000 And that happened with the telephone.
00:47:41.000 You know, we all got telephones and we could do all sorts of things.
00:47:44.000 We could coordinate with our friends.
00:47:46.000 Telephones are great.
00:47:48.000 But when it became not technology making it easier for this guy to reach you or me to communicate with you, but rather a way to put things out to try to gain prestige for me in front of thousands or maybe millions of people.
00:48:03.000 Now it changes all of our incentives.
00:48:05.000 It changes the game we're playing.
00:48:07.000 You know, what games are we playing as we go about our day?
00:48:10.000 And the more people are playing the game of I'm struggling to get influence in an influence economy where everyone else is on these seven platforms.
00:48:18.000 So I have to be too or they have an advantage over me.
00:48:21.000 That is the way that things have been rewired already.
00:48:24.000 Already we're there.
00:48:25.000 Now, you're raising the possibility that the next step is more hardware-based, that it's going into our bodies, and I think that is likely to happen.
00:48:34.000 And so I hope what we'll do now, and I hope my book, The Anxious Generation, will sort of promote a pause.
00:48:43.000 Let's think where we are.
00:48:44.000 Let's think what we've done.
00:48:45.000 Let's look at what has happened.
00:48:47.000 When our kids got on phones and social media, we thought, oh, this could be amazing.
00:48:52.000 Like, they can connect.
00:48:54.000 They can form communities.
00:48:55.000 It's going to be great.
00:48:56.000 And now it's clear, no, it's been horrible.
00:48:58.000 It's been really, really terrible.
00:49:00.000 As soon as they got on, their mental health suffered.
00:49:02.000 You know, they might feel like they have a community, but it's much worse than what it replaces.
00:49:06.000 So I think what we're seeing is the sort of the techno-optimists, the sort of the futurists who say, oh, it's going to be amazing.
00:49:12.000 You know, we'll have Neuralink.
00:49:13.000 We'll have all this technology.
00:49:14.000 We'll be able to do everything.
00:49:16.000 Like, here's where we have to heed, I think, the warnings of the ancients, of religious authorities, of those who warn us that we are leaving our humanity and we're stepping into an unknown zone where, so far, the initial verdict is horrible.
00:49:33.000 So, if we keep going without putting on some brakes, yeah, I think we're going to a horrible place.
00:49:40.000 Yeah, my fear is that it won't be horrible.
00:49:44.000 Oh, it'll feel good.
00:49:45.000 Yeah, that it'll be amazing.
00:49:47.000 So my fear, my genuine fear, is the rewiring of the mind in a way that can enhance dopamine, enhance serotonin, and do things that can genuinely make you feel better.
00:50:04.000 Yeah.
00:50:04.000 In the short run.
00:50:06.000 Yes.
00:50:07.000 And that we will decide that this is a better thing.
00:50:10.000 You know, just like, look, regardless of how you feel about SSRIs, most people think that they're being dispensed too readily.
00:50:19.000 And that too many people that get on antidepressants could have solved that issue with exercise and diet.
00:50:27.000 Because a big part of the reason why people are feeling shitty in the first place is their bodies failing.
00:50:32.000 Yeah, and having less sex.
00:50:33.000 I read recently that the SSRIs are suppressing sex drive in many people.
00:50:38.000 So there's that.
00:50:39.000 There's a lot of issues that come along with those, and yet there's an immense profit in making sure that people take those and stay on those.
00:50:47.000 My fear is that if you can do something that allows people to have their mind function, have their brain, their endocrine system, have all these things function at a higher level, then everyone is going to do it.
00:51:03.000 You would not want to just be natural and depressed if you could just put on this little headset and feel fantastic.
00:51:11.000 And maybe it could be a solution to so many of our societal issues.
00:51:16.000 Maybe bullying would cease to exist if everyone had an increase in dopamine.
00:51:21.000 It sounds silly, but if dopamine increased by...
00:51:24.000 Look, if you have an entire society that's essentially on a low dose of MDMA, you're not going to have nearly as much anger and frustration.
00:51:34.000 You also are not going to have as much blues.
00:51:37.000 You're not going to have as many sad songs that people love.
00:51:41.000 You're not going to have the kind of literature that people write when they feel like shit.
00:51:45.000 It's unfortunate, but also as a whole, as a society, it probably would be an overall net positive.
00:52:15.000 We didn't anticipate it.
00:52:17.000 It has negative consequences.
00:52:18.000 We thought about it in a positive way.
00:52:20.000 Oh, this is going to be great.
00:52:21.000 We're all going to be connected.
00:52:22.000 How could it be bad that people could have more voice, like Zuckerberg says?
00:52:27.000 Right, right.
00:52:29.000 My fear is that it's going to just change what it means to be a human being, and my genuine feeling is that this is inevitable, that as technology scales upward this is unavoidable.
00:52:42.000 Right now it certainly feels that way.
00:52:47.000 And while I'm not optimistic about the next 10 years, I share your vision of what's coming.
00:52:53.000 But I'm not resigned to it.
00:52:55.000 I go around saying, we need to do these four norms, and we can do them.
00:52:59.000 And people say, oh, that ship has sailed.
00:53:01.000 Like, you know, the train's left the station.
00:53:03.000 You know, but if a ship has sailed and we know that, you know, it's going to sink, we can actually call it back.
00:53:09.000 I've been on airplanes where it leaves the jetway, and then they call it back because they discover a safety issue.
00:53:16.000 So we are headed that way, I agree.
00:53:20.000 But I think we humans are an amazingly adaptable species.
00:53:28.000 I think we can figure this out, and there are definitely pathways to a future that's much better.
00:53:34.000 These technologies could, in theory, give us the best democracy ever, where people really do have the right kind of voice.
00:53:40.000 It's not just the extremes who are super empowered, as it is now.
00:53:45.000 So, you know, we're at a point in space and time, let's say, right now, and I can imagine a future that's really fantastic, but how do we get there?
00:53:55.000 And are we able to get there?
00:53:57.000 Is there a path?
00:53:58.000 Or is it like, you know, there's no path from A to B? So I don't know, but I think we sure as hell have to try.
00:54:04.000 And the first thing we have to do is not be resigned and just say, oh, well, the world's going to hell.
00:54:09.000 What are you going to do about it?
00:54:10.000 It's too big.
00:54:11.000 So let's start, I have a proposal, let's start with the one area that we can all agree on, which is our kids.
00:54:19.000 It's the most amazing thing.
00:54:20.000 In Congress, you know, on almost any issue, if the right likes it, the left won't, and vice versa.
00:54:25.000 Except for this one.
00:54:26.000 This is the only thing in Washington that's really bipartisan.
00:54:30.000 The senators and congressmen have kids, they see it.
00:54:33.000 So let's test the proposition that all is lost and we're helpless.
00:54:37.000 Let's test that proposition.
00:54:38.000 And let's test it in the place where we're most likely to succeed, which is rolling back the phone-based childhood and replacing it with a more play-based childhood.
00:54:49.000 Oh, so actually, I said there are four norms.
00:54:51.000 We talked about three.
00:54:51.000 So if you don't mind, I'll put in the fourth norm now.
00:54:53.000 Yeah.
00:54:54.000 So the first three are about phones.
00:54:56.000 No smartphone before high school, no social media till 16, phone-free schools.
00:55:00.000 Okay, but if you take away the phones and you don't give kids back each other and playtime and independence, what are they going to do?
00:55:09.000 You're going to keep them at home all day long without screens?
00:55:12.000 So the fourth norm is more independence, free play, and responsibility in the real world.
00:55:19.000 And this is a thing that you and I talked about last time.
00:55:21.000 I think we actually had a small disagreement.
00:55:25.000 I'm a big fan of Lenore Skenazy, the woman who wrote Free Range Kids.
00:55:28.000 She and I co-founded an organization called Let Grow.
00:55:31.000 Parents, please go to letgrow.org.
00:55:33.000 All kinds of ideas for how to help your kid have more independence, which makes them more mature, which makes them less fragile.
00:55:42.000 So this fourth norm, this is the harder one.
00:55:45.000 This is the one that we have to really overcome our fears of letting our kids out.
00:55:49.000 And so actually, let me ask you. I think our disagreement last time was, I talked about this, and I said we should let kids go to sleepovers and spend more time with other kids, unsupervised.
00:56:00.000 And then you said, I think you said, no, I'm not letting my kid go to sleepovers because I don't trust the other families.
00:56:05.000 Does that sound familiar to you?
00:56:06.000 I don't believe that's what I said.
00:56:07.000 I think our concern was with people wandering around with kids being free to walk home in cities.
00:56:14.000 Yes, you had that also.
00:56:15.000 We did talk about sleepovers.
00:56:17.000 My kids have sleepovers.
00:56:18.000 They've always had sleepovers.
00:56:19.000 If you know the parents and you trust the parents, it's a great way to give the kids independence and have them interact with other people.
00:56:25.000 So tell me, what was your policy with your kids, with all three?
00:56:30.000 When you let them out, like they could go out the door, get on a bicycle, walk seven blocks to a friend's house without any adult with them.
00:56:36.000 Do you remember what age or grade?
00:56:38.000 No, I don't.
00:56:39.000 I mean, it's fine if you live in a good neighborhood.
00:56:41.000 The problem is if you're, you know, childhood predators are real.
00:56:47.000 Not really.
00:56:47.000 Not anymore.
00:56:48.000 What I mean is- What do you mean?
00:56:49.000 Well, when you and I were growing up, there were childhood predators out there in the physical world approaching children.
00:56:57.000 And I think you told a story last time about one who approached you when you were doing magic tricks.
00:57:02.000 So there were child predators out there.
00:57:04.000 That's true.
00:57:05.000 They're all on Instagram now.
00:57:07.000 The kids aren't out, and Instagram especially makes it super easy for them to get in touch with children.
00:57:15.000 So this is my point.
00:57:18.000 I can summarize the whole book with a single sentence.
00:57:21.000 We have overprotected our kids in the real world and underprotected them online.
00:57:25.000 I would agree to that.
00:57:27.000 So, yes, child predators are terrible, but guess what?
00:57:30.000 We actually locked up most of them.
00:57:32.000 When you and I were growing up, they weren't all locked up.
00:57:34.000 They were just eccentrics who were exposing themselves.
00:57:36.000 Remember flashing, flashers?
00:57:38.000 That doesn't happen anymore, because if you do that now, you're going to jail for a long, long time.
00:57:43.000 So we actually locked up most of the predators, and they know, don't approach kids on a playground, approach them on social media.
00:57:49.000 I don't know if we are doing that, and there's this new push.
00:57:52.000 Oh, yeah.
00:57:52.000 Once you're identified as a sex offender, you are gone for a long time, and then there's a sex offender registry.
00:57:58.000 No, we've really done a lot since the 90s to make the real world safer.
00:58:03.000 But there is push against that.
00:58:06.000 You're aware of this term, minor attracted persons, that's being pushed?
00:58:10.000 Disgusting.
00:58:10.000 Disgusting and freaky.
00:58:12.000 It's such a bizarre term that I've got to imagine it's only being pushed by people who don't have children.
00:58:19.000 And they're pushing this thing that it's an identity and that it's not the fault of the person who has this issue.
00:58:28.000 What's the root of that?
00:58:29.000 Have you investigated that?
00:58:31.000 Yes.
00:58:31.000 Not that specific issue, but I can...
00:58:33.000 So look, I study moral psychology.
00:58:36.000 That's my academic discipline.
00:58:38.000 And I study the roots of it evolutionarily, historically, and child development.
00:58:43.000 What is our moral sense?
00:58:44.000 And there are different moralities, and in some ways that's good, and left and right push against each other.
00:58:51.000 So I'm very open to different moralities.
00:58:54.000 But when a group makes something sacred, and they say, this is the most important thing, and nothing else matters other than this...
00:59:03.000 Then they can kind of go insane and they kind of lose touch with reality.
00:59:08.000 And I think, you know, again, I don't know the history of this particular movement, that horrible term, but there is a certain kind of morality which is all about, you know, oppression and victimhood.
00:59:20.000 And once you, you know, someone, I guess, somewhere said, oh, you know, men who are attracted to boys or, you know, little girls are being, you know, are victims, I don't know what.
00:59:31.000 Somewhere, in some little eddy of weird morality, someone put that forward as a new victim class, because we've been trying to address victimhood all over the place.
00:59:41.000 Once someone puts that up as a new victim class, and to do that, you have to change the terms.
00:59:45.000 This is very Orwellian.
00:59:47.000 You change the terms, and then others who share this morality, which is focused on not making anyone feel marginalized, not allowing any labels that will slander someone or make them look bad, pick it up. Now, people who approach children for sexual goals,
01:00:05.000 I'm very happy to have them slandered and labeled and separated.
01:00:11.000 But I suspect that some people, once they lock this in as a group that's being marginalized, they say, well, we have to defend them.
01:00:19.000 And we don't think about what the hell we're actually saying.
01:00:23.000 It seems purely an academic thing.
01:00:25.000 It seems like something from people who only exist in sort of an academic space, where it's almost like an intellectual exercise in understanding oppression.
01:00:41.000 You can't apply it in the real world.
01:00:45.000 It's just too fucked up.
01:00:47.000 The consequences of it are horrific.
01:00:50.000 Normalizing victimizing children.
01:00:53.000 Before we go any further with this particular topic, I would want to point out one of the problems that our social media world has given us, which is: somewhere in all of the academy and all the universities, some philosopher,
01:01:09.000 let's say, proposed that term or raised an idea.
01:01:11.000 So this has been going on for thousands of years.
01:01:13.000 Someone in a conversation proposes a provocative idea.
01:01:16.000 What if we think about this as a minor attracted person?
01:01:20.000 They put that idea out, and then other people say, no, that's really stupid, and it doesn't catch on, because this is not an idea that's going to catch on, even in the academy.
01:01:29.000 But I think where we are now is, I'm guessing, someone proposed this, somebody else got wind of it, posted it online, and now you're going to have a whole media ecosystem going crazy about this terrible idea.
01:01:44.000 So maybe you can look up "minor-attracted person"?
01:01:48.000 Is this just like a thing that was from one academic talk?
01:01:50.000 Or is this an actual movement?
01:01:52.000 Well, I've seen politicians discuss it.
01:01:54.000 No way.
01:01:55.000 Wait, wait, wait.
01:01:56.000 It's like decriminalizing or de-stigmatizing?
01:01:59.000 De-stigmatizing.
01:02:00.000 Oh, God.
01:02:01.000 There was a recent politician that went viral for this discussion.
01:02:06.000 Oh, no.
01:02:06.000 All right.
01:02:06.000 Maybe I'm wrong.
01:02:07.000 More than one.
01:02:07.000 There were two specific women that were doing that.
01:02:10.000 And I didn't investigate whether these women had families or what it was.
01:02:15.000 But this push to...
01:02:20.000 ...to try to alleviate bullying or alleviate shame or alleviate the stigma that's attached to what they're calling an identity.
01:02:31.000 Yeah, that's right.
01:02:32.000 So that brings us to the issue of identitarianism, which I think is a useful term for us these days.
01:02:42.000 I think a lot of what's happened on campus is the move to focus on identity as the primary analytical lens in a number of disciplines, not in most disciplines, but in a lot of the humanities, the studies departments.
01:02:56.000 So putting identity first, and then ranking identities and saying some identities are good, some are bad. This really activates our ancient tribalism.
01:03:05.000 And I think that the liberal tradition, going back hundreds of years, is really an attempt to push back against that and to create an environment in which we can all get along.
01:03:15.000 And so, as I see it from inside the academy, we've always been interested in identity.
01:03:21.000 It's an important topic.
01:03:22.000 There's a lot of research on it going back many decades.
01:03:24.000 But something happened in 2015 on campus that really elevated identitarianism into the dominant paradigm, not dominant in that most people believed it, but dominant in the sense that if you go against it, you're going to be destroyed socially.
01:03:38.000 And that's what cancel culture is.
01:03:40.000 That's what Greg Lukianoff and Rikki Schlott's new book, The Canceling of the American Mind, is about.
01:03:45.000 So, yes, it's the people who are putting identity first, and that's sort of their religion and their morality.
01:03:52.000 I mean, they're welcome to live in the United States, but when they get influence in universities or in school boards, yeah, bad stuff will happen.
01:04:01.000 It's just bizarre the effect that it does have when people push back against identity politics.
01:04:08.000 It's a small, very vocal minority that pushes this agenda.
01:04:13.000 And it's not the majority of people.
01:04:16.000 The majority of people mostly disagree with these ideas.
01:04:19.000 Yeah, absolutely.
01:04:21.000 This is, again, a really important point about how our society has changed.
01:04:25.000 Those of us from the 20th century still think in terms of public opinion, like, do most people believe this, or do most people not believe it?
01:04:32.000 And most people are sane.
01:04:34.000 Most people are not at all crazy.
01:04:35.000 Most people are pretty reasonable.
01:04:38.000 And I think what's happened since social media became much more viral in 2009-2010 is that the extremes are now much more powerful and they're able to intimidate the moderates on their side.
01:04:49.000 So on the right, sort of the center-right, what I call true conservatives, like Burkean, Edmund Burke conservatives, you know, they get shot and they get excluded and there's not many of them in Congress anymore.
01:05:01.000 And on the left, you have the far left, the identitarian left, you know, shooting darts into, you know, people like me, into anybody who is, you know, anybody who questions.
01:05:09.000 So they shoot their moderates.
01:05:10.000 And what you have is even though most people are still moderate and reasonable, our public discourse is dominated by the far right, the far left, and all these crazy fringe, you know, I mean, it can be, you know, neo-Nazis on one side and then these, you know, identitarians defending minor attracted people on the other side.
01:05:26.000 So don't lose faith in humanity.
01:05:32.000 Recognize that we've moved into this weird, weird world because of social media in which it's hard to see reality and in which people are afraid to speak up.
01:05:42.000 And so we get warped ideas rising to dominance, even though very few people believe them.
01:05:48.000 And I think this is where bots come into play.
01:05:50.000 Yeah, they can really amplify it.
01:05:52.000 I really do believe that this is being amplified, whether it's by foreign governments or by special interest groups or by whoever it is is trying to push these specific narratives.
01:06:03.000 Absolutely.
01:06:04.000 And this can bring us right back to TikTok and the national security threat.
01:06:08.000 So Vladimir Putin was a KGB agent in the 20th century.
01:06:12.000 And the KGB going back, I think it was in the 50s, they had some sort of a meeting or something where they decided that they were going to take, I think it's called active measures.
01:06:21.000 They were going to try to mess up American democracy.
01:06:23.000 And they'd spray paint racial slurs.
01:06:26.000 They'd put swastikas on synagogues.
01:06:28.000 They saw that we're a multi-ethnic democracy.
01:06:30.000 We're making a lot of progress towards tolerance.
01:06:33.000 And the Russians, the Soviets, were trying to put a stop to that and make us hate each other.
01:06:37.000 So they were doing that back since the 1950s.
01:06:40.000 And it was expensive.
01:06:41.000 They had to fly people over or they had to try to win people over.
01:06:44.000 You couldn't scale the operation.
01:06:46.000 But that's the tradition that Vladimir Putin comes from.
01:06:49.000 Now, the Soviet Union falls in 1991. I think he's in Berlin.
01:06:55.000 I can't remember where he was, but he was very influenced by this and the humiliation of the Soviet Union.
01:07:01.000 And so he rises to power again in the 21st century.
01:07:05.000 Do you think he suddenly no longer wants to mess with American democracy?
01:07:09.000 Did he suddenly drop that desire?
01:07:12.000 We basically handed them the tools.
01:07:13.000 We said, okay, you can open as many Facebook accounts as you want, Twitter accounts.
01:07:19.000 Open as many as you want.
01:07:20.000 There's no identity authentication.
01:07:22.000 There's no age verification.
01:07:24.000 Create bots all you want and have them mess with us.
01:07:28.000 And Renee DiResta has a book coming out soon.
01:07:30.000 She really did amazing work to get to the bottom of this.
01:07:33.000 You know, they started running tests in 2013. They created accounts on all these platforms long before, but they started running tests.
01:07:40.000 Could they get Americans to believe that an explosion had occurred at a refinery plant in Louisiana?
01:07:47.000 Yes, they made it all up and people believed it.
01:07:50.000 Could they get Americans to believe some extreme BLM post that was completely outrageous?
01:07:56.000 Yes.
01:07:57.000 And same thing to enrage people on the left.
01:08:01.000 So we know that the Russians are messing with us.
01:08:04.000 We know that the Russians know our weak point.
01:08:07.000 And by Russians, again, I don't mean the Russian people.
01:08:09.000 I mean Vladimir Putin.
01:08:11.000 The government.
01:08:11.000 The government.
01:08:12.000 So we're handing them the tools and the instruction book.
01:08:16.000 For how to divide us, how to weaken us, how to make us lose our resolve and our will.
01:08:20.000 Have you seen Yuri Bezmenov give a speech about the ideological subversion?
01:08:27.000 And he did this in the 1980s.
01:08:29.000 I think it was 84. And he was talking about how the work is already done.
01:08:33.000 And that it's just a matter of these generations now going into the workforce with Marxist ideas and with all this ideological subversion that the Soviet Union has injected into the universities.
01:08:46.000 That's right.
01:08:46.000 That could be right.
01:08:47.000 I mean, it is chilling to watch and it is prophetic.
01:08:51.000 But, you know, they were playing a long game.
01:08:53.000 I mean, the communists planning the communist revolution, they were patient and they were playing the long game.
01:08:58.000 Yeah, as is China.
01:09:00.000 Yeah, that's right.
01:09:01.000 They're very smart.
01:09:01.000 That's right.
01:09:02.000 There's so much more – because they're dictatorships, they have complete control over what they choose to do.
01:09:11.000 They don't have to meet with subcommittees.
01:09:13.000 They don't have to have congressional hearings.
01:09:16.000 They just can just do it.
01:09:18.000 Oh, okay.
01:09:19.000 That's a good point because that – That brings us to the big difference between democracies and autocracies.
01:09:26.000 Back in the 1930s, when the West was in economic collapse, and it was the Soviet Union and then the Italian fascists and then Hitler, the German fascists, they were making rapid economic progress.
01:09:41.000 And the criticism of democracy has always been, it's chaotic.
01:09:46.000 There's no good leadership.
01:09:47.000 They can't plan ahead.
01:09:49.000 And that's all true.
01:09:50.000 But why did we triumph in the 20th century over all these other models?
01:09:54.000 Because democracy gives us a degree of dynamism.
01:09:57.000 Where we can do things in a distributed way.
01:10:00.000 We have people just figuring stuff out.
01:10:02.000 We have an incredibly creative economy and business sector.
01:10:06.000 And so democracies have this incredible ability to be generative, creative, regenerative.
01:10:13.000 Unless you mess with their basic operating system and say, let's take this environment in which people talk to each other, share ideas, take each other's ideas, compete, try to build a better company.
01:10:25.000 Let's take that and let's change the way people talk so that it's not about sharing information.
01:10:31.000 It's about making them spend all day long, nine hours a day, competing for prestige on social media platforms and in a way that empowers everyone to complain all the time.
01:10:42.000 This, I think, really saps the dynamism.
01:10:45.000 I think this social media, what I'm suggesting, I haven't thought this through, but I'm suggesting is that whatever the magic ingredient that made democracy so triumphant in the 20th century, Western liberal democracy, American style democracy, whatever made it so triumphant is being sapped and reduced by the rapid rewiring of our society onto social media.
01:11:05.000 Yeah, I would agree with that.
01:11:06.000 And I think it's also being influenced, again, by these foreign governments that have a vested interest in us being at each other's throats.
01:11:13.000 Why wouldn't they?
01:11:13.000 It's so cheap.
01:11:14.000 It's so cheap, it's so effective, and it seems to be the predominant way that people interact with each other.
01:11:22.000 That's right.
01:11:23.000 When you say that you've been attacked, what have you specifically been attacked about?
01:11:27.000 Oh, it's just in the academic world, if you say anything about any DEI-related policy, you'll be called racist or sexist or homophobic or something.
01:11:43.000 I was always on the left.
01:11:45.000 I was always a Democrat.
01:11:46.000 Now I'm nothing.
01:11:47.000 I'm an extremely alarmed, patriotic American citizen who sees my country going to hell.
01:11:53.000 I'm in that camp.
01:11:55.000 A lot of us are.
01:11:56.000 A lot of us are politically homeless now.
01:12:00.000 But I sort of started my career in political psychology.
01:12:04.000 So my original work was on how morality varies across cultures.
01:12:08.000 I did my dissertation research in Brazil, and then I did some work in India.
01:12:12.000 And it was only in the 90s, as our culture wars heated up, that I began to see that left and right were like different countries.
01:12:19.000 We had different economics textbooks, a different American history, a different US Constitution.
01:12:26.000 It was like different worlds.
01:12:28.000 And I began actually trying to help the left stop losing elections like in 2000, 2004. As a Democrat, I thought I could use my research in moral psychology to help the Democrats understand American morality, which they were not understanding.
01:12:44.000 Al Gore and John Kerry, I thought, did a very bad job.
01:12:47.000 So I've all along been sort of critical of the left, originally from within the left.
01:12:53.000 And that's a pretty good way to get a bunch of darts shot at you.
01:12:55.000 Nothing terrible ever really happened to me.
01:12:57.000 I don't want to, you know, lots of people have been truly canceled, you know, shamed, lost their jobs, considered suicide.
01:13:03.000 So nothing like that has ever happened to me.
01:13:06.000 But, you know, when there's some minor thing online, you know, people take a line out of one of your talks.
01:13:12.000 They put it up online with a commentary about what an awful person you are.
01:13:15.000 Thousands of people comment on it or like it or retweet it.
01:13:18.000 It hurts.
01:13:19.000 It's frightening in a way like nothing else I've ever known.
01:13:22.000 And how many of those people are even real people?
01:13:24.000 Yeah, that's right.
01:13:25.000 This is the real question.
01:13:26.000 That's right.
01:13:26.000 Because it really is in dispute.
01:13:28.000 It was one of the major disputes when Elon bought Twitter.
01:13:31.000 I mean, one of the things that's come out of Elon buying Twitter, and thank God he did, as much as people want to talk about the negative aspects, which are real. I've seen racism and hate go up on Twitter.
01:13:43.000 I've seen it being openly discussed, which is very disturbing.
01:13:47.000 But what we did find out is that the government was involved in this, that the federal government was interfering with people's ability to use these platforms for speech.
01:13:59.000 Over COVID. You mean because of COVID? Yes, that's right.
01:14:00.000 Yes.
01:14:01.000 But I feel like that's just a test run.
01:14:04.000 Being able to implement that for that.
01:14:07.000 Then you can implement it for so many different things.
01:14:09.000 Dissent about foreign policy issues, dissent about social issues.
01:14:13.000 There's so many different ways they can do it if they can somehow or another frame it in a way that this is better for the overall good of America.
01:14:22.000 That's right.
01:14:23.000 So that's why I never talk about content moderation.
01:14:25.000 I'm not interested in it.
01:14:27.000 There has to be some, but most people focus on the content and they think if we can clean up the content or change the content or, you know, in those Senate hearings we saw a couple months ago, you know, just, you know, if we can reduce the amount of, you know, suicide promoting or self-harm promoting content that our kids are seeing,
01:14:43.000 then all will be well.
01:14:45.000 Like, no, it's not primarily about the content.
01:14:47.000 I agree with you that the government was influencing these platforms to suppress views that they thought were wrong and some of which turned out to be right.
01:15:05.000 I'm a big fan of my friend Greg Lukianoff, who runs the Foundation for Individual Rights and Expression.
01:15:09.000 So I think we shouldn't be thinking about social media like, well, how do we keep the wrong stuff off and only have it have the right stuff?
01:15:17.000 I think almost only about architecture.
01:15:20.000 How is this platform designed?
01:15:22.000 And can we improve it in ways that are content neutral?
01:15:25.000 Can we improve it in ways that aren't going to advantage the left or the right, but are going to make it more truth-seeking?
01:15:30.000 And so Frances Haugen, the Facebook whistleblower, when she came out, she had all kinds of ideas about settings, things that Facebook could have done to reduce the incredible power of the extremes.
01:15:42.000 The farthest right, 3%, the farthest left, 3%, and then a bunch of just random weirdos who just post a lot.
01:15:49.000 They have extraordinary influence.
01:15:51.000 And that's not about a left-right thing.
01:15:53.000 That's about, do we want an information ecosystem that super-duper empowers the extremes and silences the middle 80%?
01:15:59.000 Hell no!
01:16:00.000 So that's the kind of regulation that I favor, focusing on making these platforms less explosive and more useful.
01:16:09.000 And there's also this discussion that comes up a lot about algorithms.
01:16:14.000 Algorithms have essentially changed the entire game because it's not just what's online.
01:16:19.000 It's what do you interact with more frequently.
01:16:23.000 And that's accentuated.
01:16:24.000 And the problem with that is most people interact with things that rile them up.
01:16:29.000 And so you're developing these platforms that are immensely profitable that ramp up dissent and ramp up anger and ramp up arguments.
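To make that mechanism concrete, here is a minimal sketch of engagement-weighted feed ranking. Everything in it, the weights, the field names, the example posts, is invented for illustration; it is not any platform's actual code. The point is only that a ranker scored purely on interaction will surface whatever provokes the most reaction.

```python
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    likes: int
    shares: int
    angry_replies: int

def engagement_score(post: Post) -> int:
    # Hypothetical weights: shares and heated replies count for more than
    # likes because they generate follow-on activity.
    return post.likes + 3 * post.shares + 5 * post.angry_replies

def rank_feed(posts: list[Post]) -> list[Post]:
    # Nothing here asks whether a post is true or useful -- only whether
    # it provokes interaction.
    return sorted(posts, key=engagement_score, reverse=True)

feed = rank_feed([
    Post("calm policy explainer", likes=120, shares=4, angry_replies=1),
    Post("outrage bait", likes=40, shares=60, angry_replies=90),
])
print([p.text for p in feed])  # ['outrage bait', 'calm policy explainer']
```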
01:16:40.000 And like in the case of yourself, Instead of just debating you on these issues and doing it in a good faith manner, Jonathan Haidt believes this.
01:16:53.000 This is why I disagree.
01:16:55.000 I think of this or that.
01:16:57.000 Instead, they'll label you as whatever.
01:17:00.000 That's right.
01:17:00.000 Racist, sexist, homophobic, Islamophobic, xenophobic, whatever they can say, whatever pejoratives they can throw at you that...
01:17:09.000 Essentially this reductionist view of your perspective that makes it incredibly negative.
01:17:13.000 That's right.
01:17:14.000 And then you'll get bots that interact with that, that push that.
01:17:18.000 That's right.
01:17:19.000 So Twitter only went to algorithms, I think, in 2017. So before then, you know, it was just people who tweet a lot. People talk a lot about algorithms as though they're the cause of the whole problem.
01:17:36.000 And they're not the cause of the problem, but man, are they amplifiers.
01:17:39.000 And I think that's what you're saying.
01:17:40.000 They're just super-duper amplifiers on whatever craziness would be there even without them.
01:17:46.000 And so that certainly is shaping what we receive, what our children receive.
01:17:52.000 And so this is some of the stuff that I think, again, we have to really protect our children from.
01:17:57.000 To have a company able to micro-target their exact desires, even when they don't know what their desires are, is a degree of control and influence over children in particular that I think they should just be protected from.
01:18:14.000 Do you think that if you looked at algorithms, do you think that it's an overall net negative?
01:18:20.000 And could the argument be made that algorithms should be banned?
01:18:25.000 Yeah, no, I don't think so.
01:18:26.000 I mean, algorithms are there for a reason.
01:18:28.000 You know, we all know on Amazon...
01:18:30.000 You know, if you look up a book, it's going to suggest some other books you might be interested in.
01:18:33.000 And it's pretty darn good.
01:18:34.000 Like, yeah, you're right.
01:18:35.000 I would be interested in that.
01:18:36.000 So no, I would never say, oh, we can't have algorithms.
01:18:39.000 I mean, that would just be a Luddite sort of move to make.
01:18:43.000 You know, I think, again, as a social psychologist who studies morality, I just see everything going up in flames.
01:18:50.000 So here's a metaphor that I sometimes use.
01:18:53.000 Suppose you're the California Department of Parks, and you have 100 years of experience fighting forest fires.
01:18:59.000 You know everything about the wind, the humidity, you know, what season.
01:19:03.000 You've got it down to a science, and you're doing the best you can to keep forest fires under control.
01:19:07.000 And then one day, God decides to just mess with the world and changes the atmosphere from 20% oxygen to 80% oxygen.
01:19:16.000 And if we suddenly were in a world where 80% of the atmosphere was oxygen, everything would go up in flames.
01:19:21.000 Every electronic device would be burning right now.
01:19:23.000 So that's kind of what happened after 2009, 2010. That's kind of what happened once we switched over to be about...
01:19:31.000 So I would say the retweet button.
01:19:34.000 That move to virality, that I think is even more guilty of causing the problems even than algorithms.
01:19:40.000 I don't know that it's necessarily one versus the other, but that's the way I see it, that we're in a world where the technology is so quick to ramp up whatever will most engage us, and that's mostly emotions such as anger.
01:19:53.000 So yeah, that's why it feels like everything's burning.
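A toy model can show what the retweet button changed: once resharing compounds hop after hop, a small difference in how inflammatory a post is produces an enormous difference in reach. The numbers below are assumptions made up for illustration, not measurements of any real platform.

```python
def cascade_reach(followers_per_user: int, reshare_prob: float, hops: int) -> int:
    # Expected audience after `hops` rounds of resharing: each wave of
    # viewers reshares with probability `reshare_prob`, and each reshare
    # reaches another `followers_per_user` people.
    reach, wave = 0, followers_per_user
    for _ in range(hops):
        reach += wave
        wave = int(wave * reshare_prob * followers_per_user)
    return reach

# Hypothetical numbers: 200 followers each, six hops of resharing.
print(cascade_reach(200, reshare_prob=0.002, hops=6))  # mild post: ~329 people
print(cascade_reach(200, reshare_prob=0.010, hops=6))  # enraging post: ~12,600 people
```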
01:19:58.000 And this doesn't seem like it's slowing down.
01:20:02.000 It seems like it's ramping up and it seems like they've gotten more efficient at the use of algorithms and all these different methods like retweeting and reposting and different things that sort of accentuate what people are upset about and what people get riled up about.
01:20:18.000 Yes, I think it is accelerating, and for two reasons.
01:20:21.000 One is that it's just the nature of exponential growth.
01:20:24.000 It's the nature of progress.
01:20:26.000 I think in the 19th century, a guy named Adams gave us the Adams curve.
01:20:30.000 He was noticing, like, wow, the amount of work we're able to do now that we're harnessing steam and coal keeps growing and growing and growing.
01:20:37.000 And at some point, it's going to be going up so fast that it'll go up an infinite amount every day or something.
01:20:42.000 You reach an asymptote.
01:20:43.000 You reach a point at which it's insane.
01:20:47.000 And yeah, so many people think that we're now at the singularity.
01:20:51.000 We're at the point at which things are changing so fast that we just can't even understand them.
01:20:56.000 And we haven't yet mentioned the word AI. Now you bring in AI, and of course, you know, AI could unlock extraordinary material progress.
01:21:05.000 And Marc Andreessen has been arguing that.
01:21:07.000 But as a social scientist, I fear it's going to give us material progress and sociological chaos.
01:21:14.000 It's going to be used in ways that make our already unstable social structures and systems even less stable.
01:21:22.000 Well, what's very bizarre that we're seeing with the initial implementation of it, specifically with Google's version of it, is that it's ideologically captured.
01:21:30.000 That was so horrible.
01:21:32.000 And that was so irresponsible of Google to do.
01:21:34.000 So, no, I'm glad we have a chance to talk about this because I'm really horrified by what Google did in introducing Gemini.
01:21:39.000 And just to give a little background here, so I'm sure many of your listeners know, Google Gemini was programmed to answer in ways that basically, you know, the most extreme DEI officer would demand that people speak.
01:21:53.000 And so, you know, if you ask for a picture of the Founding Fathers, they're multiracial or all black.
01:21:58.000 Or Nazi soldiers.
01:22:00.000 Yeah.
01:22:01.000 Even Nazis had to be multiracial or black.
01:22:04.000 So there's two things to say about this.
01:22:06.000 The first is that Google must be an unbelievably stupid company.
01:22:11.000 Like, did nobody test this before they released it to the public?
01:22:14.000 And obviously, Google is not a stupid company, which leads me to my next conclusion, which is: if Google did such a stupid, stupid thing, so disgraced a product that it's banking so much on, and a lot depends on the success of Gemini, and alienated half the country right away.
01:22:31.000 On the first day, practically, they alienated them.
01:22:33.000 They couldn't be that stupid.
01:22:36.000 I think what's happening to them is what happened to us in universities, which is what I've called structural stupidity.
01:22:41.000 So you have very smart people.
01:22:43.000 But if anyone...
01:22:46.000 questions a DEI-related policy on campus, they would get attacked.
01:22:51.000 And that's what most of the early blow-ups were.
01:22:53.000 I think you probably had Brett Weinstein on here.
01:22:56.000 That's what happened to Erika Christakis and Nicholas Christakis at Yale.
01:23:01.000 If people wrote these thoughtful, caring memos about opposing a policy, There would be a conflagration, they'd be attacked, and they would sometimes lose their jobs.
01:23:12.000 So that's what happened to us in universities in 2015 to usher in our now nine years of insanity, which I think might be ending.
01:23:19.000 I think last fall was so humiliating for higher ed that I think we might be at a turning point.
01:23:25.000 But my point is for Google.
01:23:26.000 I suspect that Google was suffering from an extreme case of structural stupidity because surely a lot of those engineers could see that this is terrible.
01:23:35.000 This is a massive violation of the truth and part of Google's brand is truth and trust.
01:23:41.000 So I suspect they were just afraid to say anything.
01:23:44.000 And that's why Google made this colossal blunder of introducing woke AI at a time when we desperately need to trust our institutions that are related to knowledge.
01:23:56.000 And Google was trusted, and now they've lost a lot of it.
01:23:59.000 And it's not just Google.
01:24:00.000 It's ChatGPT.
01:24:01.000 But ChatGPT is not as explicit.
01:24:03.000 It's not as explicit, but it does do certain things.
01:24:06.000 Like if you ask it to say something positive about Donald Trump, it refuses.
01:24:10.000 You ask it to say something positive about Joe Biden, it'll gaslight you.
01:24:14.000 No, that's right.
01:24:14.000 And there was recently, was it David Rozado or who was it, who put out some listing of how far left each of the different AI products are.
01:24:24.000 So you can certainly say that ChatGPT is not politically neutral, but you wouldn't say from that that the people at ChatGPT or OpenAI are stupid.
01:24:34.000 You would not look at this product and say, how could they be so dumb as to have it be left-leaning?
01:24:40.000 But with Google, you have to say, how could they be so dumb as to produce black Nazis for us?
01:24:46.000 Right.
01:24:47.000 I just don't think they played it all out.
01:24:50.000 That's right.
01:24:56.000 With DEI and with the universities and the education system, it just seemed like you had to apply that to artificial intelligence because you're essentially, you're giving artificial intelligence these protocols.
01:25:11.000 You're giving it these parameters in which it can address things.
01:25:16.000 And if you're doing it through that lens, this is the inevitable result of that.
01:25:20.000 You're going to get black Nazis.
01:25:22.000 Oh, no, I don't know about the black Nazi.
01:25:24.000 I don't think it goes that extreme.
01:25:26.000 So to the extent that...
01:25:27.000 But if you say DEI, if you apply that to everything across the board and don't make exceptions in terms of historical accuracy, the founding fathers of America being all black...
01:25:38.000 Yeah.
01:25:40.000 Again, I'm not an expert in AI, but large language models are basically just consuming everything written and then spitting stuff back out.
01:25:48.000 And so it might just be a reflection of most of the stuff that's written.
01:25:52.000 The people on the left are dominant in universities.
01:25:55.000 They probably publish more books, whatever.
01:25:57.000 Right, but there's nothing written about black Nazis.
01:26:00.000 That's right.
01:26:00.000 That's right.
01:26:02.000 So my point is that I could see AI seeming to lean left, even if it wasn't programmed to lean left.
01:26:08.000 That might just be the data input that it takes.
01:26:11.000 But to get black Nazis, somebody had to program in those commands.
01:26:14.000 Somebody had to consciously say, you know, anything about representation is going to... everything's going to look like a Benetton ad.
01:26:20.000 No, it's not even like a Benetton ad.
01:26:22.000 Benetton ads had much more diversity in the 1980s and 90s.
01:26:25.000 So no, I would agree that in the Gemini case, clearly someone deliberately programmed in all kinds of rules that, yeah, seem to come from a DEI manual, just without much thinking.
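A hypothetical reconstruction of the failure mode being described here: a rule layer that rewrites image prompts before they reach the generator, with no carve-out for historically specific subjects. Every name and rule in this sketch is invented for illustration; it is not Google's actual code.

```python
HISTORICAL_SUBJECTS = ("founding fathers", "nazi soldiers", "medieval kings")

def rewrite_prompt(prompt: str) -> str:
    # A sane rule layer would gate the rewrite on a check like this one.
    is_historical = any(s in prompt.lower() for s in HISTORICAL_SUBJECTS)

    # The blunder: the diversity directive is appended unconditionally,
    # ignoring is_historical entirely -- which is how you get the
    # historically impossible images described above.
    return prompt + ", depicted as a diverse group of ethnicities and genders"

print(rewrite_prompt("an oil painting of the Founding Fathers"))
```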
01:26:35.000 Yeah, how do they come back from that?
01:26:38.000 I don't know.
01:26:38.000 That's a good question.
01:26:39.000 I don't know how deep the rot runs.
01:26:40.000 I don't know how bad things are.
01:26:42.000 You know, Google used to have an amazing corporate culture.
01:26:46.000 Oh, boy.
01:26:46.000 Look at this.
01:26:47.000 Apple is in talks to let Google Gemini power iPhone AI features.
01:26:51.000 Oh, my God.
01:26:52.000 Go back.
01:26:53.000 Oh, sorry.
01:26:53.000 I mean, I was adding that, too.
01:26:54.000 Yeah, go back.
01:26:57.000 Companies considering AI deal that would build on search pact.
01:27:01.000 Apple also recently held discussions with OpenAI about deal.
01:27:06.000 On this news, then a big investment happened too.
01:27:09.000 Magnificent Seven adds $350 billion on Gemini's reported iPhone deal.
01:27:13.000 So, because Google has implemented AI into their phones, specifically Samsung's new Galaxy S24 Ultra, it has a bunch of pretty fantastic AI features, one of them being real-time translation, your ability to summarize web pages instantaneously,
01:27:34.000 summarizing notes, bullet points, very helpful features.
01:27:39.000 Another one is your ability to circle any image, and it automatically will search that image for you.
01:27:46.000 Like, what is that?
01:27:47.000 Circle it, boom.
01:27:48.000 The Samsung phone will immediately give you a result and tell you what it is.
01:27:51.000 So very, very helpful.
01:27:53.000 But now there becomes, this is something that Apple has to compete with.
01:27:58.000 So Apple's decided to try to implement AI, but it has to outsource.
01:28:04.000 Yeah.
01:28:05.000 Yeah.
01:28:05.000 No, it is alarming.
01:28:08.000 I guess the point that I'd like to add on, which I hope will be useful for people, is part of what we're seeing across our institutions is a loss of professional responsibility, a loss of people doing their jobs.
01:28:22.000 And I don't mean base-level employees.
01:28:25.000 I mean leadership.
01:28:27.000 Institutions have important roles to play.
01:28:29.000 Companies have missions.
01:28:30.000 Universities must be completely committed to the truth, research, discovery.
01:28:36.000 Journalists must be committed also to the truth and methods to find the truth.
01:28:40.000 And what we've seen in the 2010s especially is many of these institutions being led away from their mission, their purpose, towards the political agenda of one side or another.
01:28:56.000 And so I think this is what we're seeing.
01:28:58.000 And if we're going to make it through this difficult period, we need some way to find the truth.
01:29:04.000 And the more we've gone into the Internet age, the harder it is to find the truth.
01:29:08.000 Like, we can just look things up, you know, it's incredible.
01:29:11.000 Like, we just say, you know, hey, look this up, and we got it.
01:29:14.000 But on anything contested, it's just very hard to find the truth.
01:29:18.000 And so that's why I'm especially disappointed in Google.
01:29:20.000 I always loved Google.
01:29:21.000 I thought it was an incredible company.
01:29:23.000 And for them to so explicitly say, you know, our mission is political.
01:29:29.000 It's not to help you find the truth.
01:29:30.000 That I thought was so disappointing.
01:29:32.000 Yeah, it is disturbing when a large company decides their mission is political.
01:29:37.000 Like, to which side?
01:29:39.000 To who?
01:29:40.000 Is it the truth?
01:29:42.000 Is that your main politics?
01:29:44.000 Or is it you decide that one side is good overall, net positive, the other side is net negative, and whatever you can do to subvert that other side is valuable?
01:29:55.000 That's right.
01:29:56.000 And so that's a mindset in which the ends justify the means.
01:29:59.000 And so part of the genius of American liberal democracy was to calm down those tribal sentiments to the point where we could live together, we could celebrate diversity in its real forms, we could get the benefits of diversity.
01:30:12.000 And that was all possible when we didn't feel that the other side was an existential risk to the country, that if the other side gets in, it's going to be the end.
01:30:22.000 And that's a very powerful image.
01:30:23.000 And that's an image that helped Donald Trump win.
01:30:25.000 There was an essay, what's it, by Michael Anton, I think, called The Flight 93 Election.
01:30:30.000 You know, if you're on Flight 93 being hijacked to crash into Congress and, you know, if you do nothing, you're going to crash into Congress, you'll do anything.
01:30:39.000 And so he framed it as a sort of a Hail Mary pass that, you know, patriotic Americans were supposed to vote for Donald Trump.
01:30:48.000 That mindset of the ends justify the means, the situation is so dire that even violence, even violence is justified.
01:30:56.000 That is really frightening.
01:30:58.000 And that's my concern, is that we could be headed that way.
01:31:01.000 We have not had much political violence.
01:31:03.000 There's been an uptick, but very little compared to, say, 1968 to 73. That period was much more violent.
01:31:10.000 So I'm hopeful we'll avoid that.
01:31:12.000 But once you say the ends justify the means, and we can cheat, we can lie, we can subvert the company's purpose because the end we're fighting for is so noble, well, the other side's going to do the same thing.
01:31:22.000 And before you know it, your culture war becomes a real war.
01:31:26.000 Yeah, and you're seeing that in the news, how it's implemented in the news.
01:31:30.000 I mean, I'm sure you're aware of this recent Donald Trump speech where he talked about a bloodbath.
01:31:35.000 Oh, God, yeah.
01:31:36.000 What the actual phrase was.
01:31:40.000 See if you can find that, Jamie, because it's actually important to highlight how...
01:31:40.000 ...not just inaccurate but deceptive the media was in their depiction of what he said, taking this quote out of context and trying to say that there's going to be a civil war if he doesn't get elected,
01:32:02.000 which is not what he was talking about at all.
01:32:05.000 See, pull it up.
01:32:07.000 Because it's so disturbing that they would – first of all, they would think that they could get away with it in this day and age with all the scrutiny and all the – with social media and all the independent journalists that exist now, which is one of the more interesting things about the demise of corporate media,
01:32:24.000 the demise in trust.
01:32:25.000 Trust in corporate media is at an all-time low and so this has led to a rise in true independent journalists.
01:32:33.000 The real ones out there, the Matt Taibbi's, the Glenn Greenwall's, the people that are actually just trying to say, what is really going on and what are the influences behind these things and why are these things happening?
01:32:44.000 But this one was bizarre.
01:32:46.000 When I saw it, then I saw the actual speech.
01:32:49.000 Let's play the actual speech.
01:32:50.000 Yeah, I have the actual speech.
01:32:51.000 The headlines are different, but I'll just play this.
01:32:53.000 Let's play the actual speech.
01:32:54.000 ...to China.
01:32:55.000 If you're listening, President Xi, and you and I are friends, but he understands the way I deal.
01:33:00.000 Those big monster car manufacturing plants that you're building in Mexico right now, And you think you're going to get that, you're going to not hire Americans, and you're going to sell the cars to us?
01:33:12.000 No.
01:33:13.000 We're going to put a 100% tariff on every single car that comes across the line, and you're not going to be able to sell those cars.
01:33:20.000 If I get elected. Now, if I don't get elected, it's gonna be a bloodbath for the whole...
01:33:26.000 That's gonna be the least of it.
01:33:27.000 It's gonna be a bloodbath for the country.
01:33:29.000 That'll be the least of it.
01:33:30.000 If this election, if this election isn't won, I'm not sure that you'll ever have another election in this country.
01:33:36.000 Does that make sense?
01:33:38.000 I don't think you're going to have another election in this country if we don't win this election.
01:33:42.000 I don't think you're going to have another election, or certainly not an election that's meaningful.
01:33:46.000 And we better get out or we better...
01:33:48.000 I actually say that the date, remember this, November 5th, I believe it's going to be the most important date in the history of our country.
01:34:01.000 I believe that.
01:34:02.000 So that's what he said.
01:34:04.000 Well, that sounds pretty bad.
01:34:05.000 That sounds like the Flight 93 election argument, that if I don't win, the country's over.
01:34:10.000 Yeah, but what he's talking about is this subversion of our economy and the subversion of our democracy, that we'll never have an election again.
01:34:20.000 I don't think he's saying that it'll be a bloodbath in terms of a civil war.
01:34:23.000 He's saying the economy's going to be destroyed.
01:34:25.000 There was no...
01:34:26.000 I was listening for that.
01:34:27.000 I was thinking maybe he meant it as a metaphor.
01:34:29.000 It's a long speech.
01:34:29.000 I didn't hear any...
01:34:30.000 I mean, the bloodbath is...
01:34:32.000 It's an unfortunate term, but he's not...
01:34:35.000 I don't think he's saying it's a civil war.
01:34:38.000 It sounded to me like he was.
01:34:40.000 It sounded to me like, you know, if he doesn't win, there will be violence.
01:34:44.000 Right.
01:34:44.000 You have to really give him a hell of a lot of benefit of the doubt.
01:34:47.000 But he's talking about the economy.
01:34:47.000 He was talking about...
01:34:48.000 No, it wasn't...
01:34:49.000 He was talking about China building plants.
01:34:51.000 He was talking about all these things and saying that if he doesn't get elected...
01:34:55.000 It's gonna be a bloodbath.
01:34:56.000 It's gonna be a mess.
01:34:57.000 I don't think he's specific.
01:34:58.000 I mean, I think he would elaborate on that if he was saying there'll be violence.
01:35:02.000 I don't think that's what he's saying.
01:35:03.000 I think he's saying destruction of our economy, the destruction of our...
01:35:08.000 You know, he makes a lot of asides.
01:35:10.000 So he was talking about the economy.
01:35:12.000 That's true.
01:35:12.000 And then he said, if I'm not elected, and then he makes an aside about what would happen to the country if he...
01:35:18.000 So look, we might disagree on this.
01:35:20.000 We might disagree.
01:35:21.000 We surely disagree on our priors.
01:35:23.000 It's surely the wrong way to say it.
01:35:24.000 Surely.
01:35:25.000 We both agree on that.
01:35:26.000 It's an unfortunate term to use.
01:35:29.000 Yes, that's right.
01:35:29.000 But it doesn't sound to me as though the media took that one out of context.
01:35:33.000 I just rewatched the longer video on closed captioning.
01:35:36.000 The video we watched cuts it off right after he says bloodbath.
01:35:41.000 ...don't get elected.
01:35:42.000 It's going to be a bloodbath for the whole...
01:35:44.000 That's going to be the least of it.
01:35:46.000 It's going to be a bloodbath for the country.
01:35:47.000 That'll be the least of it.
01:35:49.000 But they're not going to sell those cars.
01:35:51.000 They're building massive factories.
01:35:52.000 A friend of mine, all he does is build car manufacturing plants.
01:35:56.000 Okay, so he's back on the economy.
01:35:58.000 But the aside was not about the economy.
01:36:02.000 The aside was him making one of these typical asides about how important he is.
01:36:09.000 Joe, I think we're not going to settle this.
01:36:11.000 Look, I do agree that the media as a progressive left-leaning institution like universities has violated its duty many times to the truth and thereby lost the trust of much of the country.
01:36:26.000 Most of the people who work in these industries, I think, are wonderful and are trying to do a good job.
01:36:31.000 But the net effect, and this is my point about structural stupidity, During our culture war, institutions that have had very little viewpoint diversity have been subject to hijacking by those with a political agenda.
01:36:45.000 So I agree with you about that, although I disagree with you about what that comment from Donald Trump meant.
01:36:52.000 It sounded to me like it was not taken out of context.
01:36:55.000 Well, he was talking about the economy, though, specifically.
01:36:58.000 I know, but in the aside, he wasn't.
01:37:00.000 In the aside, he elaborates about the economy.
01:37:05.000 No, he just makes an aside about the bloodbath.
01:37:07.000 But that's the least of our problems.
01:37:08.000 Now back to what I was saying about the economy.
01:37:11.000 All right, look, we're not going to settle this one.
01:37:12.000 It's a terrible term.
01:37:14.000 It's a very unfortunate term.
01:37:16.000 If he said it would be a disaster instead of a bloodbath, that would have been the better term to use.
01:37:21.000 Yes, that would have been a reasonable thing to say.
01:37:23.000 But he's filled with hyperbole.
01:37:24.000 I mean, he's talking about, he's trying to excite people about the idea.
01:37:27.000 You're right.
01:37:27.000 Words matter when you're in a presidential campaign.
01:37:29.000 They do.
01:37:29.000 You're right.
01:37:30.000 No argument there, and I'm in no way saying that that was the correct thing to say.
01:37:37.000 But the way they phrased it, the way they just tried to make it seem like that was the only thing that he was talking about.
01:37:45.000 Okay, I'm just not going to say anything else on this.
01:37:47.000 I get it.
01:37:48.000 But what you're saying is that these people are good people, but that they are ideologically captured?
01:37:56.000 Is that what you're saying?
01:37:57.000 What I'm saying is that most people are reasonable wherever you go, but in the social media age, it's no longer about what most people are like.
01:38:04.000 It's about how much power do the extremists have because anyone now has the power to hijack, threaten, intimidate.
01:38:10.000 Yeah.
01:38:32.000 And it's the same with professors.
01:38:34.000 Many people, especially those who listen to conservative sources, might think that professors are mostly tenured radicals who care more about Marxism than about educating their kids.
01:38:43.000 That's just not true.
01:38:44.000 What is true is that the minority that have extreme views now have a much bigger platform.
01:38:49.000 They have more power.
01:38:50.000 But most people are reasonable wherever you go.
01:38:53.000 Is the issue that the reasonable people are afraid of pushing back against the radical people?
01:38:59.000 Exactly.
01:38:59.000 That's it.
01:39:00.000 That's the issue.
01:39:00.000 Because there really are consequences.
01:39:03.000 That's right.
01:39:04.000 And people say, well, you've got tenure.
01:39:05.000 What are you worried about?
01:39:06.000 And the answer is, yeah, we've got incredible security.
01:39:09.000 But everybody is afraid of being publicly shamed, humiliated, attacked, and mocked.
01:39:14.000 Right.
01:39:14.000 And the people who go through it, I mean, it's really, it's incredibly painful.
01:39:19.000 They have to take sleeping pills at night.
01:39:22.000 They sometimes contemplate suicide and in one case committed suicide that I know of.
01:39:27.000 So yes, that's exactly the problem.
01:39:30.000 That's what I think is the effect, not of the original social media platforms like MySpace or early Facebook, but of the hyperviral ones that we got in the 2010s.
01:39:38.000 And the result of that, in terms of people terrified of being attacked, is what you saw with those people from Penn, from Harvard.
01:39:50.000 We're talking about this rampant anti-Semitism on campus where people were actively calling for the death of Jews, saying that this does not constitute harassment unless it's actionable.
01:40:03.000 Yes, that was stunning.
01:40:04.000 Stunning.
01:40:05.000 Right.
01:40:06.000 It's not wrong unless they act on it.
01:40:07.000 What is that like as a person when, you know, you are an academic and you are a professor, when you see that from these, especially from somewhere like Harvard?
01:40:17.000 Yeah.
01:40:18.000 So, yes, I'm a professor at NYU. I was at UVA for 16 years.
01:40:23.000 I love being a professor.
01:40:24.000 I love universities.
01:40:26.000 I'm also Jewish.
01:40:27.000 And I can understand the argument that those presidents were making.
01:40:32.000 The argument was a very narrow technical argument about whether students should be allowed to say from the river to the sea, Palestine will be free.
01:40:40.000 And so I understand why it would have been reasonable for them to say, well, we're not going to punish students for saying that.
01:40:48.000 That is political speech that's protected under the First Amendment.
01:40:53.000 So I understand the point that they were making.
01:40:56.000 But they were such screaming hypocrites in making that point, because, and this is what The Coddling of the American Mind was all about, how did it happen that, you know, if a professor or administrator writes a single word that a student objects to and calls racist,
01:41:11.000 suddenly this person is out of a job.
01:41:13.000 Like, really?
01:41:13.000 Like, you're going to fire someone or let someone be tormented and fired because they said something?
01:41:19.000 That someone interpreted it in a certain way.
01:41:21.000 And that led us to be super hyper crazy sensitive about every word we say, because you never know when it'll explode and cause a scandal.
01:41:29.000 And so for the presidents to say, oh yeah, you know, anything anyone ever said between 2015 and yesterday would be punished if anyone was bothered by it.
01:41:39.000 But from the river to the sea, oh yeah, sure, that's constitutionally protected.
01:41:42.000 It wasn't just from the river to the sea.
01:41:44.000 It was the literal expression, death to Jews.
01:41:48.000 Yes, that's right.
01:41:49.000 That's what they were specifically defending, saying, unless it's actionable, which is insane, unless you commit actual genocide?
01:41:56.000 Is that what you're saying?
01:41:57.000 That's right.
01:41:57.000 No, I'm sorry, Joe, you're right.
01:41:59.000 The deeper question is about political speech, but you're right that, as Stefanik, I believe, was asking them, it was about calls for genocide.
01:42:07.000 Yes.
01:42:07.000 And so, yes, calls for genocide, it seems to me.
01:42:11.000 Again, I'm not a First Amendment lawyer.
01:42:13.000 Maybe on the First Amendment, legally, you can't be arrested for it.
01:42:17.000 But for God's sakes, on a university campus where you're trying to make everyone feel included, you couldn't even comment, not just on the calls for genocide, you know, but on the actual events of October 7th.
01:42:28.000 So that, I think, is what really brought higher ed to a nadir, literally a low point in public esteem.
01:42:37.000 I think it was a wake-up call for a lot of people that are kind of on the fence about how big the issue is.
01:42:44.000 Because these are the same people that call for you being kicked out of the university if you deadname someone.
01:42:51.000 Yeah, that's right.
01:42:52.000 These are the same people that if you use the wrong pronouns.
01:42:56.000 Yeah, that's right.
01:42:57.000 And so, you know, last semester was the worst one ever for higher education.
01:43:03.000 Data from Gallup and Pew show how the public sees us. Higher ed used to have an incredible brand, a global brand: we were the best, everyone wanted to come here, scientific innovation, all the top academics were here in the United States.
01:43:16.000 And in 2015, people on the left had a very high opinion of higher ed, and actually people on the right had a moderately high opinion of it.
01:43:23.000 And then since 2015, it's dropped, not just among people on the right, but among centrists and moderates as well.
01:43:29.000 So higher ed really lost the trust of most of the country.
01:43:34.000 And I was running an organization called Heterodox Academy.
01:43:37.000 I started it with some other social scientists; it advocates for viewpoint diversity.
01:43:41.000 And that's why I was kind of a target sometimes, because here I am saying, we need viewpoint diversity.
01:43:45.000 We need some conservatives, some libertarians.
01:43:48.000 We need to not all be on the same side politically.
01:43:51.000 Which is an amazing thing to fight against.
01:43:53.000 Yeah, that's right.
01:43:56.000 We're the experts in why diversity is beneficial, and the most important kind of diversity turns out to be viewpoint diversity.
01:44:03.000 Well, the ability to debate things is also the most important aspect of an open and free society.
01:44:08.000 Democracy is based on it.
01:44:10.000 And find out who's right or whose ideas resonate the most, who makes the most sense, who has thought about this.
01:44:18.000 And who has the more enlightened and educated perspective?
01:44:22.000 Who has more information?
01:44:24.000 Who has more balance?
01:44:26.000 That's right.
01:44:26.000 That's right.
01:44:27.000 So I think we hit a low point in the fall in such a way that I'm actually optimistic that things are going to change.
01:44:33.000 Because I've been concerned about these issues in universities, the culture issues, since 2014-2015 when Greg Lukianoff and I wrote our first Atlantic article titled The Coddling of the American Mind.
01:44:42.000 And every year it's gotten worse and worse and worse.
01:44:45.000 There's never been a turnaround until last year.
01:44:47.000 And as with The Emperor's New Clothes, you know, people can see that something is stupid and crazy and wrong, but they won't say anything.
01:44:54.000 But then when somebody does, then everybody can speak.
01:44:57.000 And I'm feeling finally, for the first time since 2015, I'm feeling that people sort of understand, you know what, wait, that was crazy what happened to us.
01:45:06.000 That was crazy.
01:45:06.000 People were saying crazy stuff.
01:45:09.000 Let's put our head above the parapet.
01:45:11.000 Let's like start sometimes saying maybe that is not right.
01:45:14.000 So I think that things are actually going to turn around.
01:45:17.000 Maybe not at the Ivies, although there are movements of faculty there saying, no, let's return to academic values, the pursuit of truth.
01:45:24.000 So I think what I'm hoping, what I think is likely to happen, is we're going to see a split in the academic world.
01:45:30.000 That is, there are already schools like Arizona State University.
01:45:33.000 There are schools that already have basically said no to all the crazy stuff, and they're focusing on educating their students.
01:45:40.000 And I think we're going to see more students going that way.
01:45:42.000 The University of Chicago is another model.
01:45:44.000 So I think there are a few schools that departed while almost all the other schools went in the same direction.
01:45:49.000 But I think now that's going to change and it can change actually pretty quickly because most of the university presidents don't like this stuff.
01:45:56.000 I've spoken to many of them.
01:45:58.000 All the crazy politics, the activist students, it made their job very difficult.
01:46:03.000 So I'm actually hopeful that we are starting to see some university presidents standing up and saying, you know, it's not okay to shout down every conservative speaker.
01:46:11.000 Like, no, we're not going to allow that.
01:46:12.000 So we'll see a year from now; if I come back in a year or two, we'll see.
01:46:17.000 But I think things are actually beginning to get better for the first time since 2015. Well, I hope you're correct.
01:46:23.000 And I do agree that the pushback was so extreme that some action is likely to take place.
01:46:29.000 I think the first step of that has got to be to allow people with differing perspectives to debate and not shout them down.
01:46:39.000 And also to show that shouting people down and setting off fire alarms is shameful.
01:46:45.000 It's disgraceful in a higher education institution.
01:46:49.000 That's right.
01:46:50.000 If there was any punishment, the students would change very quickly.
01:46:53.000 The students are very concerned about getting a job, about their futures.
01:46:57.000 And what the early presidents who didn't do anything, what they conveyed was, you can yell and scream all you want, nothing will happen to you.
01:47:03.000 You can bang on the glass and frighten speakers, nothing will happen to you.
01:47:07.000 You can throw rocks through windows, nothing will happen to you.
01:47:10.000 And of course, that just brought us more obnoxious behavior on campus and shame to higher ed in the eyes of the country.
01:47:19.000 So we had a brand that was based on extreme excellence and truth.
01:47:25.000 I think we damaged our brand very severely.
01:47:28.000 I think finally now there's a reckoning and a realization of what we've done.
01:47:32.000 And I think we're going to see a recovery, an uneven recovery.
01:47:35.000 But I do think that a year or two from now, the mood, well, who knows what's going to happen with the election and whether there'll be a bloodbath.
01:47:45.000 No, don't take that out of context.
01:47:46.000 I was just referring to the early part of our conversation, which you won't see if you quote just this.
01:47:50.000 Yeah, let's say disaster.
01:47:51.000 Yeah, disaster.
01:47:52.000 It could be disaster.
01:47:53.000 But I am actually, you know, about certain things, I'm pretty pessimistic like you.
01:47:56.000 But at least on the future of universities, I do think for the first time, I'm actually optimistic.
01:48:01.000 I wasn't optimistic a year or two ago.
01:48:02.000 Well, that's great because you're on the ground.
01:48:04.000 So you would really understand more than most.
01:48:07.000 And do you sense that with students, there's also a recognition that this is a gigantic issue?
01:48:14.000 Like, what was the reaction to students?
01:48:17.000 I mean, not specifically Jewish students, but the Jewish students must have been the most horrified by this.
01:48:23.000 Oh, my God.
01:48:24.000 Yes.
01:48:24.000 Stabbed in the back is the way many of us feel.
01:48:28.000 What I've found all along, as I say, most people are reasonable.
01:48:32.000 When all this stuff was breaking out in 2015, 2016, most students just wanted to get an education.
01:48:37.000 They don't want to take part in this.
01:48:39.000 And now I find out...
01:48:40.000 Of course, I teach in a business school.
01:48:41.000 I teach at NYU Stern.
01:48:43.000 Our students are pretty pragmatic.
01:48:45.000 They want to get a job.
01:48:46.000 Most of them are from immigrant backgrounds.
01:48:48.000 They're not here to protest the latest political...
01:48:52.000 They're here to succeed.
01:48:52.000 They're here to succeed.
01:48:52.000 That's right.
01:48:54.000 So that is an aspect of Gen Z that gives me hope is that they see the problems.
01:48:59.000 They see the problems with social media.
01:49:00.000 They see the problems with the extreme activists.
01:49:03.000 So what we have to change is not the average student.
01:49:06.000 What we have to change is the dynamics so that the average student feels freer to speak up.
01:49:10.000 And how can that be done?
01:49:11.000 Well, so I founded two organizations to do that.
01:49:14.000 One is Heterodox Academy.
01:49:16.000 We need more viewpoint diversity among the professors, or at least we need more toleration of people who are centrist or libertarian.
01:49:24.000 So that's one on the faculty side, what we need to do, and also the culture on campus.
01:49:29.000 But I also co-founded another organization called the Constructive Dialogue Institute with a woman named Caroline Mehl.
01:49:35.000 And what we did is we took some of the insights of moral psychology and some of the content from my book, The Righteous Mind, and it evolved.
01:49:42.000 It's now six 30-minute modules that teach you about moral psychology.
01:49:45.000 Why are we divided?
01:49:47.000 What do liberals believe?
01:49:47.000 What do conservatives believe?
01:49:49.000 Why do conversations go wrong?
01:49:51.000 How can you start more skillfully?
01:49:52.000 How do you need to listen first?
01:49:54.000 So there's a lot of like Dale Carnegie sort of wisdom in there.
01:49:57.000 And it's really effective.
01:49:58.000 So if people go to ConstructiveDialogue.org, the program is called Perspectives.
01:50:03.000 It's being used in, you know, I think more than 50 universities now.
01:50:06.000 So there are things that we can do, but it's going to take leadership and good psychology.
01:50:12.000 That's so important, what you just said, and I think those programs will gain momentum as people recognize that it's really beneficial to all to have these ideas debated.
01:50:26.000 If you truly believe that ideas opposing your ideology are evil, you should be able to debate them.
01:50:33.000 And the only way to do that is to have someone to have the ability to express themselves.
01:50:38.000 And for you to counter those points that they make.
01:50:41.000 Exactly.
01:50:42.000 And this is what many commentators on the left have been pointing out since 2015. Van Jones has an amazing talk.
01:50:48.000 He's a progressive, democratic, well-connected, smart person.
01:50:48.000 And he's been pointing out, there's a great talk he gave at the University of Chicago, and I have a quote on this in The Coddling of the American Mind, where he talks about how the move to protect students from bad feelings, the move to protect their emotional safety,
01:51:11.000 is really bad for the students.
01:51:13.000 But then his talk goes on and he says, this is actually really bad for the Democrats.
01:51:19.000 It's really bad for young activists to drown out opposition, to not listen to the arguments, to not get stronger.
01:51:27.000 A lot of what's happened on campus, I think, is what you might call a pyrrhic victory.
01:51:32.000 A pyrrhic victory is one where you won the battle, but that made you lose the war.
01:51:36.000 I think when your side is able to wipe out opposition, it might feel like a victory at first, but it's ultimately going to weaken you.
01:51:49.000 And, you know, the same thing is going on in the far right.
01:51:52.000 I mean, there's a lot more fear and really bad consequences for people who dissent on the right, too.
01:52:00.000 But if we're talking about universities, that's more an issue of what's been happening on the left.
01:52:04.000 Are there any universities that don't have a left-leaning perspective?
01:52:10.000 Oh, sure, sure.
01:52:11.000 Like what universities?
01:52:12.000 Yeah, not in the top 20 or 50, I would say.
01:52:14.000 Isn't that a problem?
01:52:15.000 Well, that is a problem.
01:52:15.000 That's right.
01:52:16.000 Yeah, it is.
01:52:16.000 Well, actually, no, but put it this way.
01:52:18.000 First of all, I mean, there are lots of religious universities, Christian universities that don't have this problem.
01:52:24.000 And large state schools tend to have much less of it because, again, most people are reasonable.
01:52:32.000 The great majority of faculty want to do their research, teach their classes.
01:52:35.000 They don't want to get involved in this stuff.
01:52:37.000 The problem is especially severe.
01:52:39.000 For some reason, the Ivy League schools, that's what's really surprising.
01:52:42.000 I thought it was just like, well, the elite schools.
01:52:42.000 No, actually the Ivies are the place where the worst actual anti-Semitic threats and intimidation, and even some violence or threats of violence, are happening.
01:52:52.000 Something about the Ivies makes them more extreme.
01:52:55.000 What do you think that is?
01:52:56.000 Well, I think it's in part the region.
01:53:00.000 So most of the shout-downs, most, you know, Greg Lukianoff and FIRE, they've really been tracking this for a long time.
01:53:06.000 Most of the shout-downs happen in the Northeast and along the West Coast and then around Chicago.
01:53:10.000 That's where most of the really nasty stuff happens.
01:53:14.000 This is not happening at the great majority of American universities.
01:53:17.000 It's not happening at top schools in the South.
01:53:19.000 It's not happening at top schools in the Southwest.
01:53:22.000 So it is in part where it is.
01:53:25.000 And then I think also the Ivy League is full of really rich kids.
01:53:31.000 There was a statistic a number of years ago that the top schools have more people from the top 1% of the income distribution than from the bottom 60%.
01:53:42.000 So there's a real concentration, especially in the Ivies, of rich kids who don't need to worry as much about getting a job and have the bandwidth to devote themselves to politics while they're students.
01:53:55.000 Hmm.
01:53:56.000 God.
01:53:58.000 I just fear for the young people that come out of that, who have these distorted perspectives and have to kind of rewire their view of the world once they get out.
01:54:13.000 It's almost like taking someone from a cult and trying to just delete the indoctrination.
01:54:24.000 That's right.
01:54:24.000 And it's almost impossible to do that, especially if most of what's coming in is coming in from TikTok, not from your parents or your friends or your teachers.
01:54:32.000 Back to the problem.
01:54:33.000 Back to the problem, that's right.
01:54:34.000 So again, back to the question of the TikTok ban.
01:54:38.000 The issue here is not, should we ban TikTok?
01:54:41.000 The issue is, should American law require a divestiture of TikTok from a Chinese corporation that is beholden to the CCP? That's the question.
01:54:49.000 Which seems logical.
01:54:50.000 Yes.
01:54:51.000 There's an issue that's happening in Texas currently where one of the porn sites has pulled out of Texas because they require age verification.
01:55:01.000 And so there's all this pushback about whether or not they should be able to require age verification.
01:55:07.000 You have to be 18 to use porn websites, which I think is very reasonable.
01:55:14.000 Yes, it's insane that we're even debating it.
01:55:16.000 Yeah, we're running a mass psychology experiment on children by having smartphones with large screens and having instantaneous access to porn.
01:55:26.000 That's right.
01:55:27.000 I forget the exact number, but a very large number of boys are on Pornhub or porn sites daily, every day.
01:55:35.000 And again, as we were talking about before, in puberty, the frontal cortex, the brain's really rewiring itself.
01:55:41.000 This is when you're supposed to be developing the ability for a boy to talk to a girl, you know, for straight kids.
01:55:46.000 Right.
01:55:46.000 You know, it's hard, like, because boys and girls, they think a little differently.
01:55:49.000 It's awkward.
01:55:50.000 There are always mistakes.
01:55:51.000 They need to be practicing.
01:55:53.000 But instead, they're exposed to this diet of just horrible, horrible stuff.
01:55:57.000 And the girls see it too.
01:55:58.000 The girls are not on as much, but they're all exposed to it.
01:56:01.000 And so, you know, we now see that many more members of Gen Z, they don't want to get married.
01:56:06.000 They don't want to have children.
01:56:07.000 They're not having as much sex.
01:56:09.000 I kind of understand it.
01:56:10.000 You know, if that's what you think this sex stuff is, when you're an 11-year-old and you see this stuff, you're not going to be like, ooh, I want that to happen to me.
01:56:17.000 It's also so distorted, the relationships in these porn videos.
01:56:21.000 It's bizarre fantasy.
01:56:25.000 And about step-siblings.
01:56:27.000 Like, why is so much about stepsisters?
01:56:28.000 It's a lot of step-moms, too.
01:56:31.000 Right.
01:56:31.000 So the whole thing is sick.
01:56:32.000 And once again, I'm not going to tell adults what they should do with their spare time.
01:56:36.000 But for God's sakes, I am going to try to tell companies that they can't just have access to my kids from the age of 9 or 10 and do what they want with them.
01:56:44.000 So, you know, I don't know the details of the Texas law.
01:56:46.000 But I think we've got to do something to age-gate pornography.
01:56:51.000 I just can't see...
01:56:52.000 I mean, yes, there's a libertarian argument on the other side that, oh, we should never require identification from anyone for anything.
01:56:58.000 Well, if that's the way you're going to go, no restrictions, then either we have to keep kids off the internet, which is insane.
01:57:08.000 We can't keep them off of the entire internet.
01:57:10.000 Or we have to say, you know what?
01:57:12.000 Maybe some companies should be held liable.
01:57:14.000 Maybe Congress was wrong to grant them blanket immunity from lawsuits for what they're doing to our kids.
01:57:19.000 I think we should change that.
01:57:20.000 Do you think, at a certain point in time, all this is going to become more obvious?
01:57:26.000 And do you think the trend is that it's becoming more obvious to people, whether it's to politicians or to parents or to...
01:57:34.000 Yes.
01:57:34.000 Over time, the negative effects of it are just so obvious.
01:57:39.000 Yes, and I think that is happening right now.
01:57:42.000 We're right at the beginning of the tipping point.
01:57:43.000 And I'm confident about this because the tipping point began in Britain last month.
01:57:48.000 So parents everywhere are fed up.
01:57:50.000 They all see it.
01:57:51.000 They don't know what to do, but they're all frustrated.
01:57:53.000 In Britain, some parents put up a website about delaying smartphones.
01:57:58.000 People rushed to it.
01:58:00.000 They had a WhatsApp group for parents to come together.
01:58:03.000 Thousands and thousands joined right away.
01:58:06.000 In Britain, the government actually has mandated phone-free schools, which is one of my four norms.
01:58:11.000 So whenever you have a situation where most people hate it, but they're either afraid or confused, that can change really, really quickly.
01:58:20.000 And that's like the fall of the Berlin Wall, fall of the Iron Curtain.
01:58:23.000 We thought it was going to be there forever.
01:58:24.000 But most people hated it. I traveled behind the Iron Curtain in 1987, and everybody hated it.
01:58:30.000 And so once the Berlin Wall fell, it fell everywhere very quickly.
01:58:35.000 I think the same is going to be true for social media and the digital environment for children.
01:58:40.000 I think that 2024 is going to be, for the digital environment, what 1989 was for Soviet communism.
01:58:47.000 Parents are fed up.
01:58:49.000 The data is in.
01:58:50.000 There's no doubt that there's an epidemic now.
01:58:53.000 The evidence that it's caused by social media is a lot stronger than it was a few years ago.
01:58:58.000 People are ready to act.
01:58:59.000 Congress is ready to act.
01:59:00.000 I think universities are actually getting better now that they've been through that.
01:59:08.000 I think that the situation around kids and digital media is going to change radically this year.
01:59:13.000 That's my goal in writing the book, in writing The Anxious Generation.
01:59:18.000 And I have this amazing collaborator, the artist Dave Cicirelli.
01:59:23.000 So these stickers here that I gave you, I don't know if we can hold them up.
01:59:27.000 I'll just hold them for my camera.
01:59:28.000 It's a milk carton with a child on it, and it says, Missing Childhood.
01:59:34.000 So my friend Dave Cicirelli is a great artist in New York City.
01:59:38.000 He designed the cover for the book.
01:59:40.000 And he and I had a plan for some, like, guerrilla art campaign with posters, you know, linking, you know, Instagram to cigarettes, that sort of thing, a couple years ago.
01:59:48.000 So Dave had the idea to really go big.
01:59:51.000 And so Dave has built a 12-foot-tall milk carton of the thing you just showed, a 12-foot-tall milk carton.
01:59:57.000 It's going to be on the National Mall in Washington this Friday.
01:59:59.000 If you're in D.C., check it out.
02:00:01.000 It's coming to New York City, the northeast corner of Union Square.
02:00:05.000 It'll be there on March 25th and 26th.
02:00:09.000 I'll be there on the 25th.
02:00:11.000 We're starting a national movement.
02:00:12.000 There are lots of organizations joining us, and we're starting this movement to encourage parents to work together.
02:00:21.000 Because as I said, we can escape this if we work together.
02:00:24.000 It doesn't have to be all of us.
02:00:25.000 But if a lot of us say, we're not going to give our kids smartphones until 14, we're not going to let them open an Instagram or TikTok account until they're 16, we're going to ask our schools to go phone-free, and we're going to give our kids a lot more independence of the sort that we had in a much more dangerous world.
02:00:42.000 If we do those four norms, we really can turn that around.
02:00:45.000 And I'm confident we are at the tipping point right now.
02:00:50.000 Even a few months, even by July and August, or let's say by September, when school starts again in the fall, I think there's going to be a different vibe about phones and the roles of technology in kids' lives.
02:01:00.000 Well, I hope you're right, Jonathan, and I really appreciate you, and I really appreciate you writing this and spending so much time on this and thinking about it so thoroughly.
02:01:09.000 The Anxious Generation, How the Great Rewiring of Childhood is Causing an Epidemic of Mental Illness.
02:01:14.000 It's available right now.
02:01:15.000 Go get it, folks.
02:01:17.000 Listen to it.
02:01:17.000 Read it.
02:01:18.000 Absorb it.
02:01:19.000 Take it in.
02:01:20.000 Thank you very much.
02:01:21.000 Really appreciate you.
02:01:22.000 Thank you, Joe.
02:01:22.000 It's always fun to talk with you.
02:01:23.000 Fun to talk to you, too.
02:01:24.000 Thank you.
02:01:25.000 Bye, everybody.