In this episode of the Joe Rogan Experience, social psychologist Jonathan Haidt joins me to talk about his new book, The Anxious Generation, and how the great rewiring of childhood around smartphones and social media has led to an epidemic of mental illness in America's youth. We talk about the dangers of too much screen time, the impact it's having on our mental health, and what we can do to mitigate the problem. If you like what you hear, please subscribe and leave a rating and review on your favorite podcast platform.
00:00:16.000The same problems that you talked about when you were here last, that I've referenced many times since on the podcast, have only been exacerbated, unfortunately.
00:00:25.000And that's why you wrote this, The Anxious Generation.
00:00:30.000And it could not be more true how the great rewiring of childhood is causing an epidemic of mental illness.
00:00:38.000I don't think anybody can dispute that.
00:01:07.000I think a lot of older people, particularly boomers, are a little bit disconnected from it because, unless they're addicted to Twitter, you know, they're not engaging in this stuff.
00:01:20.000But part of the message of the book is that social media and the things kids are doing on screens are not really like TV. They're much, much worse for development.
00:01:28.000Yeah, and even watching too much TV, I don't agree that they turned out okay.
00:02:19.000So in all these ways, the new way that kids are digital is really not like what we had when we were watching TV. It's also an extraordinary amount of wasted resources.
00:02:32.000I'm always embarrassed when I look at my phone.
00:02:34.000When I see my screen time, I'm like, four hours?
00:02:36.000Like, that's four hours I could have done so many different things with.
00:02:41.000And so that's where this concept of opportunity cost comes in. It's this great term that economists have, which is the cost of, you know, if you invest an hour of your time and $100 to do something, how much does it cost?
00:02:53.000Well, you know, $100, but you could use that $100 and that hour for something else.
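As a rough back-of-the-envelope on the opportunity-cost point (using the four-hours-a-day screen-time figure mentioned above; the forty-hour-work-week comparison is my own framing, not from the conversation):

```python
# Four hours a day of screen time, tallied over a year.
HOURS_PER_DAY = 4
DAYS_PER_YEAR = 365

total_hours = HOURS_PER_DAY * DAYS_PER_YEAR  # hours per year on the phone
work_weeks = total_hours / 40                # expressed in 40-hour work weeks

print(total_hours)  # 1460
print(work_weeks)   # 36.5
```

That is the opportunity cost in concrete terms: those 1,460 hours are most of a working year's worth of something else forgone.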
00:03:49.000So, in 2019, when I was last here with you, my book, The Coddling of the American Mind, had just come out.
00:03:55.000And back then, people were beginning to sense that, you know, this internet, the phones, the social media that we were all so amazed by, you know, there was a very positive feeling about all this stuff in the early part, you know, like in the 2000s.
00:04:09.000Sentiment was beginning to turn, but there was a big academic debate because when you look at studies that look at how, you know, do kids who spend a lot of time on screens, do they come out more depressed?
00:04:20.000The answer is yes, but the correlation is not very big.
00:04:22.000So there was a big argument among researchers, and that's when I got into this around 2019, really getting into that debate.
00:04:29.000And I think that Jean Twenge and I really had good data showing, you know, there is an issue here.
00:04:34.000And then COVID came, and that confused everything.
00:04:37.000Because, you know, basically when I was on with you last time, 2019, I was saying, you know, what kids most need is less time on their devices and more time outside playing unsupervised.
00:06:19.000So, I don't know what your kids think about social media and whether they think it's a good thing or a bad thing, but we are hopeful that members of Gen Z are going to start, and they are starting to advocate that, you know what, this is messing us up.
00:06:35.000So this is the graph that I showed last time I was on.
00:06:39.000What it shows, because I know most of your listeners are probably just listening to the audio, it shows that from 2005 to 2010, the rates of depression in girls was about 12% of American girls had a major depressive episode in the last year.
00:09:09.000It's when kids are getting high-speed data plans.
00:09:12.000So my argument in the book is that we had a complete rewiring of childhood between 2010 and 2015. In 2010, most of the kids had flip phones.
00:09:39.000So that's what I think happened between 2010 and 2015. TikTok becomes popular only really around 2018, 2019, 2020. And it's so new, we don't have good data on just TikTok.
00:09:51.000But I suspect that that sort of extra acceleration might be due to TikTok.
00:12:28.000Instagram Reels and YouTube Shorts, they might have similar effects to TikTok, but the Chinese government can literally tell ByteDance to change what kids are seeing.
00:12:43.000They tell them in China, you have to have this kind of content and not that kind of content.
00:12:47.000There was an incredible episode of—you had Tristan Harris on.
00:12:53.000Tristan Harris has this amazing podcast episode where they go into the national security risks, and they show that the day that Russia invaded Ukraine, TikTok in Russia changed radically.
00:13:04.000Like, the government was on—like, you know, TikTok was on it.
00:13:07.000Like, yep, we're going to do what Putin wants us to do.
00:13:10.000So the idea that the most influential platform on American children must do what the Communist Party tells it to do, at a time when we have mounting tension with China and the possibility of a war.
00:13:25.000I mean, as Tristan says, imagine if in the 1960s, the Soviet Union owned and controlled PBS, ABC, NBC, and all the kids' programs.
00:13:38.000So I hope, listeners, I really strongly support this bill.
00:13:41.000I think Representative Mike Gallagher, I think, was one of the ones proposing it, or at least certainly advocating for this issue.
00:13:50.000I hope people will not see it as a TikTok ban, but they'll see it as an urgent national security move to force ByteDance to sell to a non-Chinese owner.
00:14:02.000And specifically, what are they pointing to when they say national security risk?
00:14:27.000And the data can be used for all sorts of purposes, especially marketing and advertising.
00:14:31.000And so TikTok has enormous amounts of data, and they can get all psychological on it because they know exactly how long you hesitated, how much you liked certain kinds of videos.
00:14:41.000You know, many people have written articles on how TikTok seems to have known they were gay before they did, that sort of thing.
00:14:46.000So TikTok has extraordinary amounts of data on most Americans, certainly most young Americans.
00:14:53.000And they say, oh, but, you know, we don't share, like, it's in a server over here in Singapore, I don't know where, but, you know, it's not in China.
00:15:02.000You know, there's no way it could possibly be the case that the data is really separated and not available to the Chinese Communist Party.
00:15:10.000And what are they pointing to in terms of the danger of this data that makes them want to have it sold to an American company?
00:15:18.000I don't know whether the motivation behind the bill is that the Chinese would have some access to data on American citizens. What most alarmed me when I heard the Tristan Harris podcast was the ease of influencing American kids to be pro this or pro that on any political issue.
00:15:40.000You're seeing that with Palestine and Gaza.
00:16:09.000We had Riley Gaines, who was the female athlete that competed against Lia Thomas.
00:16:13.000And she has said that biologically male athletes should not be able to compete with biologically female athletes because they have a significant advantage.
00:16:21.000And she was banned from TikTok just for saying that.
00:16:58.000And then we saw it in journalism, newspapers and editors who wouldn't stand up for journalistic principles.
00:17:04.000And so I think what has happened here is that social media allows whoever is angriest and can mobilize most force to threaten, to harass, to surround, to mob anyone.
00:17:17.000And when people are afraid to say something, that's when you get the crazy distortions that we saw on campus.
00:17:55.000And that's when we began sort of like teaching on eggshells in universities because our students could really do a lot of damage if we said one word they didn't like.
00:18:02.000And it's not just the students, which is really disturbing.
00:18:07.000There was an FBI security specialist who estimated that somewhere in the neighborhood of 80% of the Twitter accounts were bots, which is very strange because that means that they're mobilizing specifically to try to push different narratives.
00:18:35.000It's more like a place where people say things and the fans in the stands are hoping to see blood.
00:18:43.000To move our discussions onto platforms like that, that can be manipulated, that anyone—it doesn't have to be a foreign intelligence service.
00:18:53.000It could be anybody who wants to influence anything in this country or anywhere in the world— They can, you know, for very little money, they can hire someone to create thousands, millions of bots.
00:19:04.000And so we're living in this sort of funhouse world where everything is weird mirrors and it's very hard to figure out what the hell is going on.
00:19:12.000Have you ever sat down and tried to figure out a solution to this other than trying to encourage people not to use it?
00:19:18.000Jamie, does something happen if the volume just dropped lower?
00:19:32.000So, when we're talking about the democracy problems and the, you know, manipulation of politics or anything else, those are really, really hard.
00:19:40.000I have a few ideas of what would help and we're not going to do them because, you know, all of them are like the left likes and the right doesn't or vice versa.
00:19:47.000Oh, things like, you know, like identity authentication.
00:19:51.000If large platforms had something like know-your-customer laws. That is, you know, if you want to open an account on Facebook or on X, you have to at least prove that you're a person.
00:20:02.000And I think you should have to prove that you're a person in a particular country, and, I think, over a certain age.
00:20:09.000You prove those to the platform, not directly, you go through a third party.
00:20:13.000So even if it's hacked, they wouldn't know anything about you.
00:20:15.000You establish that you're a real person and then you're cleared.
00:20:21.000If we did that, that would eliminate the bots.
00:20:24.000That would make it much harder to influence.
00:20:26.000That would make us have much better platforms for democracy.
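The third-party verification flow described here can be sketched roughly in code. This is a toy illustration under my own assumptions, not any real platform's API: the verifier, the function names, and the claim fields are hypothetical, and a real deployment would use public-key signatures (so the platform could verify but not forge attestations) rather than the shared HMAC key used here for brevity.

```python
import hashlib
import hmac
import json
import secrets

# Hypothetical third-party verifier's secret key. In a real system this would
# be a private signing key, with the platform holding only the public key.
VERIFIER_KEY = secrets.token_bytes(32)

def issue_attestation(is_person: bool, country: str, over_16: bool) -> dict:
    """The verifier checks documents off-platform, then signs only coarse
    claims: no name or document details ever reach the platform."""
    claims = {
        "is_person": is_person,
        "country": country,
        "over_16": over_16,
        "nonce": secrets.token_hex(8),  # fresh per attestation, limits linking
    }
    payload = json.dumps(claims, sort_keys=True).encode()
    sig = hmac.new(VERIFIER_KEY, payload, hashlib.sha256).hexdigest()
    return {"claims": claims, "sig": sig}

def platform_accepts(attestation: dict) -> bool:
    """The platform verifies the signature and the personhood claim; it never
    learns who the user is, so even a breach leaks nothing identifying."""
    payload = json.dumps(attestation["claims"], sort_keys=True).encode()
    expected = hmac.new(VERIFIER_KEY, payload, hashlib.sha256).hexdigest()
    return (hmac.compare_digest(expected, attestation["sig"])
            and attestation["claims"]["is_person"])

token = issue_attestation(True, "US", True)
print(platform_accepts(token))       # True: signed claims check out
token["claims"]["country"] = "PL"    # any tampering breaks the signature
print(platform_accepts(token))       # False
```

The design choice being illustrated is the one the speaker describes: the platform learns "real person, this country, over this age" and nothing else, while the identity documents stay with the third party.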
00:20:29.000Is that possible to do internationally?
00:20:32.000Well, the platforms can certainly require whatever they want for membership.
00:20:37.000Right now, they are legally required to ask if you're 13 or over. They ask it, and then they accept whatever you say, and that's it.
00:20:58.000So one of the things that people are nervous about when it comes to authentication is that if you could do that, then you could target individuals that wouldn't be allowed to be anonymous.
00:21:11.000So you eliminate the possibility of whistleblowers.
00:21:34.000So I understand the concern, and there are values to having anonymity, but I think what we're seeing now is that the craziness is making it harder for democracies to be good, vibrant democracies, and it's making it easier for authoritarian countries like China to be powerful and effective authoritarian countries.
00:21:53.000So I think we have to start weighing the pluses and minuses of the costs and benefits here.
00:21:58.000Right, but how would you ramp that up?
00:22:00.000How would you implement that internationally?
00:22:03.000Say, if you're talking about people in Poland, just pick a country.
00:22:08.000Well, the platforms can do whatever they want, but then, yes, if a company starts in Poland, then the US Congress would have no influence on that.
00:22:17.000Right, like China could pretend and they could falsify the data that shows that these are individuals.
00:22:33.000You're never going to have a perfect system.
00:22:35.000But right now, it's just so easy and cheap and free to have massive influence on anything you want.
00:22:43.000But the larger question here was, you asked me, what can we do?
00:22:45.000And what I'm saying is, there are some things like identity authentication that I think would help, but yes, there are implementation problems.
00:22:52.000There's all kinds of political questions.
00:22:53.000So my basic point is, man, those problems, I don't know that we can solve, but we can do better.
00:22:58.000Oh, and I should point out, a lot of these have to do with the basic architecture of the web.
00:23:02.000When we move from Web 1, which was putting up information, it's amazing, you can see things from everywhere.
00:23:07.000To Web 2, which was directly interactive, now you can buy things, you can post stuff, and it's the Web 2 that gave us these business models that have led to the exploitation of children and everyone else.
00:23:22.000And I'm part of a group, Project Liberty, if you go to projectliberty.io, that's trying to have a better Web 3, where people will own their own data more clearly.
00:23:33.000As the architecture changes, it opens us up to new possibilities and risks.
00:23:37.000So there are some hopes for a better internet coming down the pike.
00:23:42.000Actually, I just wanted to put all this stuff out there about democracy to say this is really hard, but when we talk about kids and mental health, this is actually amazingly doable.
00:23:52.000We could do this in a year or two, and the trick, the key to solving this whole problem with kids is to understand what's called a collective action problem.
00:23:52.000So there are certain things where, you know, like if you have a bunch of fishermen and they realize, oh, we're overfishing the lake, let's reduce our catch.
00:24:13.000And if one person does that and no one else does, well, then he just loses money.
00:24:17.000But if everyone does it, well then actually you can solve the problem and everyone can do fine.
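The overfishing example is a textbook collective-action payoff structure, and it can be made concrete with a tiny model. All the payoff numbers below are my own illustrative assumptions, not figures from the conversation:

```python
def payoff(i_restrain: bool, others_restraining: float) -> float:
    """Illustrative season's catch value for one fisherman.

    others_restraining: fraction (0.0 to 1.0) of the other fishermen who
    restrain. A healthier lake (more restraint) yields bigger catches for
    everyone, but restraining personally costs 1.5 units of your own catch.
    """
    base = 10 * (0.5 + others_restraining / 2)  # lake health scales the catch
    return base - 1.5 if i_restrain else base

print(payoff(True, 0.0))   # 3.5  -> restraining alone is a pure personal loss
print(payoff(False, 0.0))  # 5.0  -> everyone defects: depleted lake
print(payoff(True, 1.0))   # 8.5  -> everyone restrains: all do better
print(payoff(False, 1.0))  # 10.0 -> yet defecting always pays more individually
```

The numbers capture exactly the dilemma being described: universal restraint beats universal defection (8.5 vs. 5.0 each), but for any individual, defecting always looks better, which is why the norm only works if enough people adopt it together.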
00:24:22.000With social media, what we see over and over again is kids are on it because everyone else is.
00:24:28.000And parents are giving their kids a phone in sixth grade because the kid says everyone else has one and I'm left out.
00:24:34.000And over and over again, you see this.
00:24:36.000When you ask kids, you know, how would you feel if I took your Instagram or TikTok away?
00:26:27.000So the first rule is no smartphones before high school.
00:26:30.000And as long as a third of the parents do this, well, then the rest of the parents are free to say when their kid says, Mom, you know, I need a smartphone.
00:27:40.000And I think that any community that wants to do this, because what I find over and over again is that most parents are really concerned about this.
00:27:51.000And so I don't have to convince parents to change their minds about something.
00:27:56.000What I'm trying to do with the book is show them here are four norms that are pretty easy to do if others are doing them, and these are going to make your kids happier, less mentally ill.
00:28:09.000Yeah, like I said, it sounds like a good suggestion.
00:28:12.000I just don't imagine that with the momentum that social media has today and the ubiquitous use that kids are going to give it up.
00:28:20.000They're not going to want to give it up.
00:28:21.000I think there are a lot of kids that have had problems, if they've been bullied or what have you, and if you talk to them alone and say, wouldn't it be better if social media didn't exist, they'd say yes.
00:28:40.000Well, you know, you may be right, but I'm encouraged because whenever I speak to Gen Z audiences, and, you know, I've spoken to middle schools, high schools, college audiences, I always ask, you know, do you think I got this wrong or do you think this is a correct description of what's happening?
00:29:14.000All right, so the first is no smartphone before high school.
00:29:17.000Second is no social media until 16. That one's going to be a little harder to do.
00:29:21.000But the big platforms like Instagram, where you're posting and the whole world is seeing and strangers are contacting you, I think the age is currently 13 and it's not enforced.
00:29:31.000I think that needs to go up to 16. Here, it would be nice if Congress would raise the age to 16 and make the companies enforce it.
00:29:39.000But even if they don't, parents can still act. As a parent, you know, my kids are 14 and 17, and as long as many other parents are saying 16 is the age, then it's very easy for me to say that also.
00:30:42.000So I've published articles in The Atlantic and on my substack, afterbabel.com, bringing together the research.
00:30:51.000When kids have a phone in their pocket in school, they're going to be texting.
00:30:55.000Because if anyone is texting during the day, during the school day, they all have to check because they don't want to be out of the loop.
00:31:01.000They don't want to be the one who doesn't know.
00:31:03.000So, when kids started bringing smartphones into school instead of flip phones, academic achievement actually went down.
00:31:10.000Kids are stupider today than they were 15 years ago.
00:31:12.000I mean stupider meaning measuring their academic progress.
00:31:15.000After 50 years of improvement, it turns around after 2012. And this is true in the US and internationally.
00:31:21.000So there's just no reason why kids should have the phone on them.
00:31:24.000They should come in in the morning, put it in a phone locker or a Yondr pouch, go about their day, and guess what?
00:31:29.000The schools that have tried it, after a week or two, everyone loves it.
00:31:33.000The kids are like, oh, wow, we actually talk in between classes.
00:31:36.000We have five minutes in the hallway, we actually talk.
00:31:38.000And you hear laughter, whereas right now in a lot of schools, it's just zombies looking at their phones in between as they're walking from class to class.
00:32:26.000Imagine if you didn't have a normal childhood where you developed executive function, where you developed that ability as a teenager.
00:32:33.000Because puberty is when the prefrontal cortex, the front part of the brain, that's when it rewires into the adult configuration.
00:32:40.000So the fact that we're scrambling kids' attention at the time when they're supposed to be learning how to pay attention I think is terrible.
00:32:52.000This is my concern: this is just the beginning of this integration that we have with devices. The social media model has been immensely profitable and incredibly addictive, and there's a massive,
00:33:11.000massive amount of capital that's invested in keeping us locked into these things.
00:33:16.000Where do you think this goes from here?
00:33:18.000Have you paid attention to the technology?
00:33:24.000So let me just draw a very, very sharp, bright line between adults and children.
00:33:30.000I'm very reluctant to tell adults what to do.
00:33:33.000If adults want to spend their time on an addictive substance or device or gambling, I'm reluctant to tell them that they can't.
00:33:39.000So when we're talking about adults, I think where this is going is, well, where it's gone so far is everything that you might want becomes available instantly and for free with no effort.
00:33:53.000And so in some ways that's a life of convenience, but in other ways it's messing us up and it's making us weaker.
00:34:00.000So, you know, you want sexual satisfaction?
00:34:14.000Advances in robotics are such that it's just a matter of time before AI girlfriends are put into these incredible female bodies that you can customize.
00:34:22.000So I think the adult world, for young adults especially, is going to get really, really messed up.
00:34:28.000And again, I'm not saying we need to ban it now.
00:34:31.000But what I'm saying is, for God's sakes, don't let this be 11-year-old children's lives.
00:34:37.000Let's at least keep children separate from all this craziness until their brains develop, and then they can jump into the whirlpool and the tornado.
00:34:46.000But the fact that our 11-year-old girls are now shopping at Sephora for anti-wrinkle cream or, you know, all sorts of expensive skin treatments, this is complete insanity.
00:34:57.000So let's at least protect the kids until they're through puberty.
00:35:06.000It's just the way I see adults being so hooked on these things.
00:35:11.000There's so many adults that I know that are engrossed in this world of other people's opinions of everything they think and say.
00:35:18.000And it just doesn't give you enough time to develop your own thoughts and opinions on things.
00:35:24.000So many people are dependent upon other people's approval.
00:35:29.000And there's just so many people that are addicted to interacting with people online and not interacting with exceptional people in the real world.
00:36:00.000So they designed the foods to be as addictive as possible.
00:36:03.000And in the 70s and 80s, Americans switched over to a lot of junk food and we became obese, like a huge increase in obesity.
00:36:10.000And that kept going on for a few decades.
00:36:12.000As I understand it, obesity has finally turned around a little bit.
00:36:16.000And many people are still eating huge amounts of junk food, but at least some people are beginning to say, you know what, I'm going to resist that deep evolutionary programming for fat and sugar.
00:36:30.000The companies played to that, they hijacked those desires, and they got us hooked on junk food.
00:36:35.000But after 50 years, we're making some progress in pushing back and having healthier snacks and eating less.
00:36:58.000I'm just assuming that this is an issue that we dealt with as a society, that we didn't know what we were doing at first and we got hooked, and that the efforts to educate people and to develop healthier alternatives have helped.
00:37:13.000So again, I should have looked at the data before I came here.
00:37:18.000But I'm just using this as an analogy.
00:37:20.000I'm sure Jamie can find something that points to it or doesn't point to it.
00:37:46.000So do you think it's just people recognizing that they're developing health issues and they're taking steps to discipline themselves and mitigate some of these issues?
00:37:56.000Or is there some sort of information push that's leading them in that direction?
00:38:02.000Yeah, that I don't know because it's not my field.
00:38:03.000But I would say that that is probably a necessary precondition, like understanding the problem and developing in people a desire to change it.
00:39:02.000What are people saying about the thing that I just said?
00:39:04.000But the question is, will we adapt to it in some way so that we begin, as with junk food, we're still going to be consuming junk food, but maybe we'll keep a lid on it.
00:39:15.000But what I can say with not confidence, but what I think is the case, is as long as our kids...
00:39:20.000Are going through puberty on social media and video games, and they're not developing executive control, I do not think they will be able to keep a handle on this as adults.
00:41:05.000I think your strategy is very wise and for this reason.
00:41:09.000When social media began, you would put something up and then people could comment on it.
00:41:15.000Okay, that goes until about 2013, 2014. I think it's 2013 when Facebook introduces threaded comments.
00:41:23.000So now you put something up, someone says some horrific, nasty, you know, racist, whatever thing in your comment thread, and now everyone can reply to that comment.
00:42:18.000That's powering this whole thing that you cannot fight against and that we are moving in a direction as a society with the implementation of new, more sophisticated technology that's going to make it even more difficult unless you completely opt out.
00:42:36.000And some people are going to opt out, but it's going to be like My 600-lb Life.
00:42:41.000People that are realizing, oh my god, what have I done?
00:42:57.000The amount of focus that people have on comments and things, if you're addicted, if you're currently deep into it right now, where you're tweeting constantly.
00:43:07.000There's people that I follow that I know they're tweeting 12 hours a day.
00:45:33.000Yeah, that's the real fear of something like Neuralink or whatever.
00:45:37.000If they can figure out a Neuralink that doesn't require surgery, if they can figure out something that does that without surgery, the advantage of having that in a competitive sense, in terms of business and technology and industry.
00:45:50.000It's going to be massive and it's going to be so difficult to get people to not do that, that it's going to be like phones.
00:45:58.000I mean, I remember when I moved to Los Angeles in 1994, I bought a Motorola StarTAC and I was like, look at me.
00:46:55.000If there was a business thing that I had to deal with, there was something going on with my career, I could deal with it on the phone at Starbucks or wherever I was.
00:47:04.000My fear is that this is going to be that times a million.
00:47:07.000It's going to be you have to have it in order to compete.
00:47:10.000Just like you kind of have to have an email today.
00:47:13.000You kind of have to have a cell phone today.
00:47:27.000And when you simply connect people, you know, Mark Zuckerberg sometimes says, how could it be wrong to give more people more voice?
00:47:33.000If you're simply connecting people, making it easier for them to contact each other, you know, I think that's mostly going to have good effects.
00:47:48.000But when it became not technology making it easier for this guy to reach you or me to communicate with you, but rather a way to put things out to try to gain prestige for me in front of thousands or maybe millions of people.
00:48:07.000You know, what games are we playing as we go about our day?
00:48:10.000And the more people are playing the game of I'm struggling to get influence in an influence economy where everyone else is on these seven platforms.
00:48:18.000So I have to be too or they have an advantage over me.
00:48:21.000That is the way that things have been rewired already.
00:48:25.000Now, you're raising the possibility that the next step is more hardware-based, that it's going into our bodies, and I think that is likely to happen.
00:48:34.000And so I hope what we'll do now, and I hope my book, The Anxious Generation, will sort of promote a pause.
00:49:16.000Like, here's where we have to heed, I think, the warnings of the ancients, of religious authorities, of those who warn us that we are leaving our humanity and we're stepping into an unknown zone where, so far, the initial verdict is horrible.
00:49:33.000So, if we keep going without putting on some brakes, yeah, I think we're going to a horrible place.
00:49:40.000Yeah, my fear is that it won't be horrible.
00:49:47.000So my fear, my genuine fear, is the rewiring of the mind in a way that can enhance dopamine, enhance serotonin, and do things that can genuinely make you feel better.
00:50:39.000There's a lot of issues that come along with those, and yet there's an immense profit in making sure that people take those and stay on those.
00:50:47.000My fear is that if you can do something that allows people to have their mind function, have their brain, their endocrine system, have all these things function at a higher level, then everyone is going to do it.
00:51:03.000You would not want to just be natural and depressed if you could just put on this little headset and feel fantastic.
00:51:11.000And maybe it could be a solution to so many of our society issues.
00:51:16.000Maybe bullying would cease to exist if everyone had an increase in dopamine.
00:51:21.000It sounds silly, but if dopamine increased by...
00:51:24.000Look, if you have an entire society that's essentially on a low dose of MDMA, you're not going to have nearly as much anger and frustration.
00:51:34.000You also are not going to have as much blues.
00:51:37.000You're not going to have as many sad songs that people love.
00:51:41.000You're not going to have the kind of literature that people write when they feel like shit.
00:51:45.000It's unfortunate, but also as a whole, as a society, it probably would be an overall net positive.
00:52:29.000My fear is that it's going to just change what it means to be a human being, and my genuine feeling is that this is inevitable, and that as technology scales upward, this is unavoidable.
00:52:42.000Right now it certainly feels that way.
00:52:47.000And while I'm not optimistic about the next 10 years, I share your vision of what's coming.
00:53:20.000But I think we humans are an amazingly adaptable species.
00:53:28.000I think we can figure this out, and there are definitely pathways to a future that's much better.
00:53:34.000These technologies could, in theory, give us the best democracy ever, where people really do have the right kind of voice.
00:53:40.000It's not just the extremes who are super empowered, as it is now.
00:53:45.000So, you know, we're at a point in space and time, let's say, right now, and I can imagine a future that's really fantastic, but how do we get there?
00:54:38.000And let's test it in the place where we're most likely to succeed, which is rolling back the phone-based childhood and replacing it with a more play-based childhood.
00:54:49.000Oh, so actually, I said there are four norms.
00:55:33.000All kinds of ideas for how to help your kid have more independence, which makes them more mature, which makes them less fragile.
00:55:42.000So this fourth norm, this is the harder one.
00:55:45.000This is the one that we have to really overcome our fears of letting our kids out.
00:55:49.000And so actually, let me ask you, I think our disagreement last time was, I talked about this, and I said letting kids go for sleepovers and spend more time with other kids and unsupervised.
00:56:00.000And then you said, I think you said, no, I'm not letting my kid go to sleepovers because I don't trust the other families.
00:56:19.000If you know the parents and you trust the parents, it's a great way to give the kids independence and have them interact with other people.
00:56:25.000So tell me, what was your policy with your kids, with all three?
00:56:30.000When you let them out, like they could go out the door, get on a bicycle, walk seven blocks to a friend's house without any adult with them.
00:58:44.000And there are different moralities, and in some ways that's good, and left and right push against each other.
00:58:51.000So I'm very open to different moralities.
00:58:54.000But when a group makes something sacred, and they say, this is the most important thing, and nothing else matters other than this...
00:59:03.000Then they can kind of go insane and they kind of lose touch with reality.
00:59:08.000And I think, you know, again, I don't know the history of this particular movement, that horrible term, but there is a certain kind of morality which is all about, you know, oppression and victimhood.
00:59:20.000And once you, you know, someone, I guess, somewhere said, oh, you know, men who are attracted to boys or, you know, little girls are being, you know, are victims, I don't know what.
00:59:31.000Some, in some little eddy of weird morality, someone put that forward as a new victim class, because we've been trying to address victimhood all over the place.
00:59:41.000Once someone puts that up as a new victim class, and you have to do that, you have to change the terms.
00:59:47.000You change the terms, and then some others who share this morality, which is focused on not making anyone feel marginalized, not allowing any labels that will slander someone or make them look bad, I think people who approach children for sexual goals,
01:00:05.000I'm very happy to have them slandered and labeled and separated.
01:00:11.000But I suspect that some people, once they lock this in as a group that's being marginalized, they say, well, we have to defend them.
01:00:19.000And we don't think about what the hell we're actually saying.
01:00:25.000It seems like this is something that only exists with people in sort of an academic space, where it's almost like an intellectual exercise in understanding oppression.
01:00:53.000Before we go any further with this particular topic, I would want to point out one of the problems that our social media world has given us, which is this: somewhere in all of the academy and all the universities, some philosopher,
01:01:09.000let's say, proposed that term or raised an idea.
01:01:11.000So this has been going on for thousands of years.
01:01:13.000Someone in a conversation proposes a provocative idea.
01:01:16.000What if we think about this as a minor attracted person?
01:01:20.000They put that idea out, and then other people say, no, that's really stupid, and it doesn't catch on, because this is not an idea that's going to catch on, even in the academy.
01:01:29.000But I think where we are now is, I'm guessing, someone proposed this, somebody else got wind of it, posted it online, and now you're going to have a whole media ecosystem going crazy about this terrible idea.
01:01:44.000So maybe can you look up "minor-attracted person"?
01:01:48.000Is this just like a thing that was from one academic talk?
01:02:32.000So that brings us to the issue of identitarianism, which I think is a useful term for us these days.
01:02:42.000I think a lot of what's happened on campus is the move to focus on identity as the primary analytical lens in a number of disciplines, not in most disciplines, but in a lot of the humanities, the studies departments.
01:02:56.000So putting identity first and then ranking identities and saying some identities are good, some are bad. This really activates our ancient tribalism.
01:03:05.000And I think that the liberal tradition, going back hundreds of years, is really an attempt to push back against that and to create an environment in which we can all get along.
01:03:15.000And so, as I see it from inside the academy, we've always been interested in identity.
01:03:22.000There's a lot of research on it going back many decades.
01:03:24.000But something happened in 2015 on campus that really elevated identitarianism into the dominant paradigm, not dominant in that most people believed it, but dominant in the sense that if you go against it, you're going to be destroyed socially.
01:03:40.000That's what Greg Lukianoff and Ricky Schlott, their new book, The Canceling of the American Mind, is about.
01:03:45.000So, yes, it's the people who are putting identity first, and that's sort of their religion and their morality.
01:03:52.000I mean, they're welcome to live in the United States, but when they get influence in universities or in school boards, yeah, bad stuff will happen.
01:04:01.000It's just bizarre the effect that it does have when people push back against identity politics.
01:04:08.000It's a small, very vocal minority that pushes this agenda.
01:04:21.000This is, again, a really important point about how our society has changed.
01:04:25.000Those of us from the 20th century still think in terms of public opinion, like, do most people believe this, or do most people not believe it?
01:04:38.000And I think what's happened since social media became much more viral in 2009-2010 is that the extremes are now much more powerful and they're able to intimidate the moderates on their side.
01:04:49.000So on the right, sort of the center-right, what I call true conservatives, or Burkean, Edmund Burke conservatives, you know, they get shot at and they get excluded, and there's not many of them in Congress anymore.
01:05:01.000And on the left, you have the far left, the identitarian left, you know, shooting darts into people like me, into anybody who questions.
01:05:10.000And what you have is even though most people are still moderate and reasonable, our public discourse is dominated by the far right, the far left, and all these crazy fringe, you know, I mean, it can be, you know, neo-Nazis on one side and then these, you know, identitarians defending minor attracted people on the other side.
01:05:32.000Recognize that we've moved into this weird, weird world because of social media in which it's hard to see reality and in which people are afraid to speak up.
01:05:42.000And so we get warped ideas rising to dominance, even though very few people believe them.
01:05:48.000And I think this is where bots come into play.
01:05:52.000I really do believe that this is being amplified, whether it's by foreign governments or by special interest groups or by whoever it is that's trying to push these specific narratives.
01:06:04.000And this can bring us right back to TikTok and the national security threat.
01:06:08.000So Vladimir Putin was a KGB agent in the 20th century.
01:06:12.000And the KGB, going back, I think it was in the 50s, they had some sort of a meeting or something where they decided they were going to take what I think are called active measures.
01:06:21.000They were going to try to mess up American democracy.
01:08:29.000I think it was 84. And he was talking about how the work is already done.
01:08:33.000And that is just a matter of these generations now going into the workforce with Marxist ideas and with all this ideological subversion that the Soviet Union has injected into the universities.
01:09:19.000That's a good point, because that brings us to the big difference between democracies and autocracies.
01:09:26.000Back in the 1930s, when the West was in economic collapse, it was the Soviet Union, and then the Italian fascists, and then Hitler, the German fascists, who were making rapid economic progress.
01:09:41.000And the criticism of democracy has always been, it's chaotic.
01:09:50.000But why did we triumph in the 20th century over all these other models?
01:09:54.000Because democracy gives us a degree of dynamism.
01:09:57.000Where we can do things in a distributed way.
01:10:00.000We have people just figuring stuff out.
01:10:02.000We have an incredibly creative economy and business sector.
01:10:06.000And so democracies have this incredible ability to be generative, creative, regenerative.
01:10:13.000Unless you mess with their basic operating system and say, let's take this environment in which people talk to each other, share ideas, take each other's ideas, compete, try to get a better company.
01:10:25.000Let's take that and let's change the way people talk so that it's not about sharing information.
01:10:31.000It's about making them spend all day long, nine hours a day, competing for prestige on social media platforms and in a way that empowers everyone to complain all the time.
01:10:42.000This, I think, really saps the dynamism.
01:10:45.000I think this social media, what I'm suggesting, I haven't thought this through, but I'm suggesting is that whatever the magic ingredient that made democracy so triumphant in the 20th century, Western liberal democracy, American style democracy, whatever made it so triumphant is being sapped and reduced by the rapid rewiring of our society onto social media.
01:11:06.000And I think it's also being influenced, again, by these foreign governments that have a vested interest in us being at each other's throats.
01:11:23.000When you say that you've been attacked, what have you specifically been attacked about?
01:11:27.000Oh, it's just in the academic world, if you say anything about any DEI-related policy, you'll be called racist or sexist or homophobic or something.
01:12:28.000And I began actually trying to help the left stop losing elections like in 2000, 2004. As a Democrat, I thought I could use my research in moral psychology to help the Democrats understand American morality, which they were not understanding.
01:12:44.000Al Gore and John Kerry, I thought, did a very bad job.
01:12:47.000So I've all along been sort of critical of the left, originally from within the left.
01:12:53.000And that's a pretty good way to get a bunch of darts shot at you.
01:12:55.000Nothing terrible ever really happened to me.
01:12:57.000I don't want to, you know, lots of people have been truly canceled, you know, shamed, lost their jobs, considered suicide.
01:13:03.000So nothing like that has ever happened to me.
01:13:06.000But, you know, when there's some minor thing, you know, people take a line out of one of your talks.
01:13:12.000They put it up online with a commentary about what an awful person you are.
01:13:15.000Thousands of people comment on it or like it or retweet it.
01:13:28.000It was one of the major disputes when Elon bought Twitter.
01:13:31.000I mean, one of the things that's come out of Elon buying Twitter, and thank God he did, as much as people want to talk about the negative aspects, which are real, and I've seen racism and hate go up on Twitter,
01:13:44.000openly discussed, which is very disturbing.
01:13:47.000But what we did find out is that the government was involved in this, that the federal government was interfering with people's ability to use these platforms for speech.
01:13:59.000Over COVID. You mean because of COVID? Yes, that's right.
01:14:01.000But I feel like that's just a test run.
01:14:04.000If they can implement it for that,
01:14:07.000then they can implement it for so many different things.
01:14:09.000Dissent about foreign policy issues, dissent about social issues.
01:14:13.000There's so many different ways they can do it if they can somehow or another frame it in a way that this is better for the overall good of America.
01:14:27.000There has to be some, but most people focus on the content, and they think, if we can clean up the content or change the content, you know, in those Senate hearings we saw a couple months ago, if we can reduce the amount of suicide-promoting or self-harm-promoting content that our kids are seeing,
01:14:45.000Like, no, it's not primarily about the content.
01:14:47.000I agree with you that the government was influencing these platforms to suppress views that they thought were wrong and some of which turned out to be right.
01:15:05.000I'm a big fan of my friend Greg Lukianoff, who runs the Foundation for Individual Rights and Expression.
01:15:09.000So I think we shouldn't be thinking about social media like, well, how do we keep the wrong stuff off and only have it have the right stuff?
01:15:17.000I think almost only about architecture.
01:15:22.000And can we improve it in ways that are content neutral?
01:15:25.000Can we improve it in ways that aren't going to advantage the left or the right, but are going to make it more truth-seeking?
01:15:30.000And so Frances Haugen, the Facebook whistleblower, when she came out, She had all kinds of ideas about settings, things that Facebook could have done to reduce the incredible power of the extremes.
01:15:42.000The farthest right, 3%, the farthest left, 3%, and then a bunch of just random weirdos who just post a lot.
01:16:24.000And the problem with that is most people interact with things that rile them up.
01:16:29.000And so you're developing these platforms that are immensely profitable that ramp up dissent and ramp up anger and ramp up arguments.
01:16:40.000And like in the case of yourself, instead of just debating you on these issues and doing it in a good-faith manner, it's "Jonathan Haidt believes this."
01:17:19.000So Twitter only went to algorithms, I think, in 2017, so before then it was, you know, people who tweet a lot. People talk a lot about algorithms as though that's the cause of the whole problem.
01:17:36.000And they're not the cause of the problem, but man, are they amplifiers.
01:17:39.000And I think that's what you're saying.
01:17:40.000They're just super-duper amplifiers on whatever craziness would be there even without them.
01:17:46.000And so that certainly is shaping what we receive, what our children receive.
01:17:52.000And so this is some of the stuff that I think, again, we have to really protect our children from.
01:17:57.000To have a company able to micro-target their exact desires, even when they don't know what their desires are, it's a degree of control and influence over children in particular that I think they should just be protected from.
01:18:14.000Do you think that if you looked at algorithms, do you think that it's an overall net negative?
01:18:20.000And could the argument be made that algorithms should be banned?
01:19:34.000That move to virality, I think, is even more guilty of causing the problems than the algorithms.
01:19:40.000I don't know that it's necessarily one versus the other, but that's the way I see it, that we're in a world where the technology is so quick to ramp up whatever will most engage us, and that's mostly emotions such as anger.
01:19:53.000So yeah, that's why it feels like everything's burning.
01:19:58.000And this doesn't seem like it's slowing down.
01:20:02.000It seems like it's ramping up and it seems like they've gotten more efficient at the use of algorithms and all these different methods like retweeting and reposting and different things that sort of accentuate what people are upset about and what people get riled up about.
01:20:18.000Yes, I think it is accelerating, and for two reasons.
01:20:21.000One is that it's just the nature of exponential growth.
01:20:26.000I think in the 19th century, a guy named Adams gave us the Adams curve.
01:20:30.000He was noticing, like, wow, the amount of work we're able to do now that we're harnessing steam and coal keeps growing and growing and growing.
01:20:37.000And at some point, it's going to be going up so fast that it'll go up an infinite amount every day or something.
01:20:43.000You reach a point at which it's insane.
01:20:47.000And yeah, so many people think that we're now at the singularity.
01:20:51.000We're at the point at which things are changing so fast that we just can't even understand them.
01:20:56.000And we haven't yet mentioned the word AI. Now you bring in AI, and of course, you know, AI could unlock extraordinary material progress.
01:21:05.000And Marc Andreessen has been arguing that.
01:21:07.000But as a social scientist, I fear it's going to give us material progress and sociological chaos.
01:21:14.000It's going to be used in ways that make our already unstable social structures and systems even less stable.
01:21:22.000Well, what's very bizarre that we're seeing with the initial implementation of it, specifically with Google's version of it, is that it's ideologically captured.
01:21:32.000And that was so irresponsible of Google to do.
01:21:34.000So, no, I'm glad we have a chance to talk about this because I'm really horrified by what Google did in introducing Gemini.
01:21:39.000And just to give a little background here, so I'm sure many of your listeners know, Google Gemini was programmed to answer in ways that basically, you know, the most extreme DEI officer would demand that people speak.
01:21:53.000And so, you know, if you ask for a picture of the Founding Fathers, they're multiracial or all black.
01:22:01.000Even Nazis had to be multiracial or black.
01:22:04.000So there's two things to say about this.
01:22:06.000The first is that Google must be an unbelievably stupid company.
01:22:11.000Like, did nobody test this before they released it to the public?
01:22:14.000And obviously, Google is not a stupid company, which leads me to my next conclusion, which is: Google did such a stupid, stupid thing, so disgraced its product, the product it's banking so much on (I mean, it depends a lot on the success of Gemini), and now they've alienated half the country right away.
01:22:31.000On the first day, practically, they alienated them.
01:22:46.000questions a DEI-related policy on campus, they would get attacked.
01:22:51.000And that's what most of the early blow-ups were.
01:22:53.000I think you probably had Brett Weinstein on here.
01:22:56.000That's what happened to Erika Christakis and Nicholas Christakis at Yale.
01:23:01.000If people wrote these thoughtful, caring memos opposing a policy, there would be a conflagration, they'd be attacked, and they would sometimes lose their jobs.
01:23:12.000So that's what happened to us in universities in 2015 to usher in our now nine years of insanity, which I think might be ending.
01:23:19.000I think last fall was so humiliating for higher ed that I think we might be at a turning point.
01:23:26.000I suspect that Google was suffering from an extreme case of structural stupidity because surely a lot of those engineers could see that this is terrible.
01:23:35.000This is a massive violation of the truth and part of Google's brand is truth and trust.
01:23:41.000So I suspect they were just afraid to say anything.
01:23:44.000And that's why Google made this colossal blunder of introducing woke AI at a time when we desperately need to trust our institutions that are related to knowledge.
01:23:56.000And Google was trusted, and now they've lost a lot of it.
01:24:14.000And there was recently, was it David Rozado or who was it, who put out some listing of how far left each of the different AI products are.
01:24:24.000So you can certainly say that ChatGPT is not politically neutral, but you wouldn't say from that that the people at ChatGPT or OpenAI are stupid.
01:24:34.000You would not look at this product and say, how could they be so dumb as to have it be left-leaning?
01:24:40.000But with Google, you have to say, how could they be so dumb as to produce black Nazis for us?
01:24:56.000With DEI and with the universities and the education system, it just seemed like you had to apply that to artificial intelligence because you're essentially, you're giving artificial intelligence these protocols.
01:25:11.000You're giving it these parameters in which it can address things.
01:25:16.000And if you're doing it through that lens, this is the inevitable result of that.
01:25:27.000But if you say DEI, if you apply that to everything across the board and don't make exceptions in terms of historical accuracy, the founding fathers of America being all black...
01:25:40.000Again, I'm not an expert in AI, but large language models are basically just consuming everything written and then spitting stuff back out.
01:25:48.000And so it might be that most stuff is written by
01:25:52.000people on the left, who are dominant in universities.
01:25:55.000They probably publish more books, whatever.
01:25:57.000Right, but there's nothing written about black Nazis.
01:26:22.000Benetton ads had much more diversity in the 1980s and 90s.
01:26:25.000So no, I would agree that the Gemini case, clearly someone deliberately programmed in all kinds of rules that, yeah, they seem to come from a DEI manual just without much thinking.
01:26:35.000Yeah, how do they come back from that?
01:26:57.000Companies considering AI deal that would build on search pact.
01:27:01.000Apple also recently held discussions with OpenAI about deal.
01:27:06.000On this news, then a big investment happened too.
01:27:09.000Magnificent Seven adds $350 billion on Gemini's reported iPhone deal.
01:27:13.000So, because Google has implemented AI into their phones, specifically Samsung's new Galaxy S24 Ultra, it has a bunch of pretty fantastic AI features, one of them being real-time translation, your ability to summarize web pages instantaneously,
01:27:34.000summarizing notes, bullet points, very helpful features.
01:27:39.000So because of that, another one is your ability to circle any image and it automatically will search that image for you.
01:28:08.000I guess the point that I'd like to add on, which I hope will be useful for people, is part of what we're seeing across our institutions is a loss of professional responsibility, a loss of people doing their jobs.
01:28:22.000And I don't mean base-level employees.
01:28:30.000Universities must be completely committed to the truth, research, discovery.
01:28:36.000Journalists must be committed also to the truth and methods to find the truth.
01:28:40.000And what we've seen in the 2010s especially is many of these institutions being led away from their mission, their purpose, towards the political agenda of one side or another.
01:28:56.000And so I think this is what we're seeing.
01:28:58.000And if we're going to make it through this difficult period, we need some way to find the truth.
01:29:04.000And the more we've gone into the Internet age, the harder it is to find the truth.
01:29:08.000Like, it used to be incredible.
01:29:11.000We could just say, you know, hey, look this up, and we got it.
01:29:14.000But on anything contested, it's just very hard to find the truth.
01:29:18.000And so that's why I'm especially disappointed in Google.
01:29:44.000Or is it you decide that one side is good overall, net positive, the other side is net negative, and whatever you can do to subvert that other side is valuable?
01:29:56.000And so that's a mindset in which the ends justify the means.
01:29:59.000And so part of the genius of American liberal democracy was to calm down those tribal sentiments to the point where we could live together, we could celebrate diversity in its real forms, we could get the benefits of diversity.
01:30:12.000And that was all possible when we didn't feel that the other side was an existential risk to the country, that if the other side gets in, it's going to be the end.
01:30:23.000And that's an image that helped Donald Trump win.
01:30:25.000There was an essay, what's it, by Michael Anton, I think, called The Flight 93 Election.
01:30:30.000You know, if you're on Flight 93 being hijacked to crash into Congress and, you know, if you do nothing, you're going to crash into Congress, you'll do anything.
01:30:39.000And so he framed it as a sort of a Hail Mary pass that, you know, patriotic Americans were supposed to vote for Donald Trump.
01:30:48.000That mindset of the ends justify the means, the situation is so dire that even violence, even violence is justified.
01:31:12.000But once you say the ends justify the means, and we can cheat, we can lie, we can subvert the company's purpose because the end we're fighting for is so noble, well, the other side's going to do the same thing.
01:31:22.000And before you know it, your culture war becomes a real war.
01:31:26.000Yeah, and you're seeing that in the news, how it's implemented in the news.
01:31:30.000I mean, I'm sure you're aware of this recent Donald Trump speech where he talked about a bloodbath.
01:31:40.000See if you can find that, Jamie, because it's actually important to highlight how...
01:31:44.000not just inaccurate, but just how deceptive the media was in their depiction of what he said, and that they are taking this quote out of context and trying to say that there's going to be a civil war if he doesn't get elected,
01:32:02.000which is not what he was talking about at all.
01:32:07.000Because it's so disturbing that they would – first of all, they would think that they could get away with it in this day and age with all the scrutiny and all the – with social media and all the independent journalists that exist now, which is one of the more interesting things about the demise of corporate media,
01:32:25.000Trust in corporate media is at an all-time low, and so this has led to a rise in true independent journalists.
01:32:33.000The real ones out there, the Matt Taibbis, the Glenn Greenwalds, the people that are actually just trying to say, what is really going on, and what are the influences behind these things, and why are these things happening?
01:32:55.000If you're listening, President Xi, and you and I are friends, but he understands the way I deal.
01:33:00.000Those big monster car manufacturing plants that you're building in Mexico right now, and you think you're going to get that, you're going to not hire Americans, and you're going to sell the cars to us?
01:33:48.000I actually say that the date, remember this, November 5th, I believe it's going to be the most important date in the history of our country.
01:34:05.000That sounds like the Flight 93 election argument, that if I don't win, the country's over.
01:34:10.000Yeah, but what he's talking about is this subversion of our economy and the subversion of our democracy, that we'll never have an election again.
01:34:20.000I don't think he's saying that it'll be a bloodbath in terms of a civil war.
01:34:23.000He's saying the economy's going to be destroyed.
01:35:58.000But the aside was not about the economy.
01:36:02.000The aside was him making one of these typical asides about how important he is.
01:36:09.000Joe, I think we're not going to settle this.
01:36:11.000Look, I do agree that the media as a progressive left-leaning institution like universities has violated its duty many times to the truth and thereby lost the trust of much of the country.
01:36:26.000Most of the people who work in these industries, I think, are wonderful and are trying to do a good job.
01:36:31.000But the net effect, and this is my point about structural stupidity, is that during our culture war, institutions that have had very little viewpoint diversity have been subject to hijacking by those with a political agenda.
01:36:45.000So I agree with you about that, although I disagree with you about what that comment from Donald Trump meant.
01:36:52.000It sounded to me like it was not taken out of context.
01:36:55.000Well, he was talking about the economy, though, specifically.
01:37:57.000What I'm saying is that most people are reasonable wherever you go, but in the social media age, it's no longer about what most people are like.
01:38:04.000It's about how much power do the extremists have because anyone now has the power to hijack, threaten, intimidate.
01:38:34.000Many people, especially those who listen to conservative sources, might think that professors are mostly tenured radicals who care more about Marxism than about educating their kids.
01:39:30.000That's what I think the effect of not the original social media platforms like MySpace or early Facebook, but of the hyperviral ones that we got in the 2010s.
01:39:38.000And the result of that, in terms of people terrified about people attacking them, is what you get when you got those people from Penn, from Harvard.
01:39:50.000We're talking about this rampant anti-Semitism on campus where people were actively calling for the death of Jews, saying that this does not constitute harassment unless it's actionable.
01:40:07.000What is that like as a person when, you know, you are an academic and you are a professor, when you see that from these, especially from somewhere like Harvard?
01:40:27.000And I can understand the argument that those presidents were making.
01:40:32.000The argument was a very narrow technical argument about whether students should be allowed to say from the river to the sea, Palestine will be free.
01:40:40.000And so I understand why it would have been reasonable for them to say, well, we're not going to punish students for saying that.
01:40:48.000That is political speech that's protected under the First Amendment.
01:40:53.000So I understand the point that they were making.
01:40:56.000But they were such screaming hypocrites in making that point, because, and this is what The Coddling of the American Mind was all about, how did it happen that, you know, if a professor or administrator writes a single word that a student objects to and calls racist,
01:41:13.000like, you're going to fire someone, or let someone be tormented and fired, because they said something
01:41:19.000that someone interpreted in a certain way?
01:41:21.000And that led us to be super hyper crazy sensitive about every word we say, because you never know when it'll explode and cause a scandal.
01:41:29.000And so for the presidents to say, oh yeah, you know, anything anyone ever said between 2015 and yesterday would be punished if anyone was bothered by it.
01:41:39.000But from the river to the sea, oh yeah, sure, that's constitutionally protected.
01:41:42.000It wasn't just from the river to the sea.
01:41:44.000It was the literal expression, death to Jews.
01:41:59.000The deeper question is about political speech, but you're right that, as Stefanik, I believe, was asking them, it was about calls for genocide.
01:42:07.000And so, yes, calls for genocide, it seems to me.
01:42:11.000Again, I'm not a First Amendment lawyer.
01:42:13.000Maybe on the First Amendment, legally, you can't be arrested for it.
01:42:17.000But for God's sakes, on a university campus where you're trying to make everyone feel included, you can't even comment, not just on the calls for genocide, you know, but on the actual events of October 7th.
01:42:28.000So that, I think, is what really brought higher ed to really a nadir, a low point in public esteem, like literally a low point in public esteem.
01:42:37.000I think it was a wake-up call for a lot of people that are kind of on the fence about how big the issue is.
01:42:44.000Because these are the same people that call for you being kicked out of the university if you deadname someone.
01:42:57.000And so, actually, you know, last semester was the worst one ever for higher education.
01:43:03.000Data from Gallup and Pew show this about the public: higher ed used to have an incredible brand, a global brand. We were the best, everyone wanted to come here, scientific innovation, all the top academics were here in the United States.
01:43:16.000And in 2015, people on the left had a very high opinion of higher ed, and actually people on the right had a moderately high opinion of it.
01:43:23.000And then since 2015, it's dropped, not just among people on the right, but among centrists and moderates as well.
01:43:29.000So higher ed really lost the trust of most of the country.
01:43:34.000And I was running an organization called Heterodox Academy.
01:43:37.000I started it with some other social scientists; it advocates for viewpoint diversity.
01:43:41.000And that's why I was kind of a target sometimes, because here I am saying, we need viewpoint diversity.
01:43:45.000We need some conservatives, some libertarians.
01:43:48.000We need to not all be on the same side politically.
01:43:51.000Which is an amazing thing to fight against.
01:44:27.000So I think we hit a low point in the fall in such a way that I'm actually optimistic that things are going to change.
01:44:33.000Because I've been concerned about these issues in universities, the culture issues, since 2014-2015 when Greg Lukianoff and I wrote our first Atlantic article titled The Coddling of the American Mind.
01:44:42.000And every year it's gotten worse and worse and worse.
01:44:45.000There's never been a turnaround until last year.
01:44:47.000And as with The Emperor's New Clothes, you know, people can see that something is stupid and crazy and wrong, but they won't say anything.
01:44:54.000But then when somebody does, then everybody can speak.
01:44:57.000And I'm feeling finally, for the first time since 2015, I'm feeling that people sort of understand, you know what, wait, that was crazy what happened to us.
01:45:11.000Let's start saying, maybe that is not right.
01:45:14.000So I think that things are actually going to turn around.
01:45:17.000Maybe not at the Ivies, although there are movements of faculty there saying, no, let's return to academic values, the pursuit of truth.
01:45:24.000So I think what I'm hoping, what I think is likely to happen, is we're going to see a split in the academic world.
01:45:30.000That is, there are already schools like Arizona State University.
01:45:33.000There are schools that already have basically said no to all the crazy stuff, and they're focusing on educating their students.
01:45:40.000And I think we're going to see more students going that way.
01:45:42.000The University of Chicago is another model.
01:45:44.000So I think there are a few schools that departed while almost all the other schools went in the same direction.
01:45:49.000But I think now that's going to change and it can change actually pretty quickly because most of the university presidents don't like this stuff.
01:45:58.000All the crazy politics, the activist students, it made their job very difficult.
01:46:03.000So I'm actually hopeful that we are starting to see some university presidents standing up and saying, you know, it's not okay to shout down every conservative speaker.
01:46:11.000Like, no, we're not going to allow that.
01:46:12.000So we'll see a year from now; if I come back in a year or two, we'll see.
01:46:17.000But I think things are actually beginning to get better for the first time since 2015. Well, I hope you're correct.
01:46:23.000And I do agree that the pushback was so extreme that some action is likely to take place.
01:46:29.000I think the first step of that has got to be to allow people with differing perspectives to debate and not shout them down.
01:46:39.000And also to show that that shouting people down and setting off fire alarms is shameful.
01:46:45.000It's disgraceful in a higher education institution.
01:46:50.000If there was any punishment, the students would change very quickly.
01:46:53.000The students are very concerned about getting a job, about their futures.
01:46:57.000And what the presidents who didn't do anything early on conveyed was, you can yell and scream all you want, nothing will happen to you.
01:47:03.000You can bang on the glass and frighten speakers, nothing will happen to you.
01:47:07.000You can throw rocks through windows, nothing will happen to you.
01:47:10.000And of course, that just brought us more obnoxious behavior on campus and shame to higher ed in the eyes of the country.
01:47:19.000So we had a brand that was based on extreme excellence and truth.
01:47:25.000I think we damaged our brand very severely.
01:47:28.000I think finally now there's a reckoning and a realization of what we've done.
01:47:32.000And I think we're going to see a recovery, an uneven recovery.
01:47:35.000But I do think that a year or two from now, the mood, well, who knows what's going to happen with the election and whether there'll be a bloodbath.
01:49:16.000We need more viewpoint diversity among the professors, or at least we need more toleration of people who are centrist or libertarian.
01:49:24.000So that's one on the faculty side, what we need to do, and also the culture on campus.
01:49:29.000But I also co-founded another organization called the Constructive Dialogue Institute with a woman named Caroline Mehl.
01:49:35.000And what we did is we took some of the insights of moral psychology and some of the content from my book, The Righteous Mind, and it has evolved.
01:49:42.000It's now six 30-minute modules that teach you about moral psychology.
01:49:58.000So if people go to ConstructiveDialogue.org, the program is called Perspectives.
01:50:03.000It's being used in, you know, I think more than 50 universities now.
01:50:06.000So there are things that we can do, but it's going to take leadership and good psychology.
01:50:12.000That's so important, what you just said. And I think those programs will gain momentum as people recognize that it's really beneficial to all to have these ideas debated.
01:50:26.000If you truly believe that ideas opposed to your ideology are evil, you should be able to debate them.
01:50:33.000And the only way to do that is for the other person to have the ability to express themselves.
01:50:38.000And for you to counter the points that they make.
01:50:42.000And this is what many commentators on the left have been pointing out since 2015. Van Jones has an amazing talk.
01:50:48.000He's a progressive Democrat, a well-connected, smart person.
01:50:55.000And he's been pointing out, there's a great talk he gave at the University of Chicago, I have a quote on this in The Coddling of the American Mind, where he talks about the move to protect students from bad feelings, the move to protect their emotional safety,
01:52:39.000For some reason, the Ivy League schools, that's what's really surprising.
01:52:42.000I thought it was just like, well, the elite schools.
01:52:43.000No, it's actually the Ivies that are the places where the worst antisemitic threats, intimidation, and even some violence, or threats of violence, are happening.
01:52:52.000Something about the Ivies makes them more extreme.
01:53:25.000And then I think also the Ivy League is full of really rich kids.
01:53:31.000There was a statistic a number of years ago that the top schools have more students from the top 1% of the income distribution than from the bottom 60%.
01:53:42.000So there's a real concentration, especially in the Ivies, of rich kids who don't need to worry as much about getting a job and have the bandwidth to devote themselves to politics while they're students.
01:53:58.000I just fear for the young people that come out of that, who have these distorted perspectives and have to kind of rewire their view of the world once they get out.
01:54:13.000It's almost like taking someone from a cult and trying to just delete the indoctrination.
01:54:24.000And it's almost impossible to do that, especially if most of what's coming in is coming in from TikTok, not from your parents or your friends or your teachers.
01:54:34.000So again, back to the question of the TikTok ban.
01:54:38.000The issue here is not, should we ban TikTok?
01:54:41.000The issue is, should American law require a divestiture of TikTok from a Chinese corporation that is beholden to the CCP? That's the question.
01:54:51.000There's an issue happening in Texas currently where one of the porn sites has pulled out of the state because Texas requires age verification.
01:55:01.000And so there's all this pushback about whether or not they should be able to require age verification.
01:55:07.000You have to be 18 to use porn websites, which I think is very reasonable.
01:55:14.000Yes, it's insane that we're even debating it.
01:55:16.000Yeah, we're running a mass psychology experiment on children by having smartphones with large screens and having instantaneous access to porn.
01:56:10.000You know, if that's what you think this sex stuff is, when you're an 11-year-old and you see this stuff, you're not going to be like, ooh, I want that to happen to me.
01:56:17.000It's also so distorted, the relationships in these porn videos.
01:56:32.000And once again, I'm not going to tell adults what they should do with their spare time.
01:56:36.000But for God's sakes, I am going to try to tell companies that they can't just have access to my kids from the age of 9 or 10 and do what they want with them.
01:56:44.000So, you know, I don't know the details of the Texas law.
01:56:46.000But I think we've got to do something to age-gate pornography.
01:59:40.000And he and I had a plan for some, like, guerrilla art campaign with posters, you know, linking, you know, Instagram to cigarettes, that sort of thing, a couple years ago.
01:59:48.000So Dave had the idea to really go big.
01:59:51.000And so Dave has built a 12-foot-tall milk carton of the thing you just showed.
01:59:57.000It's going to be on the National Mall in Washington this Friday.
02:00:12.000There are lots of organizations that are joining us here, but we're starting a national movement to get parents, to encourage parents, to work together.
02:00:21.000Because as I said, we can escape this if we work together.
02:00:25.000But if a lot of us say, we're not going to give our kids smartphones until 14, we're not going to let them open an Instagram or TikTok account until they're 16, we're going to ask our schools to go phone-free, and we're going to give our kids a lot more independence of the sort that we had in a much more dangerous world.
02:00:42.000If we do those four norms, we really can turn that around.
02:00:45.000And I'm confident we are at the tipping point right now.
02:00:50.000Even a few months from now, by July and August, or let's say by September, when school starts again in the fall, I think there's going to be a different vibe about phones and the role of technology in kids' lives.
02:01:00.000Well, I hope you're right, Jonathan, and I really appreciate you, and I really appreciate you writing this and spending so much time on this and thinking about it so thoroughly.
02:01:09.000The Anxious Generation: How the Great Rewiring of Childhood Is Causing an Epidemic of Mental Illness.