The Glenn Beck Program - December 01, 2018


Ep 13 | Greg Lukianoff | The Glenn Beck Podcast


Episode Stats

Length

1 hour and 21 minutes

Words per Minute

186.92

Word Count

15,277

Sentence Count

1,060

Misogynist Sentences

4

Hate Speech Sentences

13


Summary

In this episode of The Glenn Beck Podcast, Glenn Beck talks with Greg Lukianoff, president of the Foundation for Individual Rights in Education, about his new book, "The Coddling of the American Mind: How Good Intentions and Bad Ideas Are Setting Up a Generation for Failure," co-written with Jonathan Haidt.


Transcript

00:00:00.000 The Blaze Radio Network, on demand.
00:00:21.380 Who were you 10 years ago?
00:00:25.200 Well, 10 years ago, I had just recently become the president of the Foundation for Individual Rights in Education.
00:00:33.260 That followed about six years of me being the legal director.
00:00:36.760 And what I was doing there was I was following my lifelong dream of defending freedom of speech.
00:00:43.420 I went to law school to do First Amendment work.
00:00:46.500 It's my lifelong passion.
00:00:48.560 I'm a first-generation American, and I think one of the most amazing things about the U.S. is freedom of speech.
00:00:53.120 And I ended up defending it on campus, somewhat to my surprise.
00:00:58.720 And from pretty much day one, I realized, and day one was 2001 for me, that it was a lot easier to get in trouble on the modern college campus for what you said, even back then.
00:01:09.540 But what led me down the path to the book, frankly, is a very, you know, personal story.
00:01:16.640 I always had issues with depression.
00:01:19.660 You know, I just kind of took it for granted.
00:01:21.560 I'm, you know, Russian-Irish, you know.
00:01:23.140 It just goes with the territory, I assumed.
00:01:25.220 But I got into a really bad one in 2007, when I was about two years into my presidency of FIRE.
00:01:31.340 And I was hospitalized.
00:01:34.260 I actually talk about, like, how bad it got.
00:01:37.300 And what saved me, I'll say it flat out, is cognitive behavioral therapy.
00:01:43.500 So I am recovering from this devastatingly, terrifyingly bad depression, and I'm learning about CBT.
00:01:50.980 CBT is this practice by which you learn to talk back to your own exaggerated thoughts, which everybody has to a degree, but depressed and anxious people have them in particular.
00:02:01.620 So you learn this vocabulary, like catastrophizing, you know, like don't catastrophize.
00:02:06.560 Like if you go on a date and suddenly you say, I'm going to die alone if it doesn't go well.
00:02:11.600 You're catastrophizing.
00:02:13.160 My wife always makes fun of me because I'm also engaged in a lot of binary thinking.
00:02:16.840 Not that I think it's good, but I'm always like, it's either going to be great or it's going to be terrible.
00:02:21.160 Like it's like, and I have to remember this.
00:02:22.900 Have you ever heard the song, I Go to Extremes by Billy Joel?
00:02:25.480 Yes, exactly.
00:02:27.040 We have a lot in common.
00:02:29.060 We've got to talk that down.
00:02:31.780 Labeling, that's particularly interesting for campus.
00:02:35.180 These are all called cognitive distortions.
00:02:37.120 These are things that people do to distort the world around them.
00:02:41.020 Labeling, overgeneralizing, all of these kind of things.
00:02:44.220 And as I'm studying these things that are effectively making me well, you know, and it takes a while.
00:02:50.980 It's a daily practice.
00:02:52.100 It doesn't work if you just know it intellectually.
00:02:54.300 You have to practice this every day.
00:02:56.500 But I was doing this at the same time while I was working on college campuses.
00:03:00.660 And I was looking around going, wow.
00:03:03.160 It's like every administrator is telling students, by the way, everybody should label, overgeneralize, catastrophize, engage in binary thinking, mind reading.
00:03:13.260 All these things that they tell you not to do because they'll make you anxious and depressed.
00:03:17.140 And I was like, this is funny.
00:03:18.540 So while I was defending freedom of speech, I'm also like, we're teaching really dysfunctional intellectual habits on campus, whether we know it or not.
00:03:25.240 But thank goodness the students weren't learning it.
00:03:29.760 At that point, back in 2008, say, 2009, students were, believe it or not, still the best constituency for freedom of speech.
00:03:37.900 They generally came to the defense of it, better so than usually professors and certainly than administrators.
00:03:44.140 And so, you know, it's like it was we were modeling distorted behavior, but the students weren't buying it.
00:03:53.060 And then seemingly overnight in 2013, 2014, suddenly the students started demanding people get disinvited from campus at a much faster rate.
00:04:04.260 They were demanding microaggression policies, trigger warning policies.
00:04:09.020 They're even being told at places like Oberlin to avoid anything.
00:04:12.460 They had a list of things that professors should avoid that included anything that touches on racism, sexism, classism.
00:04:18.720 It was a list of about 12 things.
00:04:19.920 What are you talking about?
00:04:20.780 What can you teach or talk about?
00:04:22.860 What good book can you read under those circumstances?
00:04:27.040 And this really happened very quickly.
00:04:29.760 And at that point, I was lucky enough to be friends with Jonathan Haidt, who became my co-author on this.
00:04:36.920 I talked to him about my CBT idea.
00:04:39.760 And the reason why it was really connected, it wasn't just me
00:04:42.640 arbitrarily connecting these things.
00:04:44.380 The students, the thing that really made the students different in 2013, 2014, was that they were justifying why this person can't speak on my campus by appealing to sort of a medicalization.
00:04:56.600 They were saying, not I'm offended, not this person's evil.
00:05:00.420 I might say that too.
00:05:01.980 But they were instead saying, it will medically harm me.
00:05:06.200 Actually, not so much me.
00:05:07.400 It will medically harm someone, some other undescribed person over there.
00:05:10.760 They will be triggered by it.
00:05:12.080 It will be traumatic.
00:05:13.120 And you'll be damaged forever.
00:05:15.360 And I was like, OK, this isn't right.
00:05:17.900 Like, I know enough, because I also became kind of a psychology hobbyist after that.
00:05:20.980 I knew enough that I was just kind of imagining the psychologist who, when you come into his office, you know, and you tell him you're anxious, goes, oh, you must be in danger then.
00:05:31.600 You must be in a great deal of danger.
00:05:33.160 And it's like, it's totally dysfunctional.
00:05:35.120 So I tell this to John.
00:05:36.840 John gets excited about writing an article with me, which is a dream come true for me because I'm, you know, already a huge fan of his work.
00:05:42.920 He wrote a book called The Happiness Hypothesis, which I was a huge fan of.
00:05:46.380 Also The Righteous Mind, of course.
00:05:47.980 Brilliant.
00:05:48.680 And so I was thrilled to write this article with him.
00:05:51.520 And so we wrote an article called The Coddling of the American Mind, which came out in the summer of 2015.
00:05:57.660 And it solved everything.
00:06:00.320 And that's how we fixed it.
00:06:02.280 And we were waiting to get our heads chopped off, basically, because we were taking on all these sacred cows in higher education and making the point that these are dysfunctional.
00:06:11.620 These are teaching people bad habits of free speech, but they're also teaching them the habits of depressed and anxious people.
00:06:18.720 And then things got so much worse on campus.
00:06:23.140 There were some absolutely great protests in the fall of 2015, but others of them were demanding censorship.
00:06:30.200 You know, that was the famous Halloween costume fight over at Yale.
00:06:35.820 And we talk about the case of Dean Spellman at Claremont McKenna College, where she clearly tried to write a letter that was very sympathetic to a student, but it touched on this idea that people had said the student didn't fit into the Claremont McKenna mold.
00:06:50.020 They interpreted this as if she was some howling racist and got her kicked off of campus.
00:06:55.980 And then things just seemed to get worse.
00:06:58.240 And for the first time ever, of course, in 2017, we start seeing real large scale violence in response to speakers on campus.
00:07:05.940 And also Weinstein.
00:07:08.800 Oh, geez.
00:07:09.420 Yeah.
00:07:10.260 That's terrifying.
00:07:11.720 He figures heavily in the book because his story is just so amazing.
00:07:14.800 And you always have to remind people, you know, what was he saying?
00:07:19.100 He was saying, I actually don't think it's a good idea to divide this campus in terms of race because they literally were telling white professors and white students to get off campus as some kind of social experiment that was supposed to be healthy.
00:07:31.800 If you know basic group polarization psychology, no, you're actually just going to make it worse.
00:07:38.200 Weinstein was entirely right, but he was also old-fashioned, you know, old-fashioned as of, like, maybe.
00:07:44.660 2010.
00:07:45.460 Yeah, way back in 2010.
00:07:47.040 Yeah, right.
00:07:47.820 This is what liberals used to think.
00:07:49.320 About actually wanting people, you know, to meet friends and talk across various lines.
00:07:54.700 So let me ask you two questions.
00:07:56.180 Sure.
00:07:56.580 One, define what liberal means today.
00:08:00.500 That's an excellent question.
00:08:02.700 I've wanted to write an article called The Crisis in Vocabulary, you know, with a big exclamation mark at the end of it.
00:08:08.420 Who knows?
00:08:09.120 Yeah, because I don't know what a conservative means.
00:08:10.900 I don't know what a liberal means.
00:08:11.960 I've always considered myself a classic liberal.
00:08:14.060 Yeah.
00:08:14.420 But that's always been murky to say because people are like, what is that?
00:08:18.100 Yeah.
00:08:18.600 It's somebody who believes in rights.
00:08:20.620 Right.
00:08:21.020 But people, you know, you used to, you lived in San Francisco.
00:08:25.420 You were ACLU.
00:08:26.860 You were, if I'm not mistaken, more of a classic liberal.
00:08:31.600 So classic liberals, that doesn't even seem to play a role in so many people's lives now.
00:08:39.300 Yeah, it's very strange because we, my father, you know, he's a Russian refugee and he talks about how some of the shifting in the term came from the fact that socialist was a really bad word in the U.S.
00:08:51.700 So liberal increasingly came to mean someone who believes in civil liberties, which I certainly agreed with, but who also saw a big role for government. But nobody was willing to say socialist.
00:09:01.720 So, whereas like in Denmark, you know, they still use liberal to mean someone who believes in lower regulations, but also civil liberties, that the state should play a somewhat limited role, which is more, you know, the tradition I come from.
00:09:14.880 I do think we were done a little bit of a favor terminology wise that around this time people started calling themselves progressives.
00:09:22.220 I agree.
00:09:23.020 Because that gets you back, and I know you're a huge fan of Woodrow Wilson.
00:09:27.500 Y'all love him.
00:09:29.020 I love him.
00:09:29.900 Who helped create the country my father grew up in, which didn't work out all that well.
00:09:34.280 Yeah, no, it didn't.
00:09:35.280 Yugoslavia.
00:09:35.920 Yeah.
00:09:36.720 Not so successful.
00:09:37.820 No.
00:09:38.380 He had a lot of really bad ideas.
00:09:41.000 A lot of really bad ideas.
00:09:42.760 Okay, so the next step on that is you did not get the blowback that you expected.
00:09:50.500 For the 2015 article.
00:09:51.840 Yes.
00:09:52.140 Yeah, the 2015 article.
00:09:53.580 You know, we're taking on all these sacred cows, and we're waiting.
00:09:56.420 You know, it's like, oh, they're going to come kill us.
00:09:58.440 And from person after person, from comment after comment, we got, this was thoughtful.
00:10:04.520 Probably the most beautiful thing I read was a young woman writing that her brother had committed suicide by jumping off of a building.
00:10:13.640 And she wrote about how she was in a class where they showed a movie that included a scene in which someone kills himself by doing that.
00:10:20.180 And she realized nobody in that room knew that that had happened to her.
00:10:25.080 And being able to be normal, with nobody paying attention to her at that moment, was the first time she had felt normal since the death of her brother.
00:10:32.940 And there were many stories like this about our basic point saying, no, you're not doing people who have aversions to things any favors by saying, oh, by the way, now you can avoid them for the rest of your life.
00:10:45.820 And that's something that I have to say over and over again.
00:10:49.160 If you say, oh, you don't like spiders, okay, we're not going to talk about spiders anymore.
00:10:54.240 You're actually giving them more power.
00:10:56.020 And you can turn something that might just be a strong aversion to something into something that's more like a phobia.
00:11:02.040 And worst of all, you can turn it into something called a schema, a self-definition, something that you define as part of you.
00:11:10.640 And one of the things that I think is so messed up about what you see on campus is we're doing what I would call negative schema training.
00:11:17.880 We're more or less telling people it's like you really need to internalize the belief that you're wounded forever.
00:11:23.160 That's perverse.
00:11:24.980 Yes.
00:11:25.140 So you didn't get the pushback that you thought.
00:11:29.280 I see postmodernism infecting social justice, to where it is just this nonsense that's happening.
00:11:42.260 And that's a lot of it's coming from the universities.
00:11:45.500 So are the university, are the professors beginning to wake up to this?
00:11:51.860 Or did they not see it?
00:11:53.600 Because, you know, it was brought over in the 70s as a plan to infect the universities.
00:12:01.260 So are they not part of it?
00:12:04.040 Are they just so blind with their own education that they didn't think this through, that this is going to happen?
00:12:10.740 What's happening there?
00:12:12.020 Well, definitely, you know, there are professors who, you know, would consider themselves lifelong liberals who have become some of our best allies on campus.
00:12:19.960 Partially because, well, some of them were good on free speech to begin with.
00:12:23.000 You know, there are people all across the spectrum who can be both bad or good on free speech.
00:12:29.600 But one thing that we have seen lately is that professors are starting to get that some of these tools are being turned on them.
00:12:36.300 And in some cases, in really remarkable circumstances, like Bret Weinstein trying to say, just speak common sense about how you get people to get along.
00:12:45.860 Erica Christakis, you know, sending out something that, really, the students from the 1960s would have been like, absolutely.
00:12:52.700 Absolutely. You're defending our autonomy and our maturity.
00:12:56.120 We can pick our own Halloween costume without, you know, the nanny state of Yale University telling us what to do.
00:13:03.380 And she got treated as if what she had actually said was everybody go out and wear blackface, which was absolutely not what she'd actually done.
00:13:09.420 If I may use an example that I know you and FIRE were strong on, the Klan rally at Notre Dame.
00:13:18.680 Oh, the Notre Dame versus the Klan.
00:13:20.580 Yeah, that's again a Woodrow Wilson poison.
00:13:22.940 But they they rally and the students decide that they're going to take on the Klan.
00:13:28.600 Yeah. Right. Yeah.
00:13:29.480 And tell me what the case is about.
00:13:31.840 The case is insane.
00:13:32.860 OK, so this is a case involving a guy who was working his way through school as a janitor.
00:13:39.960 So, like, not the man, but someone who was trying to educate himself on issues, particularly relating to race relations.
00:13:47.300 And he was reading a book called Notre Dame versus the Klan, about, I think, a 1926 march on Notre Dame. Because, you know, you have to remind people sometimes that the KKK was pretty broad in the people they hated.
00:14:01.140 So they also hated Catholics.
00:14:02.480 Right.
00:14:03.020 But in this case, the Catholics came out to fight them in the streets.
00:14:07.240 And this is a book celebrating the defeat of the Ku Klux Klan when they tried to march on Notre Dame in the 1920s.
00:14:13.140 It gleefully celebrates the fact that these students weren't going to take it.
00:14:17.300 And a student, a working-class student, literally, who was reading this book, was found guilty of racial harassment, because two employees apparently found the title Notre Dame versus the Klan, and the picture of a cross burning on the front of it, to be harassing somehow. Someone saw the cover and complained, literally judging a book by its cover.
00:14:43.900 Nobody asked him what the book was about.
00:14:45.960 Now, to be clear, even if it was, you know, Mein Kampf, even if it was offensive, you still have a right to read it.
00:14:52.040 But it's extremely ironic that they went after a book that was manifestly anti-Klan.
00:14:58.280 So we ended up having to take on IUPUI.
00:15:00.900 This is Indiana University-Purdue University Indianapolis.
00:15:03.740 That's a long name.
00:15:04.480 You know, on two different occasions to win this completely obvious case that you, you know, shouldn't judge a book by its cover.
00:15:12.020 I'm a self-educated guy.
00:15:34.680 I could only afford one semester.
00:15:39.100 I took it when I was 30.
00:15:41.940 And I was planning on going longer, but I got a divorce on my first day of college.
00:15:49.000 On your first day.
00:15:49.960 Yeah.
00:15:50.180 And so I was really struggling, prior to that, just trying to read as much as I could.
00:15:59.240 And, you know, it's, you know, Immanuel Kant is not the easiest to get through.
00:16:04.220 And, uh, as I'm going through this, the best professor is the one where half the class says, I swear he believes X, Y, and Z.
00:16:21.700 Uh-huh.
00:16:22.640 The second half of the class says, I swear he's on the other side.
00:16:26.600 Uh-huh.
00:16:26.940 You know what I mean?
00:16:27.480 Yeah.
00:16:27.820 That's, that's Alan Charles Kors.
00:16:29.160 I'm not saying it was actually literally him, but one of the founders of FIRE is this Enlightenment scholar named Alan Charles Kors.
00:16:36.120 And if you guys take the Great Courses, he teaches a lot of them on the Enlightenment.
00:16:40.840 And he teaches, I know he's not, you know, he doesn't agree with Blaise Pascal, who's a big defender of the existence of God.
00:16:50.180 And he teaches for an hour, this absolutely riveting best explanation of how brilliant Blaise Pascal was.
00:16:57.300 And it's like, it never occurred to you that he didn't 100% agree with him.
00:17:01.680 And the ability to do that is, is a lost art among some of these professors.
00:17:05.500 And that's what, when I, when I was going to college, I was reading Mein Kampf.
00:17:12.240 Uh-huh.
00:17:12.480 I want to know what's in Mein Kampf.
00:17:14.800 It bothers me that it's in my library.
00:17:17.240 Yeah.
00:17:17.660 You know what I mean?
00:17:18.300 But it's important.
00:17:19.540 I made myself read it, you know, partially to figure out how much they actually felt being censored made them stronger.
00:17:27.340 And that's a theme that does come up.
00:17:28.840 But what was surprising was finding out how much he was obsessed with syphilis.
00:17:32.340 You know, that just keeps on coming up.
00:17:34.260 And he really wanted, he really wanted to be allies with Britain.
00:17:38.160 There was all of this kind of weird little stuff, and of course it's sort of like backseat history, right?
00:17:43.740 Being kind of like, well, we really shouldn't have allied with the Austro-Hungarian Empire.
00:17:47.260 It's like, yeah, that's great genius.
00:17:49.060 Like, who didn't know that?
00:17:52.900 I read it because I wanted to know, did the German people know?
00:17:59.500 Yeah, of course they did.
00:18:00.580 Read the book.
00:18:01.420 Yeah.
00:18:01.660 Yes, they knew.
00:18:02.700 They absolutely knew.
00:18:04.480 Unless you just compartmentalized and went, no, he didn't really mean that, which probably a lot of people did.
00:18:10.380 Right.
00:18:11.340 But we're losing that ability.
00:18:12.920 We've taken the word safe.
00:18:17.800 I feel unsafe.
00:18:19.020 Oh, yeah.
00:18:19.380 No, you feel uncomfortable.
00:18:21.040 Right.
00:18:21.240 Very different thing.
00:18:22.220 Very different.
00:18:23.080 And actually it's kind of good to be uncomfortable sometimes.
00:18:24.720 Look at your story.
00:18:25.880 Yeah.
00:18:26.140 What got you to this theory?
00:18:28.520 Yeah.
00:18:28.920 Being very uncomfortable with your own thoughts.
00:18:32.200 Yeah, exactly.
00:18:33.520 Yeah.
00:18:33.840 And, you know, it's one of these things where it sounds hokey, but I recognize that there
00:18:38.080 were just a lot of things, these sort of, like, phobias, these limitations
00:18:43.920 that I kind of put on myself, that I never would have gotten over unless I actually totally
00:18:49.040 broke down.
00:18:49.580 And so I have to say, I learned so much from it.
00:18:52.960 I can't really say it was a bad thing.
00:18:55.720 But boy, was it uncomfortable.
00:18:58.280 You know what?
00:18:58.900 My favorite quote is, the truth will set you free, but it's really going to suck first.
00:19:07.820 You're just going to hate it.
00:19:09.680 If you're out of line with the truth, you're going to hate it because you have to go, am
00:19:15.400 I going to change my life?
00:19:17.140 I think, I mean, people perhaps, and I'm, I'm hoping that this isn't true, that people
00:19:26.840 don't think big thoughts because instinctively they know if I find this to be true and I'm
00:19:37.880 at a crossroads, I'm going to have to knowingly live a lie or it's going to cause me all kinds
00:19:46.360 of pain in my friends and my relationships and everything else.
00:19:51.040 Yeah.
00:19:51.560 Yeah.
00:19:52.000 It was interesting, that year that I got really depressed.
00:19:55.760 You know, I got to live the polarization.
00:19:58.400 I was, you know, I hang out in Shambhala Buddhist circles in Philadelphia.
00:20:04.100 I'm on the board of a theater company.
00:20:05.800 I used to write, you know, plays and short stories.
00:20:09.420 And I was head over heels for this girl at the time.
00:20:12.620 But she was really uncomfortable with what I did for a living.
00:20:15.160 Um, defend free speech on college campuses.
00:20:17.840 And I'd gotten used to that by 2007, um, because people
00:20:23.220 realized that oftentimes I was defending evangelicals or Republicans.
00:20:29.280 And at one point.
00:20:30.180 But isn't the point of speech.
00:20:32.220 No, exactly.
00:20:33.120 And I remember at one point, and this is where, you know, I knew that we were doomed,
00:20:37.220 but, um, I, I.
00:20:40.180 You're catastrophizing.
00:20:42.440 The relationship, you know, didn't make it.
00:20:44.140 Oh, you mean them.
00:20:45.220 Oh, no, no, no.
00:20:45.700 You do.
00:20:46.420 I thought you meant everybody else.
00:20:48.060 Okay.
00:20:48.640 That's a couple level, yeah.
00:20:49.300 Yeah.
00:20:49.480 Okay.
00:20:49.840 Um, I said, you know, I, I'm willing to defend the free speech rights of Nazis.
00:20:53.580 Um, I'm certainly willing to defend the free speech rights of Republicans.
00:20:56.300 And she actually said, I think, I think Republicans might be worse.
00:21:00.880 And I was like, um, so, and that was 2007.
00:21:05.160 And we've gotten much more polarized since then.
00:21:09.200 So what is, um, what are the factors that are playing in?
00:21:14.020 Why was there the change in 13, 14?
00:21:18.160 What happened? Why do you say in 2006 the students are still kind of balanced and
00:21:24.260 they're still fighting for free speech, and then all of a sudden, boom?
00:21:25.800 Why things got so much worse in 2013, 2014 is the
00:21:31.940 social science detective story of the whole book, because it really did seem to be overnight.
00:21:36.320 And John, my coauthor, really noticed it, and a bunch of columnists I'm friends with, everybody at FIRE, was like, what, did something just happen?
00:21:44.400 Um, what, what just happened on campus?
00:21:46.160 Because, and I always say, it's not like it was all that rosy prior to 2013, 2014, but those were administrators telling students that they had to follow ridiculous speech codes.
00:21:55.180 Suddenly the students were completely agreeing with the administrators.
00:21:58.040 So the whole book is trying to figure out what happened.
00:22:00.440 Now, the most powerful theory out there at the moment that does seem to have some explanatory power, um, is, uh, social media.
00:22:09.640 Um, this is the first generation that grew up with, you know, smartphones with social media in their pockets.
00:22:15.740 And this is what Jean Twenge, who is a researcher of differences between generations, calls iGen.
00:22:25.760 And she noticed that in all of the polling, people born 1995 and after have a lot of different characteristics than millennials.
00:22:34.360 So we always have to explain, this is not a book bashing millennials.
00:22:37.500 I actually think millennials get a little bit of a bad rap.
00:22:39.700 I think they really do.
00:22:40.580 Um, but iGen, when it comes to everything from anxiety to depression, even to the fact that there are a lot fewer moderates among iGen,
00:22:48.960 which is just a total reflection of the society we live in. You know, the way you can sort of surprise people by saying, do you know,
00:22:55.440 there are more conservatives among people born after 1995, and there are more liberals?
00:23:01.080 It's all at the expense of the moderates, who have been hollowed out.
00:23:04.940 So, um, we definitely found enough research to convince us that social media plays a major role.
00:23:12.420 In two important ways.
00:23:15.260 First, of course, polarization. We ground a lot of the book in some really well-established research on how easy it is to make even people who look alike really dislike each other, and how easy it is to give people a sense of us versus them.
00:23:32.360 It's amazing to me.
00:23:33.440 I hear this all the time.
00:23:35.000 I've always gotten along with my family.
00:23:36.700 I could always talk to my family, but I follow my family on Facebook and I can't talk to them anymore.
00:23:42.580 I mean, these are people you've been around your whole life.
00:23:46.740 That kind of says, maybe you shouldn't be reading Facebook, at least with your family and your friends.
00:23:52.780 Well, Facebook, both Facebook and Twitter really pat you on the back for having a really thick echo chamber.
00:23:58.260 Yes.
00:23:58.600 Um, they, they make it feel good.
00:24:00.500 And unfortunately, you know, as John and I talk a lot about, it really plugs into this kind of tribal natural programming that we have, where it just feels great to have this group, that it becomes sort of a quasi-religious experience.
00:24:15.440 Um, and so.
00:24:16.680 I think for conservatives, it's like when Rush Limbaugh came on, he was the first guy.
00:24:22.860 Yeah.
00:24:23.080 But then, you know, in 2008, a lot of conservatives felt like, is this just me?
00:24:28.680 Yeah.
00:24:28.840 I mean, we're all socialists now.
00:24:30.320 This isn't, what happened?
00:24:31.800 Right.
00:24:32.160 And so Facebook, it has helped people find tribes and go, okay, it's not me.
00:24:38.500 Right.
00:24:38.840 But it's also now convincing them that it's you.
00:24:42.540 Right.
00:24:43.060 Against them.
00:24:44.180 Yep.
00:24:44.340 And that's why we call this problems of progress.
00:24:46.020 We always want to be really clear about this because I'm, I'm painfully aware of the fact that my dad was born in 1926 in Yugoslavia.
00:24:52.720 His dad died when he was six.
00:24:54.440 My life is freaking cake by comparison.
00:24:57.280 Everything is easy by comparison.
00:24:59.460 And we call them problems of progress, partially because there were social scientists who were looking forward to the future and saying, you know, now that we're not as industrialized and people can have greater freedom of movement, we can increasingly live in communities that reflect our values.
00:25:14.620 And that sounds absolutely lovely.
00:25:16.720 And that's what we've done.
00:25:18.240 But it does have a dark side.
00:25:20.180 And that dark side is tribalism.
00:25:22.720 Wait, wait.
00:25:23.600 Does it have to, though?
00:25:25.080 For instance, I have, I love San Francisco.
00:25:27.980 I love San Francisco.
00:25:28.960 It's one of the great cities, but I'm not living in San Francisco.
00:25:31.420 It's hard to have a good argument in San Francisco, though, I can say, as somebody who used to live there.
00:25:35.380 There are parts of Texas.
00:25:37.200 If you're from San Francisco, I don't recommend you go have an argument, okay?
00:25:43.020 But why does that community in Texas have to be ruled by the same rules that are in San Francisco?
00:25:51.560 Why can't we say, you want to live like that?
00:25:53.700 Great, live that way.
00:25:54.960 Why can't we leave each other alone?
00:25:56.440 Why does the tribal nature have to be a warring nature?
00:26:00.300 It doesn't necessarily have to be, but we definitely don't value it, because that's the difficult first step: you have to value talking to people on the other side of the aisle, people who come from different places.
00:26:13.680 So I was a holy terror when I lived in San Francisco.
00:26:16.260 And I was hanging out with all the people.
00:26:17.600 I would go to Burning Man with them, you know, like, but when it came to actual political arguments, I was constantly frustrated when I lived there.
00:26:24.500 And people would talk about Middle America, usually, like, the stand-in for Middle America was Kansas.
00:26:30.840 And with real contempt, not all of them, but there would be some, occasionally, like, someone would just go off, you know, usually a white, fairly privileged dude would talk about how much he hates the people from Middle America.
00:26:41.260 And I'd just be standing there with, like, my mouth open, being kind of like, okay, my dad's from Yugoslavia.
00:26:47.340 Imagine someone saying, oh, those Croatians are all so ignorant and backwards.
00:26:52.580 Like, wouldn't that set off a little bit?
00:26:54.160 Because as a first-generation American, when people talk about different regions with that kind of contempt, I'm like, no, no, no, that's not cool.
00:27:00.380 Don't generalize like that.
00:27:02.100 But there was nobody pushing back on that.
00:27:04.560 And I think that's part of the problem of an echo chamber, is that it tends to push you all in one direction.
00:27:09.640 However, actually, this is fun to remember this.
00:27:13.040 You know all those experiments that they did, you know, from the 40s on up, where they would have a classroom where there'd be, you know, people saying, which line is longer?
00:27:24.380 And, like, most of the people say the shorter line is longer.
00:27:27.520 It's a setup.
00:27:28.240 And to see how people will conform.
00:27:30.560 And sadly, you know, a lot of people will.
00:27:33.580 Most people will conform.
00:27:34.940 They'll say, okay, I guess I was wrong.
00:27:37.300 The shorter line is actually the longer line.
00:27:39.640 But it only takes one person to call it out, to go, this is nonsense.
00:27:44.780 That's obviously the longer one. That breaks the spell.
00:27:47.640 So there is some good news in the research.
00:27:50.060 In history, isn't it always the person who says, I will not conform?
00:27:55.040 They're usually killed.
00:27:57.220 But isn't that the person in history that changes everything?
00:28:00.080 Absolutely.
00:28:00.480 And this is actually, when I talk about the First Amendment, when I talk about freedom of speech, the premise I begin with is, everybody, understand that free speech is not normal.
00:28:10.360 Our natural instincts are to burn, crucify, behead, make people drink hemlock.
00:28:16.900 These might sound familiar to some people.
00:28:19.700 That's the way, like, we have a history of treating dissenters.
00:28:24.620 The idea that you should actually listen to them?
00:28:27.140 That would have sounded crazy, you know, 500 years ago.
00:28:31.220 If you know anything about history, it's always been the people who were crazy.
00:28:36.840 Right.
00:28:37.320 That you went, no, they weren't crazy.
00:28:40.040 One of my favorite stories in American history is George Washington is dying.
00:28:44.840 He has pneumonia and his lungs are filling up.
00:28:48.500 He can't breathe.
00:28:50.620 And the usual doctor, the one that everybody loves, said, bleed him.
00:28:56.800 Bleed him.
00:28:57.780 A young doctor who is also standing in the room says, okay, I know this sounds crazy, but I don't think he can breathe.
00:29:05.780 And I've heard there's this new thing.
00:29:08.600 If I take a tube and I pop it right here, give him a trach, I think we can save him.
00:29:16.440 The older doctor said, are you out of your mind?
00:29:21.360 Yeah.
00:29:22.100 He was killed.
00:29:24.220 That kid was crazy.
00:29:26.040 No, he wasn't.
00:29:27.000 He was ahead of his time.
00:29:28.340 And even if those crazy people say things, there's usually a germ.
00:29:35.300 I mean, you know, there are some crazy people who are just crazy.
00:29:37.380 Just crazy, yeah.
00:29:38.140 Right.
00:29:38.400 But there might be a germ.
00:29:41.020 I don't understand why we can't.
00:29:43.720 I don't want to live in a nation of all artists.
00:29:46.580 Uh-huh.
00:29:47.220 We'll never get anything done.
00:29:49.140 Right.
00:29:49.420 It'll be horrible.
00:29:50.740 I don't want to live in a nation full of accountants.
00:29:53.380 Right.
00:29:53.880 There will be no art.
00:29:56.000 They need each other.
00:29:57.240 They really do.
00:29:59.200 What happened to the idea that we need each other?
00:30:04.500 I just don't see a lot of people valuing that at this point.
00:30:06.900 I think maybe we've gotten so close to the precipice right now, people are
00:30:10.280 starting to go, wait a second, this is not the way I want to live.
00:30:16.200 So far, the response to the book, you know, in which we, you know, slaughter a lot of
00:30:20.700 sacred cows, we've been pleased that we haven't yet been fully called heretics.
00:30:25.500 Yeah.
00:30:25.940 When I talked to Jonathan about The Righteous Mind, I was so excited to talk to him.
00:30:33.460 I read it and I said, this is such a great book.
00:30:35.500 I was like, this is part of the answer.
00:30:38.200 Yep.
00:30:38.560 You know, if we can get people to understand the language we're using, this is it.
00:30:43.680 And he bummed me out so badly.
00:30:47.700 He said, yeah, it's not going to work.
00:30:50.500 And I said, this is the answer.
00:30:52.680 And he said, you have to get so many people to do it and they're just not going to do it.
00:30:58.960 Do you think that's changing?
00:31:00.640 I hope so.
00:31:03.300 Because, you know, if something can't continue in a particular direction, it won't.
00:31:06.640 And we can end up in a really ugly place if we keep going in this direction.
00:31:12.320 And particularly, if you look at the sort of exaggerated polarization we have now, imagine
00:31:18.940 it 10 times worse.
00:31:20.180 And if some of the characteristics that we're seeing in iGen, you know, with the lack of
00:31:25.580 moderates, with some of these ideas that are, you know, essentially ideas of
00:31:31.840 individual fragility, but that end up being used almost like a weapon
00:31:36.360 that essentially, since everybody's so fragile, you may not believe the following things or
00:31:40.420 even speak the following things.
00:31:42.060 And on the other side, you've got, you know, a population of conservatives who've just, you
00:31:45.700 know, had enough and hate all of this stuff and are talking and getting angry
00:31:49.940 among their own people.
00:31:51.200 And what I've seen in the last couple of years is sort of like the echo chamber, you
00:31:55.020 know, on the left on campuses and the echo chamber that's a little bit more on the right
00:31:58.740 are starting to collide.
00:32:00.700 And we're just seeing the first sort of glimmers of what that looks like.
00:32:03.560 And it's not pretty.
00:32:05.100 I'm very concerned only because student of history, everything is a cycle.
00:32:10.480 To everything, there is a season.
00:32:12.020 Yep.
00:32:12.560 We've had a good economic season through all of this.
00:32:15.100 Yeah, which is great.
00:32:15.820 If we hit a serious downturn, forget even a downturn, we have Silicon Valley working toward
00:32:26.320 a 100% unemployment rate as a good thing.
00:32:30.440 Yep.
00:32:31.020 When that kicks in, if we are doing this, how do we pull it back?
00:32:37.900 What does the average person on either side of the aisle do? How do they? Let's
00:32:46.920 say you have a kid who's iGen, which is what age?
00:32:50.140 Born after 95?
00:32:51.320 That's born after 95.
00:32:53.100 I think it ends maybe around 2012.
00:32:55.180 Okay.
00:32:55.800 So if you have a kid who's iGen and they're in college and you're seeing this madness,
00:33:03.660 how do you reach them?
00:33:04.700 It's kind of funny.
00:33:07.040 Haidt is much more pessimistic about this.
00:33:09.020 Yeah, I know.
00:33:09.660 Too.
00:33:10.280 And it's kind of funny.
00:33:11.060 Like, we're good friends, you know, and we take turns sometimes.
00:33:14.720 Sometimes, some days I'm like, it's hopeless.
00:33:17.300 And he's more like, oh, don't catastrophize.
00:33:19.320 And sometimes I'm like, come on.
00:33:20.540 Like, right.
00:33:21.100 If I could wave a magic wand and have one thing, and actually maybe your listeners can
00:33:25.360 help with this.
00:33:26.240 Because I've been talking with the College Board and the National Constitution
00:33:29.500 Center.
00:33:31.100 Jeff Rosen, who's brilliant.
00:33:32.280 Um, and we all seem to be thinking the same thing.
00:33:35.780 If it could be a norm for every high school in the country, where you basically
00:33:40.820 have to do an Oxford-style debate, but with one rule: you have to take the opposite side
00:33:45.540 of what you actually believe.
00:33:46.540 That helps a lot because it's very easy.
00:33:49.500 If you're in an echo chamber, if everybody agrees with you, and some people just
00:33:53.740 agree with you more adamantly than others, um, it's very easy to think people who disagree
00:33:58.440 with you are either stupid or evil.
00:34:00.380 Um, and that's a very easy perception to come to.
00:34:02.900 But law school, you know, it's pretty good for this, because you see it in
00:34:06.680 yourself. You start seeing in yourself that, like, when moot court is assigned, you hear
00:34:11.460 what the case is, and your initial impression is like, oh yeah, totally.
00:34:15.820 Like that should totally go towards the plaintiff.
00:34:18.500 Um, but then you get assigned to the defendant and within like a couple of days of reading,
00:34:22.100 like, oh, it's totally the defendant.
00:34:23.160 And you realize how pliable, how convincible you are.
00:34:26.240 It really helps you understand, um, some of the tribalism, and to understand that generally
00:34:31.040 people aren't motivated by evil. Quick digression, but it'll make sense.
00:34:36.140 Um, one of the most frustrating things about the book, uh, has been that people sometimes
00:34:40.520 only respond to the title.
00:34:41.860 And I've done some radio shows where it's really clear that the host has only read the
00:34:47.080 title.
00:34:47.540 It's like, oh, so coddling.
00:34:49.000 And usually if they're a little bit more left on the spectrum, they think coddling's,
00:34:52.820 uh, you know, offensive.
00:34:54.320 But if they're more right, I've gotten a couple of people questioning the subtitle,
00:34:59.160 how good intentions and bad ideas.
00:35:00.620 And I was like, well, what are the good intentions?
00:35:02.220 You know, I can't believe you're saying that these, you know, these left radicals have good
00:35:06.040 intentions.
00:35:06.680 And what I just say is kind of like generally for movements in humankind, people don't stand
00:35:11.140 at the top of the mountain and say, in the name of evil, follow me.
00:35:16.700 But, but I will tell you, don't all of those people stand at the pinnacle and say, I know
00:35:23.260 I'm right.
00:35:23.840 Yes, exactly.
00:35:26.840 Certainty is the, certitude is the problem.
00:35:29.520 Certitude is the problem.
00:35:30.600 And it's one of these things where, I've heard people say that, you know, we're just really
00:35:34.040 tempted towards certainties, and all this kind of stuff.
00:35:37.100 And I think it's true, but it is really possible, if you work at it, to have that wonderful sense
00:35:42.420 where, like, looking at a gigantic library full of books becomes like looking at a night sky
00:35:46.840 full of stars.
00:35:47.620 You know, there's just wonder about the things you don't know.
00:35:50.860 Being certain about something is not bad as long as you say, but I am open to new information,
00:35:59.080 I am not absolutely positive, some new piece might come that I didn't know about that
00:36:05.840 might change everything.
00:36:06.920 Well, and I talk about free speech as being a natural consequence of the fact that individually
00:36:10.720 we're not all that clever.
00:36:13.180 Even the smartest of us, we need to consult with the best ideas, and occasionally
00:36:17.580 some of the worst ideas in human history.
00:36:20.460 I will tell you, you know, I obviously was not for Barack Obama, and
00:36:27.440 I get into it with people on the right, who are like, how could you possibly say this?
00:36:31.720 Barack Obama made me a better man.
00:36:33.840 He absolutely made me a better man.
00:36:36.600 I am glad in some ways that Barack Obama was there because he threw me up against the wall
00:36:45.720 and challenged what I thought I knew.
00:36:50.300 I had to, I learned about anti-colonialism.
00:36:55.280 I learned about the progressive era.
00:36:57.340 I learned about the constitution deeply.
00:37:01.780 I've learned so much.
00:37:03.960 Same with Donald Trump.
00:37:05.840 You know, we can either look at this as a bad experience or a good experience that you
00:37:09.920 learn from, you know, learn from it, learn from it.
00:37:14.320 But are we?
00:37:16.140 Yeah.
00:37:16.540 I mean, I have a very expansive, you know, view of freedom of speech where it comes down
00:37:21.380 very simply to, it's important to know what people really think, period.
00:37:27.120 And I say this and people are kind of taken aback, because a lot of the way people
00:37:32.500 try to challenge freedom of speech is by saying, well, what if they have terrible ideas?
00:37:35.740 It's like, do you think you're safer for not knowing those terrible ideas?
00:37:39.240 Do you think that, and also.
00:37:41.380 If my kids and I were living next door to somebody
00:37:44.320 who was a real racist.
00:37:45.580 I don't want him saying all the politically correct things.
00:37:48.460 I want him.
00:37:49.340 I want my son going over there and coming back, saying,
00:37:51.500 Dad, Dad, you know what he was just saying. Great.
00:37:54.700 We know who he is.
00:37:56.320 Don't go there anymore.
00:37:57.720 You know what I mean?
00:37:58.320 I talk about censorship as being, and this is a little blue, like taking Xanax for syphilis.
00:38:04.720 Where essentially you're just taking something that makes you feel better, but you're just
00:38:07.420 getting sicker by the minute.
00:38:08.980 And it takes, you know, it takes looking at things a little bit
00:38:13.340 more sometimes like an anthropologist.
00:38:15.100 Oh yeah.
00:38:15.440 So I went on Smerconish's show.
00:38:17.400 And I was there to talk about the disinvitation of Steve Bannon
00:38:23.400 from the New Yorker Festival.
00:38:25.940 And, you know, a lot of celebrities got up in arms that they were going to do an interview
00:38:29.960 with Steve Bannon at the New Yorker Festival of Ideas.
00:38:34.400 And I was there to-
00:38:35.380 Festival of what?
00:38:36.060 Of ideas.
00:38:36.780 Okay.
00:38:37.160 Yeah.
00:38:37.540 Ideas.
00:38:37.960 And I was there, you know, of course, with my First Amendment technical hat on, I'm like,
00:38:41.760 well, of course, the New Yorker can invite whomever it wants.
00:38:44.080 But with my marketplace of ideas, sort of like knowing what people really think hat on,
00:38:49.200 I was like, okay.
00:38:50.260 And then the responses I got on Twitter were the funniest.
00:38:53.560 People were like, so you're saying you would want to hear an interview with someone
00:38:56.080 from ISIS?
00:38:56.660 And I'm like, I would love to hear an interview.
00:38:59.400 Yes.
00:38:59.600 It would be one of the most interesting interviews you can imagine.
00:39:01.900 And, you know, you stare into the face of evil.
00:39:03.660 That's great.
00:39:04.620 And then the other stream that people were going for was, but now he's irrelevant.
00:39:08.180 And I'm like, he was arguably the second most powerful person in the White House, like
00:39:14.000 two weeks ago.
00:39:15.820 You're kidding yourself.
00:39:16.780 And now he's talking to all these groups in Europe.
00:39:18.700 So it is this, you know, we talk about this, me and Pamela Paresky, she was our chief
00:39:25.760 researcher for the book, and John, we talk about moral pollution a lot.
00:39:29.480 Basically just the idea that once you get super tribal, it becomes this much more kind of
00:39:35.220 superstitious idea that if I'm in the presence of, if I shake the hands of, if I'm anywhere
00:39:39.400 near, you know, the bad man, it's somehow going to rub off on you like
00:39:45.620 some kind of evil pox.
00:39:47.120 I think one of the most vile voices out there is Louis Farrakhan.
00:39:51.540 I'm glad I can hear exactly what Louis Farrakhan is saying.
00:39:55.160 You know, I don't want him silenced.
00:39:56.520 You know, you could invite him to your bar mitzvah, and you'd be like, oh my God.
00:39:59.660 Yeah.
00:40:00.360 I thought you were a great guy.
00:40:01.520 I had no idea.
00:40:02.460 Right.
00:40:03.460 Are you, how concerned, let me just take a quick offshoot here.
00:40:07.840 Yep.
00:40:08.420 How concerned are you about the growth of Google, with its algorithms now being taught to
00:40:20.300 recognize hate speech?
00:40:21.980 Yeah.
00:40:22.120 I mean, first, I don't believe in hate speech.
00:40:26.160 Right.
00:40:26.540 But what hate speech is to one person is not hate speech to the other person.
00:40:30.560 Right.
00:40:31.520 What, are you concerned about the loss?
00:40:34.180 I mean, the colossal overnight loss, what I would call a digital ghettoization.
00:40:40.120 Yeah.
00:40:41.300 Hate speech has always been kind of the boogeyman that you have to deal with when you're dealing
00:40:44.600 with free speech on campus.
00:40:45.800 And the first thing you have to explain is there's a whole generation of students who
00:40:50.520 largely believe that hate speech is unprotected speech.
00:40:54.740 They think it's a special category of unprotected speech.
00:40:57.520 And that's just not true.
00:40:58.820 It's too vague.
00:40:59.880 It's too broad.
00:41:00.500 It wouldn't fit any of the First Amendment analyses.
00:41:02.820 But then you have institutions like Google, you know, who I've always had a great deal
00:41:08.140 of respect for.
00:41:08.940 But then you look at cases like what happened to James Damore, you know, who wrote something
00:41:12.600 that was, you know, I think Haidt wrote about it saying it was, you know, it wasn't perfectly
00:41:18.220 right on everything, but it was also a...
00:41:20.820 Not dismissive.
00:41:21.420 It was a dispassionate, you know, argument of what the stats say about gender differences,
00:41:25.740 including preference.
00:41:26.940 For some reason, like the taboo around saying that men and women might actually be drawn
00:41:31.980 to different fields, it's like, is that really the end of the world?
00:41:35.420 But anyway, but yeah, the idea of a handful of institutions having so much power over what
00:41:42.780 we can read is what scares me.
00:41:47.300 And if they start actually policing hate speech, I get worried that the work that I do, where
00:41:53.020 we're, you know, and I always have to be clear, 99 out of 100 cases that we're dealing with
00:41:58.760 are more like the guy getting in trouble for reading a book, or for, you know, cracking
00:42:03.600 a joke, that anybody off campus would be like, I don't even understand what was offensive
00:42:09.540 about that, is going to get in trouble.
00:42:12.300 Meanwhile, though, I do have some sympathy for Google and for Facebook, because they're
00:42:17.140 being pushed towards this by some really idiotic laws coming out of the European Union.
00:42:22.500 Do you know about this whole right to be forgotten thing?
00:42:26.100 Right?
00:42:26.760 Right to be forgotten.
00:42:27.760 Forgotten?
00:42:28.300 Yeah.
00:42:28.900 Is this like transgender naming?
00:42:30.960 No, no, no.
00:42:31.760 Okay.
00:42:32.060 This is much, much worse than that.
00:42:34.400 To be forgotten.
00:42:35.280 One of the European courts issued a decision saying that individuals
00:42:43.200 have a right to be forgotten.
00:42:44.300 And there was a law passed that tried to make this the controlling law for the entire EU,
00:42:52.100 and that put it on Google.
00:42:55.320 If someone came to you and said, that article about me is old and irrelevant, so you have
00:43:00.100 to remove it or face a huge fine.
00:43:03.420 Yeah.
00:43:04.340 Face a huge fine unless Google, for some reason, decides to actually put up a fight to keep
00:43:08.820 it.
00:43:09.000 So it's like, it's all downside for Google.
00:43:12.660 Subsequent decisions say that it can't just be for Google Europe.
00:43:16.160 It has to be for Google for the entire world.
00:43:18.080 And it comes from this kind of ridiculous idea that, you know, like,
00:43:22.660 so what if I murdered someone 20 years ago?
00:43:26.300 I have a right to be forgotten.
00:43:29.040 And it's just so, it's among numerous dunderheaded laws that I see coming out of Europe that are
00:43:37.320 actually having spillover effects to the whole rest of the world.
00:43:40.140 So in some ways, you know, I am worried about the internal politics of Google, but I'm also
00:43:44.120 worried about how different, you know, governments are sort of taking advantage of every opportunity
00:43:49.680 to limit them.
00:43:51.020 That's the thing I love about our Constitution.
00:43:53.480 Yeah.
00:43:53.960 It doesn't, you don't have a right to be forgotten.
00:43:56.060 Yep.
00:43:56.720 You know, the 18th Amendment, is it the 18th that was prohibition?
00:44:01.260 No.
00:44:01.520 Yeah.
00:44:01.760 18 is still there.
00:44:03.280 Yep.
00:44:03.500 But 21st repeals it.
00:44:06.340 Yep.
00:44:06.520 But that scar is still there.
00:44:08.360 So you learn.
00:44:09.380 Yeah.
00:44:09.540 You know, perhaps you read it all and you go, hey, we did that once before.
00:44:14.120 Let me take, let's go through the three bad ideas.
00:44:23.880 Oh, sure, sure.
00:44:24.580 Yeah.
00:44:24.780 Okay.
00:44:25.220 So part of the idea of the book was to kind of recreate sort of what we did in
00:44:29.700 the original article.
00:44:31.480 And basically saying it's as if we are giving a generation of people, of kids, of younger
00:44:38.020 people, the worst possible advice you could ever imagine.
00:44:40.680 And so we create this scenario of going up to this, you know, supposedly wise
00:44:45.080 man.
00:44:46.500 And he tells us three pieces of what he thinks is wisdom.
00:44:51.240 What doesn't kill you makes you weaker.
00:44:52.760 Always trust your feelings.
00:44:56.000 And life is a battle between good people and evil people.
00:44:58.920 And we do this as kind of a joke in the beginning of it.
00:45:01.500 And we have me and John going, those are, like, the worst ideas we've ever
00:45:06.000 heard in our entire life.
00:45:07.620 And so the first one, what doesn't kill you makes you weaker is obviously a play on Nietzsche.
00:45:11.420 What doesn't kill you makes you stronger.
00:45:12.920 And of course we recognize it's like, yes, there are things that are short of killing
00:45:16.360 you that can still, you know, leave you in worse shape.
00:45:18.700 But it stands for a great truth. So we tried to make all the
00:45:24.700 great untruths things that are bad both in terms of what modern psychology
00:45:31.460 would tell you, and bad in terms of resilient ancient wisdom, which is surprisingly coherent
00:45:36.120 on a number of issues.
00:45:37.340 One of them is that people need challenge.
00:45:39.300 You're going to see that in practically every culture, that it would be absurd to say people
00:45:43.780 don't need challenge.
00:45:45.820 But what we see on campuses, which we dub safetyism, and also among parents these days, you know,
00:45:53.340 K through 12, is this idea that there's no limit to how safe you can be.
00:46:00.080 And they also expand that into that weird kind of definition of safety that means like emotionally
00:46:04.660 unperturbed.
00:46:05.940 So the concept creeps in two different directions, that there's no amount of physical safety that's
00:46:11.240 too much, or it comes with no bad side.
00:46:13.980 And by the way, let's add an emotional safety too.
00:46:16.780 And of course, you know, what we talk about in the book is Nassim Taleb's idea of antifragility.
00:46:21.880 Human beings, we're not fragile, and we're not merely resilient.
00:46:27.800 We're actually creatures that need stressors.
00:46:30.300 We need to be challenged. Without challenge we atrophy and die. With challenge we grow healthy and strong.
00:46:34.840 So, you know, it's probably best represented by, you know, astronauts. If you send them up
00:46:38.960 into zero gravity, their joints start decaying really quickly.
00:46:44.800 But on the other hand, you know, if you run every day, and you lift a little bit
00:46:48.240 of weight, it's amazing how much you can improve.
00:46:50.660 I think it's interesting.
00:46:52.440 They're doing studies now on what they think the effects will be of living
00:46:56.680 on Mars.
00:46:57.460 Yeah.
00:46:57.780 And they believe that after, I think it's 20 years of living on Mars, you actually
00:47:03.860 won't be able to mate with an Earthling, because you will no longer be technically what we call
00:47:12.140 human.
00:47:12.680 Wow.
00:47:13.240 So you're, so you actually you're changing.
00:47:15.980 Uh huh.
00:47:16.300 And I think it's interesting that part of being human is having the pull and the drag on you.
00:47:26.180 Absolutely.
00:47:27.480 And so what we see with this obsession with safety is that there wasn't really meaningful pushback saying, listen, we can take this too far.
00:47:34.480 It can actually be harmful.
00:47:36.260 But of course it can be harmful.
00:47:38.240 It's just the same way we tell people: you don't overcome phobias by bubble-wrapping the world against your phobia.
00:47:45.260 Right.
00:47:45.740 So that's great untruth number one.
00:47:47.420 The second one I actually like, because it sounds so darn romantic, which is...
00:47:52.620 Follow your feelings, Luke.
00:47:53.760 Your feelings are always right.
00:47:54.860 And a lot of, not every, but a lot of the movies and sci-fi and stuff that I love does have this idea that your feelings are always right.
00:48:05.520 And in one sense, it is correct to say that your feelings are always telling you something.
00:48:11.480 It's just not always what you think it is.
00:48:13.340 Susan David has this great quote. You run into that, where it used to take you paragraphs, a whole book, to explain something, and someone gets it down to a pithy phrase.
00:48:24.600 She says, feelings are information, not directions.
00:48:29.340 Why you're angry, why you're jealous, why you're guilty: without interrogating those things, we could be way off base on where they're actually coming from and what they're trying to tell us.
00:48:40.000 So have you ever read Gavin de Becker, The Gift of Fear?
00:48:43.240 No.
00:48:43.700 You should.
00:48:44.300 He's one of the best protectors in the world.
00:48:48.160 And his book The Gift of Fear starts out with how everybody always says, when there's a serial killer: I thought something was weird, but I dismissed it.
00:49:00.300 But my dog, every time that guy came by, my dog went crazy.
00:49:04.900 And he said the difference is this: dogs and people both have a gift, and it's the gift of fear.
00:49:12.320 Yep.
00:49:12.980 Dogs don't analyze it and then rationalize it away.
00:49:17.560 You have to examine it, because the dog's not always right.
00:49:20.980 Right.
00:49:21.380 You know what I mean?
00:49:22.200 You just might smell like someone they didn't like previously.
00:49:25.320 Correct.
00:49:25.460 Well, a book I always like to recommend is called The Upside of Your Dark Side, which talks about how all these quote-unquote negative emotions can actually have value, how we have a built-in system, for example, for defending yourself when you've been wronged.
00:49:41.180 All of this stuff is so heretical now.
00:49:45.300 This is the stuff that was ready to be deleted from Kindle or burned.
00:49:48.760 Yep.
00:49:48.980 Yeah, exactly.
00:49:49.760 So, The Upside of Your Dark Side: I really have to recommend it.
00:49:52.060 But the emotional reasoning one is really dear to my heart, for obvious reasons, because overcoming depression and anxiety is partially talking back to your feelings.
00:50:00.520 It's going, okay, I know I'm terrified right now, but guess what?
00:50:04.580 Nothing's actually happening right now.
00:50:06.860 And the amazing thing about CBT... people sometimes really get hung up on the fact that there's a T at the end of that, and it's therapy.
00:50:13.840 And it's like, aren't you recommending the therapeutic state that got us into this?
00:50:17.480 And I always have to say, if you think about what CBT is actually saying, it's basically applied stoicism.
00:50:24.760 It's in line with ancient philosophy.
00:50:27.540 It's in line with Buddhism; the practice of seeing that you are not your thoughts is sometimes like a distillation of Buddhism.
00:50:37.960 But unfortunately, on campuses, it's as if we're saying: if you're ever offended, we have to do something about that.
00:50:44.820 Don't examine whether you should be offended.
00:50:47.980 Don't examine whether it's rational.
00:50:49.880 Don't examine any of that stuff.
00:50:51.240 Being offended is enough.
00:50:53.260 And that's a really dangerous behavior to cultivate, because you end up in a situation where people can really convince themselves that the entire world needs to be silenced.
00:51:04.480 You know, you said you are not your thoughts.
00:51:08.800 I believe you are your thoughts.
00:51:10.500 And we're teaching people not to examine their thoughts, and to just be comfortable living in their fear.
00:51:21.420 Aren't we?
00:51:22.020 Well, the "you are not your thoughts" idea: this is one of the fun things about meditation. And I'm terrible at meditating, by the way.
00:51:27.780 And I occasionally have people saying, but that must mean you're a great Buddhist.
00:51:30.700 I'm like, no, no, I really need it.
00:51:32.060 I'm not good at it.
00:51:33.340 But after, you know, a weekend, you kind of reach a point where you can see your thoughts sort of bubbling up, and you don't necessarily have to do anything about them.
00:51:41.880 Correct.
00:51:42.180 You can just watch them.
00:51:43.320 You write in the book: your worst enemy cannot harm you as much as your own thoughts, unguarded.
00:51:48.820 Yeah, exactly.
00:51:49.500 Right.
00:51:49.720 So it's just guarding your thoughts.
00:51:51.420 Yep.
00:51:52.000 Examining them.
00:51:52.600 And for me, when I did the National Constitution Center, I got into this funny argument with Jeff Rosen that was actually kind of awesome.
00:52:00.540 We were talking about, is it okay to have bad thoughts?
00:52:05.340 And he talks about how a lot of Buddhism is about moving toward right thinking.
00:52:09.680 And meanwhile, I'm more of the school that you can think whatever you want, as long as you don't assume your thoughts are telling you what you need to do.
00:52:16.800 But, you know, then again, I used to write science fiction.
00:52:21.100 Okay.
00:52:21.760 So that's great untruth number two.
00:52:24.760 Yeah.
00:52:25.020 Number three.
00:52:25.700 Number three, life is a battle between good people and evil people.
00:52:30.620 And that one is the great untruth of us versus them.
00:52:34.360 Now, of course, I do occasionally get the question, are you saying human evil doesn't exist?
00:52:39.060 And I say, I absolutely believe human evil exists.
00:52:42.480 And I think the best definition anyone has come up with is M. Scott Peck's, in a book called People of the Lie, where he basically says that human evil on an individual level is people who are sociopaths and who also get joy from hurting people.
00:53:00.240 Would you, because I know you write about him, would you put Foucault in that category?
00:53:08.540 I don't know enough about him.
00:53:10.500 I know he didn't really practice what he preached, so to speak, but I don't know much about his personality.
00:53:16.320 I'm just thinking about this. I think postmodernism, the way he described it when he came here and started using it after the Paris riots, was with the intent to destroy the Enlightenment, everything that came with the Enlightenment.
00:53:39.560 To me, any time somebody is doing something covertly that is trying to destroy... because I can't find a good reason for postmodernism and postmodern thought when the goal is: no Enlightenment, no science, no empirical data, that's all bad.
00:54:06.400 Yeah.
00:54:06.560 There is no truth.
00:54:07.640 I can't find a good human reason for that.
00:54:14.340 You know, what is that building toward?
00:54:16.260 It's interesting, because I've known people who are self-described existentialists, or even nihilists, who are perfectly fine.
00:54:24.800 For them, somehow, it's just a sort of fun game that they play in their head.
00:54:29.820 That's different than setting out to... when he arrived.
00:54:34.600 When he arrived and he brought this into the university system, the story is that they were on the tarmac in Boston, and one looked at the other and said, you know, what we're doing is we're planting a virus in this culture.
00:54:50.520 Interesting.
00:54:51.240 That's the kind of bad I mean.
00:54:53.520 Yeah.
00:54:54.240 Well, I do think there are dangerous ideas, but all we're really saying is the relatively old-fashioned notion that, for the most part, people are a combination of some good motivation and some bad motivation.
00:55:06.860 Some people have better control over their impulses than others.
00:55:10.380 And if your first assumption is that anyone on the other side of the fence from you is evil...
00:55:15.380 Yes.
00:55:15.760 ...you're doing it wrong.
00:55:17.520 You know, when I left Fox... you can't be hated by half the country and not go, gosh, am I that?
00:55:27.400 Yeah.
00:55:27.700 You know, what part of that am I?
00:55:29.860 What's true?
00:55:30.700 What's not?
00:55:32.400 And one of the first things I did was I tried to ban the word evil from my lexicon, because that's a pretty intense word.
00:55:43.420 Sure.
00:55:43.520 And in trying to talk to people, let's say on the right or the left, let me just use the right: talking to people on the right and saying, no, Democrats don't want to destroy America.
00:56:01.040 People will in their head see, well, this guy does, that guy does. But that guy and that guy are not all Democrats, you know, and we are so labeling.
00:56:13.220 And once you say, oh, the Democrats want to destroy America, or the conservatives want to destroy this group... that's evil.
00:56:21.940 Yeah.
00:56:22.360 Well, and the problem is, of course, it feeds into group dynamics.
00:56:25.720 That's another thing M. Scott Peck talks about: when you really want to see some of the worst things humankind has ever done, it's in situations where people have their war hats on and there's diffuse authority, where essentially nobody's really taking responsibility for any individual thing.
00:56:41.400 But part of the problem is that it becomes almost a self-fulfilling prophecy.
00:56:46.720 And I like to blow conservatives' minds by saying this part: you should understand that there are people I'm friends with in San Francisco who, when they go on anti-Obama rants, think he's a neocon. And I'm not kidding.
00:57:01.260 They think he's essentially right of center, basically a right-winger. And it's like, yeah, I actually know these people.
00:57:09.920 But unfortunately, the more we get our war hats on and the less actual exposure we have to people from the other side of the fence, the easier it becomes to make them into cartoons, into people who have nothing useful or productive to say.
00:57:25.400 And I almost feel like I have to apologize for this.
00:57:29.320 I thought of myself as being, compared to a lot of my classmates, open-minded when it came to heretical ideas on campus.
00:57:40.220 But when I was in law school, it's Stanford, it's the Bay Area, labeling someone as a conservative thinker was a way of saying, oh, well, I didn't realize that I shouldn't be reading Edmund Burke or Thomas Sowell or Camille Paglia.
00:57:54.940 And then of course I finally did.
00:57:56.200 And I was like, oh, really?
00:57:58.220 These are the ideas you're trying to protect me from?
00:58:02.360 And then I realized that thoughtful people all over the spectrum were able to talk about what was valuable in Burke, and certainly in Sowell, I mean, absolutely an amazing thinker.
00:58:13.540 And that's part of the first protection of what I call the perfect rhetorical fortress: we're spending all of this cognitive energy on college campuses trying to figure out reasons why you don't have to listen to somebody.
00:58:30.060 And defense number one is, well, you're a conservative, so I don't have to listen to you.
00:58:33.900 Done.
00:58:34.820 Done with 50% of the population.
00:58:36.680 But as you get in deeper, a lot of it is privilege theory. And of course, privilege is a real thing.
00:58:42.120 There are comparatively privileged people.
00:58:43.540 There's no question about that.
00:58:45.260 Americans.
00:58:46.020 All Americans.
00:58:46.860 For example, certainly compared to...
00:58:49.360 The rest of the world.
00:58:50.140 Where my dad grew up.
00:58:50.760 Yeah.
00:58:51.220 For example. But when you make it really draconian, really about what race you are and what your background is, and you follow the privilege hole all the way down to the bottom, it applies to 100% of the entire population.
00:59:07.860 But here's the trick.
00:59:09.260 You don't have to call privilege on someone if you don't feel like it.
00:59:12.220 So you now have an intellectual tactic that gives you multiple levels of defense against having to listen to anybody you disagree with.
00:59:22.040 We've come up with this perfect fortress.
00:59:23.480 So you never have to listen to anyone you disagree with, but you still have the option of listening to everybody you do agree with.
00:59:28.660 And it's so pointless.
00:59:30.680 So it's one of these things: watch the way people argue on Twitter. And people on the right do this too.
00:59:35.520 They're like, why should I listen to some libtard from Massachusetts?
00:59:39.760 But on the left, the tactics are: well, first I'm going to call you out for being a white male heterosexual.
00:59:47.580 And it's like, well, actually, I'm gay. And the next one: but you're a conservative, so I don't have to listen to you. And you can go down and down and down.
00:59:55.780 It's like, wow, there are like 50 levels of defense before you ever actually even get to the argument.
01:00:01.460 And as far as actual cultural fixes go, it's just another ad hominem.
01:00:06.460 It's just another way to basically say, I don't need to actually address what you're actually saying.
01:00:10.200 And it's just not productive.
01:00:12.880 It just leaves us nowhere.
01:00:14.640 All right.
01:00:15.100 So let's dismantle all three of these.
01:00:16.940 Give me the cures, or the steps we should all be taking, with all three.
01:00:22.520 Let's start with bad idea number one: what doesn't kill you makes you weaker.
01:00:29.460 Do you want the deep ones or the easy ones?
01:00:32.400 Surprise me.
01:00:33.360 I'd like a little of both.
01:00:34.620 You know, it's one of these things where I don't want to get too bleak, because when I talk about what I do on campus, which is defend freedom of speech, a lot of what I hear from conservatives is, oh, it's lost forever.
01:00:48.060 The academy is gone.
01:00:49.340 People will never have free speech there again.
01:00:52.240 And I'm like, but have we even tried, you know, giving lectures about freedom of speech, ever?
01:00:57.240 I think the biggest problem is we are a culture that is teaching everyone: you're wounded and there's no recovery.
01:01:08.020 The second thing we're teaching people is: you should not talk to others.
01:01:15.440 And the problem is... I had a train of thought here, but I lost it.
01:01:23.840 I think the problem is we're running out of time, and if we don't get these things fixed pretty quickly... it is pretty pessimistic, isn't it?
01:01:38.480 Yeah, no.
01:01:39.220 And that's where, on our bad days, John and I are both like, how are we going to fix this?
01:01:45.060 But for younger kids, like I said, I have two kids under three. Delightfully, some of the things that could be the best are the things that kids enjoy the most.
01:01:54.460 We have a whole chapter on play, on teaching people about their own antifragility.
01:01:59.880 Let kids play, and let them play in a way that adults aren't actually running.
01:02:04.980 Amen.
01:02:05.840 My wife and I were in the car after I read this, and I said, you know, kids have to ride their bikes, and they need to get out of the gated community we live in.
01:02:15.380 She said, oh my gosh.
01:02:18.140 And I said, no, honey, between this and a lot of Steven Pinker, it's not that bad.
01:02:24.460 In fact, it's really good. But there are two problems.
01:02:28.220 The adult thinks: yeah, but if it happens, then I'm a bad parent.
01:02:34.400 And the stats don't matter to people.
01:02:40.660 Yeah.
01:02:41.120 And that's something we talk about; we try to show compassion for everybody in this book. We bend over backwards to do that.
01:02:47.980 So I try to figure out why parents who are living in the safest age probably in human history, almost certainly in human history, are acting like it's the height of the crack epidemic in New York City in terms of murder.
01:03:02.800 But then I remember: when I was a kid, I started college in 1992, and for my entire life it had been a safe bet that the country was going to be more dangerous in terms of murder rate the next year.
01:03:24.000 And so a lot of people who are my age, or even close to my age, who are having kids did grow up in a situation where, projecting forward, it seemed like it would only get worse. I used to write dystopian science fiction about how in 2000 everybody would have to arm themselves with multiple machine guns, because it was just that dangerous.
01:03:39.760 But then, amazingly, everything started getting a lot safer, and we still don't know entirely why, which is amazing by itself.
01:03:46.000 So now we live with this major disconnect: affluent parents think their kids are going to be kidnapped at any step, and statistically speaking, that's just extraordinarily unlikely.
01:03:58.320 Have you seen the movie Taken?
01:04:01.040 But then you get such a cool opportunity to get Liam Neeson. He has amazing skills.
01:04:06.280 But yeah, so one problem it does lead to is finding other parents who are willing, so your kids can have someone to play with.
01:04:15.380 But now I think there's some energy to do this.
01:04:17.380 So in my neighborhood, I'm going to be talking to other parents about having a free-range kids group at the park that's right next to us, where our kids are able to get together and they have permission to go play.
01:04:28.340 They have cell phones, for goodness sakes, now; if they actually get in trouble, they can call.
01:04:34.420 So play is a big part of it.
01:04:35.660 Probably one of the simplest fixes is to petition your local public school to have the playground open for the hour before and for two hours after school; your kids are going to want to hang out and play with their friends if they're allowed to.
01:04:50.480 We've also come around to gap years: taking a gap year between high school and college.
01:05:00.660 Once again, if I could wave my magic wand: if you live in New York City, you go work in Arizona, in a real job, or basically you go away somewhere.
01:05:09.880 And the way that could happen relatively easily is if colleges showed that they really valued it, that you would get extra attention if you had some real-life experience before going to school.
01:05:21.660 And by the way, the research there is really strong.
01:05:23.840 When I went to law school, it was shocking, well, not really shocking if you think about it, but really dramatic: the difference between the students who had just come right out of college and the people who had had jobs beforehand.
01:05:35.860 And overwhelmingly, the people who had jobs or had other lives before they went to law school got better grades and had better attendance.
01:05:42.780 I went at 30, and I could not believe the other kids in class. I'm with underclassmen, and they didn't care.
01:05:53.220 Yeah.
01:05:53.460 And I was like, this is amazing.
01:05:54.660 Yeah.
01:05:54.800 It was almost one-on-one with me and the professor, because they didn't care.
01:05:58.400 They just wanted to get through.
01:05:59.780 Yep.
01:06:00.040 I wanted the information.
01:06:01.800 Yeah.
01:06:02.140 So gap year is definitely part of it.
01:06:03.700 When it comes to colleges, the biggest enemy in this is the idea that there's nothing that can be done.
01:06:09.760 There's so much that can be done, because a lot of your listeners will send their little check to their alma mater, or to where they want their daughter or son to go, and never ask them: do you have a speech code?
01:06:26.040 Do you teach anything about freedom of speech in the orientation?
01:06:29.160 Practically no schools do.
01:06:30.380 And it's a sophisticated concept; it's something you really have to understand.
01:06:34.500 And like I said, through formal debate, you can actually practice it.
01:06:38.120 And that's how it really becomes a life skill.
01:06:40.100 You should get rid of your speech codes.
01:06:41.400 You should have classes on this stuff.
01:06:43.140 I'm always thinking about high-rigor, low-cost ways to signal to employers that they're dealing with an autodidact, with someone who actually really likes to study for, I don't know, something goofy, like the love of ideas.
01:06:55.180 There are a lot of things we can do.
01:06:57.540 And we have a website, thecoddling.com.
01:07:00.040 And we want more suggestions too, because we can't give up, given we've tried so little so far.
01:07:25.960 Michael Rectenwald.
01:07:26.700 You know who Michael Rectenwald is?
01:07:27.860 I know the name.
01:07:28.440 An NYU professor.
01:07:30.680 Uh-huh.
01:07:30.960 He got in trouble.
01:07:32.140 Oh, yeah.
01:07:33.640 I just talked to him a few weeks ago, and he said, I wouldn't send my kids to school.
01:07:37.220 I just wouldn't.
01:07:39.760 And you know Mike Rowe.
01:07:42.500 Yeah.
01:07:42.760 Who believes, you know, college is not for everybody.
01:07:46.000 Yeah.
01:07:47.300 Where do you stand on college, especially with an outlook toward the future?
01:07:51.120 Yeah.
01:07:51.460 With Google and Apple and everybody saying, I don't care about your diploma anymore.
01:07:56.660 Yeah.
01:07:56.860 Show me what you've done.
01:07:57.980 Well, I have a lot of thoughts on that.
01:08:00.380 I think about it all the time.
01:08:01.660 There's this interesting idea that Jane McGonigal has called Edublocks, where...
01:08:07.260 Edublocks?
01:08:07.820 Edublocks.
01:08:08.260 It's basically a little blockchain thing that goes on your ledger, on something you carry with you that's your account, more or less.
01:08:15.020 Wow.
01:08:15.480 So that, if you wanted to, you can show somebody every little class you took on something.
01:08:21.260 And as soon as I heard this idea, I realized something for FIRE, where I work, because the great thing about FIRE is we are actually people all over the spectrum who believe in freedom of speech.
01:08:32.080 We practice what we preach.
01:08:33.880 I'm more of an old-fashioned ACLU liberal; my executive director is a Christian Republican.
01:08:40.640 And I love that.
01:08:41.900 We have arguments, we come from different religious backgrounds, and it's just absolutely phenomenal.
01:08:47.480 But when we're interviewing people, the one way we try to figure out if you're one of us is we want to know if you're a free speech nerd.
01:08:53.200 We want to know if you read philosophy in your spare time, or you read about Louis Brandeis, or the Alien and Sedition Acts.
01:09:02.660 And if, rather than knowing that you went to fancy school A, I could see that on your own time you did 10 great courses on law and read 50 books about Supreme Court justices, then I'd realize you're really one of us.
01:09:16.160 And that could be a really low-cost way of signaling.
01:09:19.040 Now, to be realistic, the Princetons and the Stanfords and the Harvards and the Yales aren't going anywhere.
01:09:24.900 They're international brands.
01:09:27.980 But it's still kind of criminal that they're able to charge $70,000 a year.
01:09:32.000 And I think people should really be revolting against that.
01:09:34.980 Think of the amount of debt we put a generation into.
01:09:37.760 It's horrible.
01:09:38.580 And it's holding back innovation, I'm sure.
01:09:41.380 And it's just a crummy thing to do, you know.
01:09:43.760 And then, of course, for the kids who can afford all that, it gives them a huge leg up.
01:09:47.820 It feeds into all sorts of ills.
01:09:50.220 So I think some of the mid-tier colleges have to really rethink their entire model: low-cost, high-rigor things that people can do.
01:09:59.960 I was even thinking about a system where you kind of trick people into it: if someone wants to take an online class so they can knock out some credits before they go to college, at the end of it, it's like, by the way, you got a super high pass.
01:10:13.060 Do you want to go on to the next level?
01:10:14.180 And the final level of which would be a free, face-to-face year of college.
01:10:19.200 But it would be like a super international competition.
01:10:23.660 We've really got to rethink some of these things and make sure that they achieve what they need to.
01:10:26.980 They have to achieve a couple of things.
01:10:28.140 They have to say that someone is hardworking and smart, but also show that they can work as a team. And we can do that in a much less expensive way.
01:10:41.120 Because right now, what we're doing is looking at people from these fancy schools.
01:10:44.060 And really, the only hurdle for a Harvard is: well, you've got some high-IQ people who are hard workers, so just don't ruin them.
01:10:52.820 As long as they're not that much worse off when they come out, they're still probably going to be relatively good hires.
01:10:58.340 And that's incredibly inefficient.
01:11:00.720 Meanwhile, for us, one thing that has really been amazing is we've gotten a lot of great students from Indiana University, for example.
01:11:07.260 But all of them were ones who, by the time they were 20, had written great pieces on freedom of speech.
01:11:12.620 And that is a much better signal to me of what kind of person you are.
01:11:16.740 So we've really got to be more creative in the way we think.
01:11:19.600 All right.
01:11:19.840 Take me to feelings.
01:11:21.400 Uh-huh.
01:11:21.880 Tell me what we should be doing on feelings.
01:11:24.400 Well, in some ways we should be taking them both more seriously and less seriously.
01:11:28.380 And by more seriously, I mean, we want to be really clear here:
01:11:31.820 there is a mental health crisis going on right now on campus.
01:11:35.380 Is it connected?
01:11:37.260 We think it is.
01:11:38.100 We think that essentially, and not to be too dismissive of it, we're teaching a generation the habits of anxious and depressed people.
01:11:47.200 Yes.
01:11:47.440 So we shouldn't be shocked that they're anxious and depressed.
01:11:49.360 Yes.
01:11:49.820 So we've got to rethink the way we parent, and all sorts of stuff.
01:11:52.280 But for the kids who are already there, here's the worst thing we discovered.
01:11:56.400 The single worst thing we discovered was that suicide is going up for the first time in decades, and dramatically.
01:12:02.780 If you take the overall average, suicide for boys since the first decade of this millennium has gone up by 25%, which is an absolute disaster.
01:12:13.080 For girls, it's gone up 70%, which is an absolute calamity.
01:12:20.900 And if you take the lowest point, the year 2007, so if you go back almost exactly 10 years, it's doubled since then.
01:12:31.440 So there is a real, serious mental health problem going on here, partially, I think, because we're disempowering students and teaching them all these dysfunctional habits.
01:12:39.620 But once they're already there, this idea that, you know, I'll just give them a trigger warning?
01:12:45.640 No, that means you're actually not taking it seriously enough.
01:12:49.400 We have to make sure that there are apps that can get you in touch with serious psychologists.
01:12:53.100 They need to know about the existence of resources. But my preferred form of intervention for anxiety and depression is CBT.
01:13:02.980 And like I said, people can get over the therapy part of it, because if you look at what it teaches you, the amazing thing is it teaches you how to argue fairly with yourself.
01:13:13.800 And it turns out that arguing fairly, not everything's swell, not rose-colored glasses, just being reasonable, can make you less depressed and anxious.
01:13:22.480 But the wonderful implication of this, too, is that you can direct it outward as well: before I open my mouth, am I overgeneralizing?
01:13:30.540 Am I labeling?
01:13:31.680 Am I catastrophizing?
01:13:33.220 Am I engaging in binary thinking?
01:13:34.640 If people could learn that on the inside, we'd have a better mental health situation.
01:13:39.220 If we could learn to do that outwardly facing, we'd have a much better political situation.
01:13:43.800 I want to be really careful here, because I want to make sure people know this: I've had suicide in my family, and I've been clinically depressed too.
01:13:55.500 This therapy doesn't always get you there.
01:13:57.200 There is a stage where medicine is critical.
01:14:02.000 You're absolutely right.
01:14:02.780 Yeah.
01:14:02.960 And that's absolutely worth saying.
01:14:04.420 And we do say it in the book; we made a point of saying that.
01:14:07.500 And I have to say, it gets me a little choked up. I had another friend kill himself at one point.
01:14:15.680 And I was walking down the street with one of my best friends, one of my groomsmen.
01:14:19.240 And he talked about how selfish it was for that friend to kill himself.
01:14:24.260 And even though this is one of my best friends in the world, he didn't know I had been hospitalized as a danger to myself.
01:14:29.160 And I had to say, listen, I wasn't in my right mind.
01:14:33.260 You can't blame people in those circumstances.
01:14:35.060 And there is a point where what I needed was supervision.
01:14:38.660 I needed my family.
01:14:40.020 I needed medication.
01:14:43.080 And you have to make sure that those resources are available, because after a certain point...
01:14:48.040 It becomes logical.
01:14:50.700 Yeah.
01:14:51.360 It becomes logical.
01:14:52.620 I talk about this in the book, and the messed up, I mean darkly funny, thing is that someone criticized an early draft saying, doesn't Greg know something about depression?
01:15:04.680 Like, this is so cold, the way you're talking about it.
01:15:07.120 I'm like, okay, I'll write what actually happened to me.
01:15:09.500 And I convinced myself, which is sometimes a habit that I have from fiction writing: this is just between me and you, computer.
01:15:17.360 And I realized I'd put down things I'd never told literally anybody.
01:15:23.340 My wife had never heard them.
01:15:24.720 I'd never actually said them out loud.
01:15:26.460 That's so funny.
01:15:27.460 You've done that with the microphone.
01:15:30.260 You've done it with the computer.
01:15:31.600 I've done it with the microphone.
01:15:32.820 That is just between us, right?
01:15:34.040 Yeah.
01:15:34.320 It's so crazy.
01:15:35.340 And the messed up thing was, I did have little flickers of sanity during the time when I was really trying to do it.
01:15:41.380 But the yelling back in my brain was: no, you have to do this now, before you feel better, because this is the true thing.
01:15:48.620 Like, basically, if you continue to live, you're living a lie. And you convince yourself of it. And people say it's selfish.
01:15:56.980 No, no, no.
01:15:57.660 Oh no.
01:15:58.120 Not at that point.
01:15:59.140 You think you are doing everyone else a great service.
01:16:02.980 I actually was so messed up at one point, I thought I could actually ask my sister for help. I hope she doesn't hear this, because it would probably make her cry.
01:16:10.720 And I mean my sister, who loves me very much.
01:16:15.200 And of course that's completely insane to think that.
01:16:18.560 But you are insane.
01:16:19.320 But you are, temporarily.
01:16:20.420 So, yeah.
01:16:20.780 So I'm glad you brought it up, because we do talk about suicide prevention, when you reach a certain point, and being on the lookout for your friends when they get like that.
01:16:27.280 But that brings me to a messed up case here.
01:16:30.100 Yes.
01:16:30.640 Because not all of this stuff is all that ideological.
01:16:33.000 Some of the stuff we see on campuses is lawyers trying to protect the bottom line of their universities. It's one of the scariest things I've ever read.
01:16:40.120 Northern Michigan University had a policy that if you went to the counseling center, you would get a scary letter from the disciplinary dean saying you will be brought up on charges if you talk to anybody about your thoughts of self-harm.
01:16:58.200 Besides us.
01:16:58.740 Yeah, besides us.
01:17:00.320 And we first found out about this from someone who went in; she had been sexually assaulted.
01:17:05.260 She was just going there to talk about that.
01:17:06.800 She didn't say anything about self-harm, but nonetheless she gets this scary letter.
01:17:11.000 And with my personal experience with it, like everybody else who knows anything about it, I'm like, are you telling me that you told people who were in some cases kind of depressed that, one, they should isolate themselves, and two, that they're a burden on their friends?
01:17:25.280 And there were some quotes that sounded exactly like that.
01:17:27.660 And that comes from one of the motivations that can also be somewhat more easily fixed.
01:17:33.060 Universities overreact to the threat of lawsuits.
01:17:37.880 So in this case, they had this misconception that if they did that, it would protect them from lawsuits over suicide.
01:17:43.480 It's bad science.
01:17:44.740 It's bad law.
01:17:45.560 It was amazing, and so cruel, that they thought that.
01:17:51.380 But there's also federal regulation: making sure it's clear and makes sense, because some of the motivating factors in this stuff aren't ideological at all.
01:17:58.660 It's university administrators thinking that they're somehow protecting the institution, and they don't really care who they hurt.
01:18:04.880 Try the last one.
01:18:06.020 Sure.
01:18:06.360 What do we do?
01:18:07.020 Good and evil.
01:18:07.760 Oy.
01:18:07.980 We're open to ideas there.
01:18:10.720 I mean, John definitely thinks that we have to have more viewpoint diversity on campus.
01:18:16.500 You have to know someone smart who totally disagrees with the reigning orthodoxy.
01:18:21.040 I have to tell you, no university, well, maybe a couple, but no university would allow me to teach media.
01:18:37.980 Now, why?
01:18:40.400 Yeah.
01:18:40.780 You know, if you have a wealth of experience... I'm not saying that I should, but if you're a conservative, there's no way you get on campus.
01:18:50.620 Yeah.
01:18:50.880 So how is that going to happen?
01:18:53.160 Yeah.
01:18:53.420 And it has to be that people value it, and right now they don't.
01:18:55.140 If you start actually taking viewpoint diversity seriously, like Haidt has really been trying to get them to... and there are 2,000 members of Heterodox Academy, which is not bad for a new organization.
01:19:05.620 But if it's just an echo chamber, you produce dumber and dumber ideas.
01:19:12.000 If you have nobody to say, that actually sounds pretty goofy.
01:19:15.360 Can I ask you a question?
01:19:16.220 How is it that the people who have tenure to protect ideas...
01:19:25.620 Oh yeah.
01:19:26.100 Ideas.
01:19:26.700 Yeah.
01:19:27.160 The people who know the story of Galileo.
01:19:31.820 Yeah.
01:19:32.700 How is it that they haven't realized that they've become the church?
01:19:36.400 This is something that has just been a huge disappointment to me, because in theory, tenure makes perfect sense to me.
01:19:42.440 But other than some really notable exceptions, great people like Alan Charles Kors, who I mentioned before, or Robbie George, or for that matter his friend Cornel West, there are a lot of professors who just don't stand up, even though they have the best-protected jobs in the universe, pretty much.
01:19:59.380 They don't stand up for the rights of students or for their fellow faculty members when they get in trouble.
01:20:05.140 And it's like, how much more protection do you actually want?
01:20:08.400 So unfortunately, these tenured professors could actually be a force for good in some of these situations, but they're just not.
01:20:17.620 That's too bad.
01:20:18.020 And sometimes... I love Robbie George and Cornel West.
01:20:21.620 I love that.
01:20:22.620 And those guys are a force.
01:20:24.340 Right.
01:20:24.640 And Peter Singer.
01:20:26.580 I love the way they do it; that's the way it should be.
01:20:29.340 Yeah.
01:20:29.580 I want to hear Peter Singer and Robbie George talk about the ethics of life.
01:20:34.120 Yeah.
01:20:34.540 That's what I want.
01:20:35.440 Yep.
01:20:35.820 And those are absolutely amazing talks.
01:20:38.940 With Heather Mac Donald, I actually realized that we totally agreed on one thing, which was: everybody used to listen to the Great Courses, and maybe one way to get a good, cheap education would be to have someone listen to all the lectures, read some of the books, and take a test at the end.
01:20:52.420 That might actually serve you a little better than some of the courses I took.
01:20:58.460 Yeah.
01:20:59.540 Thank you.
01:21:00.300 Such a pleasure.
01:21:01.300 Thank you.
01:21:07.180 Just a reminder: I'd love for you to rate and subscribe to the podcast, and pass it on to a friend so it can be discovered by other people.