Making Sense - Sam Harris - May 18, 2020


#204 — A Conversation with Jonathan Haidt


Episode Stats

Length

1 hour and 5 minutes

Words per Minute

181.29454

Word Count

11,887

Sentence Count

263

Misogynist Sentences

12

Hate Speech Sentences

17
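The words-per-minute figure above is just the word count divided by the episode length in minutes. As a quick sanity check of the listed stats (a sketch, not part of the episode page):

```python
# Sanity-check the episode stats: length in minutes = word count / words-per-minute.
word_count = 11_887
words_per_minute = 181.29454  # value listed above

length_minutes = word_count / words_per_minute
print(round(length_minutes, 2))  # about 65.57, consistent with "1 hour and 5 minutes"
```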


Summary

Jonathan Haidt joins me in this episode for a two-part conversation. In part one, we discuss the COVID-19 pandemic and the political polarization it has exposed and amplified: the sorting of the parties by values, the collapse of trusted sources of information, the outrage machine of social media and cable news, and the replication crisis in psychology. In part two, we turn to our mutual interest in self-transcendence, the nature of consciousness, and the methods we have both used to explore that terrain, psychedelics and meditation among them. Jonathan Haidt is a social psychologist at NYU's Stern School of Business, a co-founder of Heterodox Academy, and the New York Times bestselling author of The Righteous Mind: Why Good People Are Divided by Politics and Religion.


Transcript

00:00:00.000 Welcome to the Making Sense Podcast.
00:00:08.460 This is Sam Harris.
00:00:10.380 Just a note to say that if you're hearing this,
00:00:12.680 you are not currently on our subscriber feed
00:00:14.740 and will only be hearing partial episodes of the podcast.
00:00:18.320 If you'd like access to full episodes,
00:00:20.480 you'll need to subscribe at samharris.org.
00:00:22.960 There you'll find our private RSS feed
00:00:24.980 to add to your favorite podcatcher,
00:00:27.140 along with other subscriber-only content.
00:00:30.000 And, as always, I never want money to be the reason
00:00:32.900 why someone can't listen to the podcast.
00:00:34.960 So if you can't afford a subscription,
00:00:36.960 there's an option at samharris.org to request a free account,
00:00:40.240 and we grant 100% of those requests.
00:00:42.680 No questions asked.
00:00:47.040 I am here with Jonathan Haidt.
00:00:49.100 John, thanks for joining me again.
00:00:51.120 My pleasure, Sam.
00:00:52.620 So, we're planning to do a two-part conversation here,
00:00:56.140 starting with the topic that has been omnipresent
00:01:01.160 and on everybody's mind for now some months,
00:01:04.940 which is the COVID-19 pandemic.
00:01:07.900 And, you know, I wanted to talk to you about that
00:01:10.140 just because of, you know, your expertise in social psychology
00:01:14.720 and the way in which it's informing,
00:01:17.820 or should be informing,
00:01:18.740 our view of political polarization,
00:01:22.760 the fraying of society's concerns about social cohesion,
00:01:27.760 and everything that is a kind of knock-on effect
00:01:31.920 of the, or a potential knock-on effect
00:01:34.120 of the immediate concern here,
00:01:36.300 which is epidemiological and economic.
00:01:39.640 And so we'll dive into that.
00:01:40.880 And then in the second half,
00:01:41.800 I thought we could talk about our mutual interest
00:01:44.300 in self-transcendence
00:01:47.000 and the nature of consciousness
00:01:48.500 and the kinds of methods people have used
00:01:52.000 and you and I have both used
00:01:53.640 to explore that terrain,
00:01:55.960 psychedelics and meditation being two.
00:01:58.940 So we'll just,
00:01:59.740 this will be kind of a two-chapter conversation,
00:02:02.040 and I'm looking forward to it.
00:02:03.840 So, but before we begin, John,
00:02:06.100 just perhaps summarize your background briefly
00:02:10.420 in terms of your kind of intellectual life
00:02:13.800 as it relates to psychology and politics, perhaps.
00:02:19.280 Yeah, so I think in a lot of ways,
00:02:20.840 I started out on a very similar path to you.
00:02:24.580 I was a philosophy major in college
00:02:27.220 and I wanted to understand the meaning of life.
00:02:29.080 And then I went to graduate school in psychology
00:02:31.500 and I shifted over to social psychology
00:02:34.320 and morality and emotion.
00:02:36.260 And I began studying how morality varies across cultures,
00:02:39.500 but as the American culture war heated up,
00:02:41.600 I shifted over to looking at left-right
00:02:43.840 as being like different cultures.
00:02:45.640 So I started studying political polarization
00:02:47.160 back in 2004.
00:02:49.860 And boy, is that a stock whose value has risen.
00:02:52.780 I mean, it's just reached insane valuations right now.
00:02:55.900 So that's what I've been studying.
00:02:57.840 And during, so I actually got into it
00:03:00.420 in part to help the Democrats win.
00:03:02.300 I was so upset that the Democrats in 2000, 2004
00:03:05.520 just had no idea how to talk about morality.
00:03:08.600 But as I began to write The Righteous Mind,
00:03:10.860 I really started reading conservative ideas
00:03:14.220 and intellectuals and discovering
00:03:16.060 that there are actually a lot of ideas out there
00:03:17.700 that I didn't know.
00:03:18.620 And it's very valuable to hear other sides.
00:03:20.960 I kind of stepped out from being on a team.
00:03:23.200 And since then, I've really just been trying
00:03:24.880 to help everyone understand across the divide.
00:03:27.740 And I'm extremely alarmed about our democracy
00:03:29.920 and its health.
00:03:31.520 So that's what I've been working on
00:03:32.660 for the last 10 or 15 years, especially,
00:03:35.040 is how do we help people understand
00:03:36.840 all the different moral matrices that they live in
00:03:39.680 and thereby turn down some of the anger
00:03:42.100 and make it possible to have pragmatic solutions
00:03:45.100 of the sort that a democracy should be able to reach?
00:03:47.840 Yeah, and you were one of the earliest people
00:03:51.460 on some of these points.
00:03:52.860 You might have been the first person
00:03:54.120 to signal just how dysfunctional
00:03:57.400 the ivory tower's view of the political landscape has been.
00:04:02.260 I mean, so it's just natural within the academy
00:04:05.520 to have a level of political bias
00:04:07.960 that just would be starkly dysfunctional anywhere else,
00:04:12.800 which guarantees an echo chamber effect.
00:04:16.300 And you were, you know, very early on
00:04:19.080 talking about how a lack of diversity of ideas
00:04:22.440 was really socially and intellectually problematic.
00:04:26.100 And so you started the Heterodox Academy
00:04:28.640 to shine more light on that.
00:04:31.080 Do you want to say something about that?
00:04:32.600 Yeah, sure, because it's very well connected
00:04:33.860 to what we'll be talking about today.
00:04:36.140 So once I stepped out of the matrix
00:04:39.320 and stopped being a member of a team,
00:04:41.520 fighting the other team,
00:04:43.080 and, you know, just started being
00:04:44.940 just a social scientist trying to figure out
00:04:46.840 what the hell's going on,
00:04:48.380 I started noticing not just that we lean left.
00:04:51.420 That isn't a problem.
00:04:52.440 The field can function, even if it leans,
00:04:54.260 you know, two or three, if you've got two or three times
00:04:56.280 as many people on the left as the right,
00:04:58.800 that's not a problem,
00:04:59.680 and it wouldn't be a problem to reverse either.
00:05:02.300 We don't need balance.
00:05:03.340 What we need is a complete absence of orthodoxy.
00:05:07.960 So orthodoxy, you know, means that if you dissent,
00:05:10.920 you will be punished.
00:05:12.620 And, you know, that's fine if your goal is cohesion.
00:05:15.720 You know, if you're an army marching into battle,
00:05:17.780 maybe that's fine.
00:05:18.760 But if you're a scientist seeking the truth,
00:05:21.640 you know, anybody who's read John Stuart Mill knows
00:05:23.800 he who knows only his own side of the case
00:05:26.220 knows little of that.
00:05:27.980 So that's what alarmed me.
00:05:28.980 As soon as I started looking at the polarization
00:05:31.000 in the country, I saw it happening
00:05:32.780 in my own field of psychology
00:05:34.580 and saw it happening in most of the social sciences
00:05:38.020 and humanities.
00:05:39.460 And I could see orthodoxy.
00:05:41.100 I could see bad social science thinking.
00:05:43.200 And I started getting alarmed by it.
00:05:44.940 I gave a talk in 2011 on how this was a problem
00:05:47.840 for social psychology.
00:05:49.460 And to my field's credit, I didn't suffer.
00:05:51.620 Nobody, you know, I wasn't kicked out.
00:05:53.020 People didn't get angry at me.
00:05:54.340 People generally agreed it's a problem,
00:05:56.140 but it's been hard to really change things.
00:05:59.420 And that's what Heterodox Academy is trying to do.
00:06:02.120 Well, it was a problem way back then.
00:06:04.900 But in 2016, the reckoning really seemed to happen
00:06:09.760 because what we witnessed there was a country
00:06:12.760 divided along seams we had seen before,
00:06:16.580 this heartland revulsion against the coasts
00:06:20.940 and against the cosmopolitanism and elitism
00:06:24.340 or perceived elitism of big cities
00:06:27.840 and, you know, their liberal inhabitants.
00:06:31.040 And Trump managed to magnify that divide
00:06:35.400 to a degree that I still think we're trying
00:06:37.540 to grapple with what happened there
00:06:39.340 and trying not to repeat the same psychological experiment
00:06:42.980 over the next six months.
00:06:44.840 And then I should also say that now the pandemic
00:06:47.480 has somehow, you know, if it were possible
00:06:50.140 to amplify that dynamic, it has.
00:06:52.840 That's right.
00:06:53.440 So how are you viewing the current moment
00:06:55.420 and what this quasi-quarantine has done
00:06:59.720 to further expose this intellectual
00:07:02.860 and tribal schism in the country?
00:07:04.840 Yeah, so to understand where we are,
00:07:07.220 you have to go back at least to the,
00:07:09.680 well, let's go all the way back to the 1950s and 60s
00:07:12.680 when America was pretty unpolarized.
00:07:15.540 The post-war world was an unusually,
00:07:18.060 historically it was quite unusual,
00:07:19.960 the mid-20th century
00:07:20.780 and having very low levels of polarization.
00:07:23.340 There were liberal Republicans,
00:07:24.940 there were conservative Democrats.
00:07:27.000 And for a variety of reasons in the 70s
00:07:29.720 and especially in the 80s,
00:07:31.180 we began to see almost like, you know,
00:07:33.080 tectonic plates moving around,
00:07:35.080 we began to have one party
00:07:36.580 that had psychological progressives
00:07:39.320 and one party that had psychological conservatives.
00:07:42.320 So before then, things were all scrambled
00:07:44.520 and, you know, rural people were often Democrats
00:07:47.240 and the Democrats were the party of the working man.
00:07:49.760 There was a lot of mixing and matching
00:07:51.520 and there was the possibility
00:07:52.420 of bipartisan legislation.
00:07:53.900 A lot of legislation was bipartisan back then.
00:07:56.240 But for a variety of reasons,
00:07:57.400 we start getting sorting into types of people
00:08:00.860 who are sorting by values.
00:08:03.140 I think Ronald Reagan put together a coalition
00:08:04.780 that was not just economic pro-business,
00:08:07.060 it was also with the Christians
00:08:08.840 and religious right and family values.
00:08:11.900 And this is much more dangerous
00:08:12.940 because if you have coalitions based on interests,
00:08:15.880 well, you can make deals, you can trade off.
00:08:18.780 But when you have coalitions
00:08:19.860 based on personality types that share values,
00:08:23.360 well, now the other side is evil.
00:08:25.480 They are bad people.
00:08:26.780 And as the parties increasingly then became more purified
00:08:30.780 in terms of density, that is, if it's, you know,
00:08:33.500 if there's a lot of people per square mile,
00:08:35.640 it votes Democrat.
00:08:36.920 And if there's a few people per square mile,
00:08:38.260 it votes Republican.
00:08:39.200 And also very alarmingly by race
00:08:41.420 as the Republican Party is becoming more
00:08:44.020 of the party of white people.
00:08:46.020 These splits are very dangerous.
00:08:48.240 So I'm extremely alarmed.
00:08:50.620 I was extremely alarmed even back around 2010, 2012.
00:08:54.120 And it is so much worse now.
00:08:56.540 And then there's the media environment.
00:08:57.800 We can get into that later perhaps,
00:08:59.100 but changes in social media between 2009 and 2011
00:09:02.320 gave us much more of an outrage machine,
00:09:05.380 adding on to cable TV,
00:09:06.980 which has been causing problems for a while as well.
00:09:09.880 So it was really, the table was really set
00:09:11.940 for an election in which reality had little grip
00:09:16.200 on a lot of people and passions, anger, fury,
00:09:20.300 gripped a lot of people.
00:09:21.440 I mean, it's basically straight out
00:09:22.740 of the Federalist Papers where Madison writes
00:09:25.920 about faction and the human tendency to faction.
00:09:29.240 If we hate the other side so much,
00:09:30.900 we don't care about the common good.
00:09:33.340 And there was a lot of anger in the 2016 election.
00:09:36.020 Had there not been so much anger,
00:09:37.300 had we not been so polarized,
00:09:38.420 there's no way Donald Trump could have gotten elected.
00:09:40.600 So I think everyone needs to think,
00:09:42.380 whatever side you're on,
00:09:43.140 if you care about the country,
00:09:44.680 we need to figure out what do we do about this?
00:09:46.860 How can we turn things down in the future?
00:09:49.100 Yeah, and the information piece is crucial here.
00:09:53.120 The fact that people can so successfully silo themselves
00:09:56.860 and pre-stigmatize other sources of information
00:10:00.740 or messages that they don't want to hear.
00:10:03.580 I mean, there's a level of confirmation bias
00:10:05.280 and just allergy to data that doesn't fit your narrative
00:10:10.280 and conspiracy thinking that doesn't even recognize
00:10:14.340 that it's conspiracy thinking
00:10:15.680 in terms of just the public conversation
00:10:18.000 we're having with one another and failing to have.
00:10:21.400 There's something unrecognizable about this.
00:10:23.280 I don't know if that's just some kind of delusion
00:10:26.000 that I've acquired based on it being delivered
00:10:29.660 through new channels like social media
00:10:31.740 or if it's some recency effect
00:10:33.640 or if I'm just getting older.
00:10:34.880 But to some degree, it's even ramped up
00:10:36.640 in the context of this pandemic
00:10:39.140 where I see otherwise very smart, rational people,
00:10:45.140 i.e. not the usual tinfoil hat crowd,
00:10:49.040 succumbing to degrees of motivated reasoning
00:10:52.400 without apology and without apparent bandwidth
00:10:56.140 to check themselves ever
00:10:58.260 and proving completely unsusceptible to argument.
00:11:02.760 It's just like there are no universally trusted sources
00:11:08.340 of information that can resolve disagreements
00:11:11.360 at this point, it seems.
00:11:12.880 Well, that's right, because you have to see people
00:11:15.960 not as creatures seeking information,
00:11:18.360 but as social creatures enmeshed in games of competition
00:11:22.500 or war or conflict.
00:11:24.000 And when the conflict level is low
00:11:27.160 and you put us in the right circumstances
00:11:30.240 and institutions, we actually can find the truth.
00:11:33.480 And that's the magic of a university.
00:11:35.020 That's the magic of science.
00:11:36.720 But imagine a scientific field
00:11:38.500 in which suddenly let's take
00:11:40.680 all the normal dynamics of science
00:11:42.180 and then let's put a lot of money in
00:11:44.000 so that there's a huge amount of money riding
00:11:46.660 on whether you get this discovery or patent.
00:11:51.040 Well, that would corrupt things.
00:11:52.220 And of course, that has happened to some degree
00:11:54.180 in medicine, in some areas.
00:11:57.000 In the social sciences, money doesn't play much of a role,
00:11:59.120 but politics does.
00:12:00.480 And so you get, as tribal passions
00:12:03.180 and hatred of the other political party rises,
00:12:07.180 you get the same kind of corrupting dynamic there.
00:12:10.380 So I do think it is a theme of the 2010s
00:12:12.960 and I suppose of the 2020s,
00:12:15.120 that it is actually getting harder to find the truth
00:12:17.860 than it was 20 or 30 years ago, I believe.
00:12:20.440 That is, despite the,
00:12:22.560 obviously some kinds of facts and truth
00:12:24.800 are just fantastically easy.
00:12:26.300 I mean, I'm very grateful for Google and the internet.
00:12:29.140 Obviously, many things are getting better.
00:12:30.960 But anything that is politically or morally tagged
00:12:34.940 so that one side wants to believe
00:12:36.640 and the other doesn't,
00:12:38.040 in some ways it is now harder to find the truth
00:12:40.280 than it used to be.
00:12:42.300 At least that's what I'm coming to see.
00:12:44.660 In my own field in psychology,
00:12:45.980 we've had this replication crisis.
00:12:47.540 And so this is a different mechanism,
00:12:49.740 but we used to think that,
00:12:51.420 when I was in grad school,
00:12:52.500 we learned that correlational studies
00:12:53.620 are not very reliable,
00:12:55.260 but experiments, wow, that's the gold standard.
00:12:57.600 If it's a random assignment, double blind,
00:13:00.820 boy, that tells you what caused what.
00:13:04.420 But now we're finding that even a lot
00:13:05.740 of our experiments don't replicate.
00:13:08.220 And so I think the attitude we have to take
00:13:10.120 into the 2020s is a lot more humility.
00:13:13.200 We simply don't know what the truth is,
00:13:15.440 no matter how fervently we believe we do.
00:13:17.860 And I imagine you're quite familiar
00:13:19.640 with that kind of a mindset and issues of faith,
00:13:22.540 but it infects all of us.
00:13:24.820 And I'm hopeful that this virus,
00:13:26.880 this pandemic has humbled everyone
00:13:28.540 because we were pretty much all wrong
00:13:29.940 about a lot of things.
00:13:31.420 We're still wrong about most things,
00:13:32.640 or many things probably.
00:13:34.920 Yeah, this has been an interesting
00:13:36.020 ordeal of epistemology, really, this pandemic.
00:13:39.520 So we've been dealing with patently unreliable information,
00:13:44.160 rumors leaking out of China,
00:13:46.680 and then the overt attempt to suppress those rumors
00:13:51.140 or a message against them
00:13:53.280 by a communist regime
00:13:55.860 that has every reason to worry about
00:13:58.560 the perception of it in the world.
00:14:00.220 And then all of the tribal spin
00:14:05.100 given to that circumstance by our own politics,
00:14:08.120 we have a completely deranged president
00:14:10.640 who is concerned about the stock market
00:14:13.880 and its effect on his chances of re-election.
00:14:17.440 We have a personality cult
00:14:19.560 amplifying every one of his errant ideas.
00:14:23.760 But then we have just all these different vested interests
00:14:28.420 and people without much political partisanship
00:14:34.640 exposed to very different
00:14:37.320 or likely very different outcomes
00:14:39.720 with respect to the single variable
00:14:41.840 of deciding to lock down society, right?
00:14:43.740 So you have people whose businesses
00:14:44.760 can still be maintained once we lock things down,
00:14:47.900 and then some of them even improve, right?
00:14:51.000 And then you have people for whom
00:14:52.740 every aspect of economic life
00:14:55.360 is going to grind to a halt.
00:14:57.020 And these people may,
00:14:58.420 on either side of this divide,
00:14:59.640 they may be equally reasonable
00:15:01.540 and equally respectful of science,
00:15:03.680 and yet you can see the consequences
00:15:05.200 of your economic concerns
00:15:07.380 trimming down your ability to think clearly
00:15:09.740 about what the data is suggesting
00:15:11.840 at any time point.
00:15:13.940 It's been very interesting to witness.
00:15:16.120 I mean, I continue to believe
00:15:17.300 that at every point along the way,
00:15:20.780 even when we,
00:15:21.780 the truth is we still don't know
00:15:23.580 how lethal this disease is.
00:15:25.760 That's right.
00:15:26.300 But we, at every point along the way,
00:15:28.240 it has been prudent
00:15:29.080 to try to stop the spread of the contagion,
00:15:33.840 to spare our healthcare system,
00:15:36.040 because we could see what was happening
00:15:37.400 in Italy and other countries,
00:15:39.680 and to use the time we were thereby gaining
00:15:45.480 for ourselves to ramp up testing
00:15:48.280 and our ability to trace and isolate cases
00:15:51.460 and to understand the virus
00:15:53.240 and obviously develop therapeutics
00:15:55.000 and ultimately a vaccine.
00:15:56.960 Now, we have proven surprisingly inept
00:16:00.560 at using the time well,
00:16:02.220 and that's something we have to figure out
00:16:04.380 how to improve and understand,
00:16:06.820 you know, going forward.
00:16:07.560 But it has always seemed prudent,
00:16:10.020 even given the absolutely predictable
00:16:12.940 economic costs,
00:16:14.780 to err on the side of caution here
00:16:17.680 because at every point along the way,
00:16:20.300 this has seemed considerably worse
00:16:22.420 than the flu.
00:16:23.160 I mean, the analogies to the flu
00:16:24.540 have always seemed inaccurate.
00:16:26.860 And the question is,
00:16:27.760 how much worse is this than the flu?
00:16:30.040 And reasonable people can debate that.
00:16:32.200 So, for instance,
00:16:32.900 there are very prominent people
00:16:33.980 who are making claims like hospitals
00:16:36.660 are coding more or less
00:16:39.480 every conceivable death
00:16:41.560 as a COVID death.
00:16:43.280 So the mortality statistics
00:16:44.560 are completely fake, right?
00:16:46.460 Now, whether this is,
00:16:47.760 I'm sure that that's happened
00:16:49.800 in a few places,
00:16:51.520 but this is either a
00:16:53.240 very dangerous conspiracy theory
00:16:55.360 or something we have to get
00:16:56.440 to the bottom of immediately.
00:16:58.560 And it's very hard to tell, right?
00:17:00.980 I mean, you can't figure this out
00:17:02.300 in two hours.
00:17:03.620 And who would you trust
00:17:04.560 to put this claim to rest or not?
00:17:07.880 The New York Times
00:17:08.840 isn't good enough, apparently.
00:17:11.040 So I don't know how you think
00:17:12.460 about how we move forward
00:17:14.160 in this space where
00:17:15.100 there are very few trusted
00:17:17.100 gatekeepers of information.
00:17:19.260 And the disparity between
00:17:21.240 believing one thing
00:17:22.680 and believing its antithesis
00:17:24.240 is enormous.
00:17:26.080 That's right.
00:17:26.920 So I think,
00:17:27.820 so I'll go with you
00:17:28.660 on your analysis
00:17:29.260 on the first few weeks
00:17:31.040 or a month or two of this,
00:17:32.740 which is that as long as
00:17:34.580 we didn't know much
00:17:35.780 about this thing,
00:17:36.720 we didn't know
00:17:37.280 what the death rate was,
00:17:38.400 it could be 3%, 6%.
00:17:40.440 And for God's sakes,
00:17:41.940 our doctors didn't even have masks.
00:17:44.460 So I think there was no,
00:17:45.900 there was really no dispute
00:17:46.780 that we had to do lockdowns
00:17:48.340 at first when we just didn't know
00:17:49.740 what was going on
00:17:50.360 and we could not deal with it.
00:17:51.460 We had no idea
00:17:52.180 where the high water mark would be.
00:17:53.980 You know, and I'm sitting here
00:17:55.060 in Manhattan
00:17:55.760 where everything is peaceful
00:17:57.740 and the streets are quiet,
00:17:59.080 but, you know,
00:17:59.500 it was pandemonium
00:18:00.340 in the hospitals
00:18:01.000 and we had no idea
00:18:02.020 how high the wave of death
00:18:04.000 was going to crest.
00:18:05.240 And I think to their credit,
00:18:06.120 Americans actually really did
00:18:08.100 accept that.
00:18:09.080 I mean, Americans really did,
00:18:10.380 you know, I was surprised
00:18:11.140 that I think in those first weeks
00:18:12.520 we actually did get,
00:18:13.960 you know, obviously not like
00:18:15.180 they did in China
00:18:15.900 or other places
00:18:16.440 using a lot more force,
00:18:17.780 but Americans did go along with it.
00:18:19.140 And the surveys still show
00:18:20.400 that most people support that.
00:18:21.680 But once we got through
00:18:23.560 that first wave
00:18:24.580 with enormous economic cost,
00:18:27.280 which is also a personal cost,
00:18:29.300 now I think there are
00:18:31.080 at least real alternative views
00:18:33.240 that need to be discussed.
00:18:34.220 And if we had some sort
00:18:35.800 of a reasonable,
00:18:37.080 rational media system,
00:18:39.560 reasonable democracy
00:18:41.240 with reasonable discourse norms,
00:18:43.160 we could actually do it.
00:18:44.540 What I mean is,
00:18:45.200 especially, say, the Sweden model,
00:18:46.460 it is at least reasonable
00:18:47.720 to say, okay, you know,
00:18:48.700 they're doing it differently
00:18:49.440 in different countries.
00:18:50.140 Well, let's look, how does it work?
00:18:52.040 You know, do they get immunity faster?
00:18:54.820 So as long as there was
00:18:55.420 so much unknown,
00:18:56.780 it actually would be really important
00:18:58.880 to listen to the other side,
00:19:00.360 to listen to critics.
00:19:01.880 And that's the way that
00:19:03.620 we all get smarter
00:19:04.520 is by having
00:19:05.560 our confirmation biases challenged.
00:19:07.840 So I'm a big fan of that.
00:19:09.960 Now, unfortunately,
00:19:10.700 we live in this crazy fun house,
00:19:13.820 mad house,
00:19:14.880 in which, as you said,
00:19:16.060 there are national interests
00:19:17.440 trying to distort things.
00:19:18.580 There are, you know,
00:19:19.660 Russian operatives
00:19:21.800 trying to, you know,
00:19:22.840 use rumors to divide us.
00:19:25.040 We have a president who,
00:19:27.040 when George W. Bush,
00:19:28.900 you know,
00:19:29.500 gave a call for us
00:19:31.460 to come together,
00:19:32.280 it was, you know,
00:19:32.660 a beautiful call
00:19:33.680 from a former president
00:19:34.640 and for Trump
00:19:35.700 to attack him on the spot.
00:19:37.800 That, to me,
00:19:38.220 was one of the several
00:19:39.280 just horrible low points
00:19:40.740 of this whole thing.
00:19:42.600 It also just shows us
00:19:43.320 how far from normal politics
00:19:45.020 we've wandered,
00:19:46.420 because, you know,
00:19:47.000 here we have
00:19:47.820 a current Republican president
00:19:49.300 vilifying a previous
00:19:51.240 Republican president
00:19:52.240 who was making nothing more
00:19:54.320 than a call
00:19:54.960 for national unity
00:19:56.000 and a transcendence
00:19:57.080 of partisanship.
00:19:58.380 And the current president
00:19:59.480 can't even transcend
00:20:00.580 his own thin-skinned concern.
00:20:02.900 No, that's right.
00:20:03.420 I know.
00:20:03.640 When that happened,
00:20:04.360 I didn't get angry at all.
00:20:05.600 I was laughing.
00:20:06.440 It's like,
00:20:06.700 oh my,
00:20:06.940 this cannot be happening.
00:20:08.380 This cannot be happening.
00:20:09.800 So, no,
00:20:10.160 we are so far beyond,
00:20:11.840 we're just so deep
00:20:12.600 into the absurd.
00:20:13.920 And so, yeah,
00:20:14.280 that's what we have to figure out.
00:20:15.060 Let me just put one distinction
00:20:16.260 on the table
00:20:16.900 is most Americans
00:20:18.920 are pretty reasonable.
00:20:20.440 Most Americans
00:20:21.000 are not that polarized.
00:20:22.840 You have to distinguish
00:20:23.980 between the average
00:20:25.080 and the,
00:20:26.800 sort of the dynamics
00:20:29.120 of the system.
00:20:30.520 And so,
00:20:30.960 let's take,
00:20:31.320 just to take one example
00:20:32.120 on a college campus,
00:20:33.940 most students
00:20:34.340 are pretty reasonable,
00:20:35.640 but we've been,
00:20:36.980 because of social media
00:20:37.680 and other things,
00:20:38.740 you know,
00:20:38.980 the people who will use
00:20:41.060 social media
00:20:41.720 or mount protests
00:20:43.520 can have
00:20:44.500 a disproportionate voice.
00:20:46.660 Same thing
00:20:47.160 in a democracy.
00:20:48.800 There's wonderful work
00:20:49.560 by a group
00:20:49.920 called More in Common,
00:20:51.300 a British organization
00:20:52.660 that surveyed America.
00:20:53.720 They've done
00:20:53.940 really wonderful work
00:20:54.820 on studying polarization
00:20:56.200 in the United States.
00:20:57.500 They find that
00:20:58.440 Americans fall into
00:20:59.980 about seven different groups
00:21:01.240 based on their
00:21:02.020 political attitudes.
00:21:03.820 And four of the groups,
00:21:05.940 which is a large majority,
00:21:07.540 they call the exhausted majority.
00:21:09.620 And these are people
00:21:10.840 who are quite reasonable.
00:21:11.920 Two of the groups
00:21:12.380 are on the left.
00:21:12.940 One is centrists.
00:21:14.020 One is people
00:21:14.660 who are just disengaged.
00:21:16.280 So most Americans,
00:21:18.060 you know,
00:21:18.360 you can't blame
00:21:20.200 most Americans.
00:21:21.940 But because of the nature
00:21:24.000 of social media,
00:21:25.180 the nature of Congress,
00:21:26.340 the nature of cable news,
00:21:29.240 various people
00:21:30.000 have megaphones
00:21:31.140 that are pursuing
00:21:32.260 either commercial interests
00:21:33.680 or ideological interests.
00:21:35.960 And so you get absurdities,
00:21:37.920 well, like Fox News
00:21:38.920 saying,
00:21:39.620 you know,
00:21:40.240 that remdesivir is bad
00:21:42.320 and chloroquine is good.
00:21:43.480 And this is after
00:21:44.020 the scientific studies
00:21:44.940 have come out
00:21:45.420 showing the reverse.
00:21:47.000 So what I'm saying is
00:21:48.100 don't give up on Americans,
00:21:49.820 but it's almost time
00:21:51.160 to give up on
00:21:51.880 the system
00:21:53.080 or the network
00:21:53.780 that we have.
00:21:54.880 And by give up,
00:21:55.400 I don't mean
00:21:55.780 that there's no hope.
00:21:56.320 I just mean like,
00:21:57.620 man,
00:21:58.200 we can't just go back
00:21:59.420 to normal.
00:21:59.840 We've got to dig deep,
00:22:01.040 figure out what's wrong
00:22:01.920 and fix this
00:22:03.420 so that this becomes
00:22:04.300 the bottoming out,
00:22:05.260 that 2020 becomes,
00:22:06.520 you know,
00:22:07.020 the worst year
00:22:07.740 in a long time
00:22:08.560 and it's something
00:22:09.940 changes by the end
00:22:10.880 of this decade.
00:22:12.320 How do you view
00:22:12.880 the next,
00:22:14.220 let's say six months?
00:22:15.040 So the next six months
00:22:15.900 is overshadowed
00:22:18.060 entirely by the 2020
00:22:19.620 presidential election,
00:22:20.960 right?
00:22:21.100 It's just going to be
00:22:22.260 politics all the time.
00:22:24.520 When it's not pandemic,
00:22:25.940 it'll be politics.
00:22:27.020 Yep.
00:22:27.400 And we,
00:22:28.740 you know,
00:22:29.000 obviously don't know
00:22:29.660 how much the economy
00:22:31.120 is going to unravel
00:22:32.080 in the meantime,
00:22:33.060 but it seems like
00:22:34.680 it's poised to unravel
00:22:37.320 to an impressive degree.
00:22:39.380 I mean,
00:22:39.540 we're certainly flirting
00:22:41.040 with a real depression
00:22:43.100 if, you know,
00:22:44.400 joblessness numbers
00:22:45.260 are any indication.
00:22:46.920 And again,
00:22:48.360 the most hopeful
00:22:50.540 predictions for a vaccine,
00:22:53.080 which is really
00:22:53.660 the only thing
00:22:54.660 that will fully reset
00:22:56.400 our circumstances
00:22:57.480 with respect to public health,
00:22:59.300 nothing arrives
00:23:00.260 before,
00:23:00.940 you know,
00:23:01.180 something like,
00:23:02.820 it would be a miracle
00:23:03.520 if it arrived in January.
00:23:04.680 Right?
00:23:05.200 And even that is,
00:23:07.200 very few people
00:23:08.000 are imagining
00:23:08.860 that it's sooner
00:23:10.200 than a year from now.
00:23:12.060 And again,
00:23:13.520 we've got to remind ourselves
00:23:14.360 of how amazing
00:23:15.700 that would be.
00:23:16.380 I think the fastest vaccine
00:23:17.920 we've ever developed
00:23:19.060 was four years
00:23:20.420 for the mumps
00:23:21.020 and the average
00:23:22.020 is 15 years.
00:23:23.180 One year would be,
00:23:24.140 you know,
00:23:24.400 a massive breakthrough.
00:23:26.160 And let's say
00:23:27.320 we improve on that
00:23:28.620 and we get a vaccine
00:23:30.460 by January.
00:23:31.500 Still,
00:23:31.960 we have this period
00:23:33.000 where not just
00:23:34.420 our country,
00:23:35.640 but the entire world
00:23:36.980 has been pitched
00:23:37.840 into a circumstance
00:23:38.660 of real uncertainty
00:23:40.820 financially,
00:23:42.780 economically,
00:23:44.080 and I think
00:23:45.180 as a result,
00:23:46.400 politically.
00:23:47.420 How are you viewing
00:23:48.360 the next six months?
00:23:50.040 And,
00:23:50.800 I mean,
00:23:50.940 there's just so many
00:23:51.520 concerns on the table.
00:23:52.420 Like,
00:23:52.580 how do we even have
00:23:53.320 a safe election,
00:23:54.500 right?
00:23:54.800 If we can't vote by mail,
00:23:56.380 right?
00:23:56.580 How do we get people
00:23:57.320 to actually vote?
00:23:59.020 What are you thinking about
00:23:59.800 for the next six months?
00:24:01.560 So,
00:24:01.780 you know,
00:24:02.020 I completely agree
00:24:02.780 that it's going to be
00:24:03.500 all pandemic
00:24:04.900 and Trump
00:24:05.920 all the time
00:24:07.140 with just sideshows
00:24:08.500 over Biden
00:24:09.240 and other things
00:24:10.520 here and there.
00:24:11.360 So,
00:24:11.780 there's no chance
00:24:12.440 of the fever breaking
00:24:13.300 until after the election.
00:24:15.880 I'm certainly hoping
00:24:16.880 that Trump
00:24:17.340 is not re-elected.
00:24:19.440 I think that,
00:24:20.280 you know,
00:24:20.700 as many people said,
00:24:21.740 oh,
00:24:21.840 well,
00:24:22.020 you know,
00:24:22.180 there are adults
00:24:22.640 in the room.
00:24:23.320 In the first year or two,
00:24:24.100 there were many good people
00:24:25.220 in government
00:24:26.020 and I think
00:24:27.300 there are not so many
00:24:28.680 of them
00:24:29.680 at the upper level
00:24:30.260 anymore.
00:24:31.080 So,
00:24:31.720 the point is that
00:24:32.380 the craziness
00:24:33.560 we've seen
00:24:34.160 in the last year or two
00:24:35.260 would get even worse.
00:24:37.980 So,
00:24:38.160 I think that if
00:24:38.900 Trump is re-elected,
00:24:40.040 I think the damage
00:24:41.540 to our democracy
00:24:42.480 and our reputation
00:24:43.940 in the world,
00:24:44.480 our standing in the world,
00:24:47.380 I'm terrified
00:24:47.380 to think what would happen.
00:24:49.480 If Biden wins
00:24:51.600 or,
00:24:52.020 you know,
00:24:52.540 there could be some route
00:24:53.360 in which he's not the nominee
00:24:54.380 or,
00:24:54.740 you know,
00:24:54.860 who knows what's going to happen.
00:24:56.380 But if Biden wins,
00:24:58.280 it would be great
00:24:59.360 if we had a bold
00:25:01.500 and inspiring leader
00:25:02.900 and,
00:25:03.500 you know,
00:25:04.120 I'm not expecting
00:25:04.940 that Biden will rise
00:25:05.740 to that level,
00:25:06.460 but,
00:25:06.900 you know,
00:25:07.040 who knows.
00:25:08.520 There is a,
00:25:09.100 of course,
00:25:09.480 there's a chance
00:25:10.000 for a reset
00:25:11.400 of a lot of things.
00:25:12.480 It's very hard to predict
00:25:13.420 how things would play out.
00:25:15.600 The one,
00:25:16.340 the one thing
00:25:16.780 I would question
00:25:17.280 what you said
00:25:17.860 is you say
00:25:18.480 nobody is predicting
00:25:19.280 a vaccine
00:25:19.800 for a very long time.
00:25:20.880 Yes,
00:25:21.020 that's true.
00:25:21.920 But,
00:25:22.440 you know,
00:25:22.600 this is one of those things
00:25:23.500 like we've been told
00:25:24.140 a lot of things
00:25:26.160 about the virus,
00:25:27.160 like don't wear masks
00:25:28.180 and,
00:25:28.520 you know,
00:25:28.700 wash your hands
00:25:29.340 for 20 seconds.
00:25:30.560 And it turns out
00:25:31.300 a lot of that
00:25:32.020 was either wrong
00:25:32.760 or at least not based
00:25:33.520 on evidence.
00:25:35.160 It is true
00:25:35.580 that experts tell us
00:25:36.900 it's likely to be
00:25:37.460 a long time
00:25:38.060 and you're right
00:25:38.640 that no vaccine
00:25:39.280 has ever been invented
00:25:40.000 that quickly.
00:25:40.680 But,
00:25:41.000 you know,
00:25:41.200 there's a hundred,
00:25:42.320 I just saw on the news
00:25:43.020 the other day,
00:25:43.460 there's a hundred
00:25:44.200 vaccines that are in development
00:25:45.860 and three or four of them
00:25:46.680 are going into clinical trials now.
00:25:48.540 And,
00:25:48.720 of course,
00:25:50.040 there's no way
00:25:50.040 we're going to follow
00:25:50.640 the old protocol
00:25:51.300 where we,
00:25:51.880 you know,
00:25:52.460 inoculate a lot of people
00:25:53.580 and wait a year
00:25:54.220 to see how many got sick.
00:25:55.400 No,
00:25:55.520 we're going to do
00:25:55.840 challenge trials.
00:25:56.640 People are going to
00:25:57.040 volunteer like crazy
00:25:58.120 to be infected
00:25:59.280 with the virus
00:25:59.780 to see if they have
00:26:00.420 the immunity.
00:26:00.820 So,
00:26:01.260 I just raise this
00:26:02.060 as just one example
00:26:03.740 of how a lot of things
00:26:05.360 that are put forth
00:26:06.060 as facts about this,
00:26:07.360 you have to at least
00:26:08.080 actively look
00:26:08.960 and say,
00:26:09.900 okay,
00:26:10.140 is this really a fact?
00:26:11.160 Do we really know this?
00:26:12.600 And,
00:26:12.820 you know,
00:26:12.980 under what scenarios
00:26:13.680 might this not come true?
00:26:15.240 And,
00:26:15.420 of course,
00:26:15.700 if,
00:26:16.080 suppose one of these,
00:26:16.840 you know,
00:26:16.920 there's one just starting
00:26:17.480 at NYU,
00:26:18.100 just,
00:26:18.280 I saw it on the news
00:26:18.820 on Friday.
00:26:19.380 They're injecting,
00:26:21.620 well,
00:26:21.820 they're giving the vaccine
00:26:22.480 to people this week
00:26:23.600 and then they'll start
00:26:24.780 exposing them
00:26:25.480 or some of them,
00:26:26.040 I think.
00:26:26.460 I'm not sure
00:26:26.820 what the plan is exactly,
00:26:28.060 but they'll have an answer
00:26:29.020 within a couple months.
00:26:30.400 And so,
00:26:30.820 let's just suppose
00:26:31.560 it works.
00:26:32.760 Well,
00:26:33.020 that would really
00:26:33.800 change everything
00:26:34.660 and in a way
00:26:35.780 that I think,
00:26:36.300 obviously,
00:26:37.000 could greatly benefit
00:26:38.140 Trump.
00:26:39.260 What I'm hoping,
00:26:40.400 you know,
00:26:40.760 presidents,
00:26:41.440 leaders often get a bump
00:26:42.400 because of a crisis.
00:26:43.780 Trump got hardly any,
00:26:45.200 but,
00:26:45.840 you know,
00:26:46.180 it's the incompetence,
00:26:47.160 which is what I'm hoping
00:26:48.160 will turn off
00:26:49.460 the middle of the country.
00:26:50.780 It's the bumbling
00:26:51.620 incompetence
00:26:52.440 that I think
00:26:53.580 is likely to be powerful
00:26:54.880 for a lot of people
00:26:56.280 who are not part
00:26:56.920 of his base.
00:26:58.580 But if somehow,
00:26:59.840 you know,
00:27:00.140 if there's a scientific
00:27:01.260 breakthrough
00:27:01.760 and the vaccine
00:27:02.840 comes quickly,
00:27:03.920 a lot of people
00:27:04.500 will say,
00:27:04.860 see,
00:27:05.160 it's just like Trump said,
00:27:06.220 it'll just magically
00:27:06.880 go away.
00:27:07.940 So,
00:27:08.240 you know,
00:27:08.400 I just think
00:27:09.160 it's very hard to game out
00:27:10.180 how things are going to play out
00:27:11.400 both scientifically
00:27:12.280 and economically.
00:27:15.000 Yeah,
00:27:15.180 yeah,
00:27:15.380 I would place a bet
00:27:16.920 on what seems to be
00:27:17.980 the pervasive incompetence
00:27:19.540 at the moment.
00:27:20.160 I mean,
00:27:20.240 just because even if we had
00:27:21.520 a vaccine today
00:27:22.840 that we knew worked,
00:27:24.840 we have to roll that out
00:27:26.540 to,
00:27:26.920 you know,
00:27:27.220 in our case,
00:27:27.900 350 million people
00:27:29.200 and our struggle
00:27:31.140 to even get
00:27:32.120 testing going
00:27:33.720 is instructive.
00:27:35.920 You know,
00:27:36.160 so you just imagine
00:27:36.860 having to produce
00:27:37.780 the vials of vaccine
00:27:40.240 and if this is an injectable,
00:27:41.880 right,
00:27:42.120 as opposed to something
00:27:43.160 that you can inhale.
00:27:45.020 It's daunting.
00:27:45.860 Yeah,
00:27:46.020 but you know,
00:27:46.280 look,
00:27:46.480 it could be invented
00:27:47.680 in China.
00:27:48.400 You know,
00:27:48.520 we're all assuming
00:27:49.120 that it's going to be
00:27:49.580 invented by Americans,
00:27:51.160 but you know,
00:27:51.560 there's a lot going on
00:27:52.320 in Europe,
00:27:52.820 in Israel,
00:27:53.400 in China.
00:27:54.920 So,
00:27:55.560 then all the more reason
00:27:56.260 to worry that we're not
00:27:57.160 first in line
00:27:58.020 to get it,
00:27:58.840 right?
00:27:59.000 That's right.
00:27:59.860 Well,
00:28:00.220 so I don't know how
00:28:01.820 in the weeds
00:28:03.040 you've gotten
00:28:03.620 with Trump supporters.
00:28:06.680 I've commented on this
00:28:07.680 in several places
00:28:08.860 and the thing about
00:28:09.880 the Trump phenomenon
00:28:10.720 that has been
00:28:11.980 most mystifying to me
00:28:13.380 is that among
00:28:15.220 his supporters
00:28:16.280 and not even people
00:28:17.800 who are unsophisticated,
00:28:20.280 even people who,
00:28:21.340 I'm just,
00:28:21.900 I'm surprised to even
00:28:22.780 discover they are,
00:28:24.420 they did support him
00:28:25.480 at all.
00:28:26.380 What I find
00:28:27.660 that is truly mystifying
00:28:30.140 and really
00:28:30.840 just confounds
00:28:32.540 any effort
00:28:33.280 to have a reasonable
00:28:34.460 conversation about
00:28:35.680 politics
00:28:36.380 is a
00:28:37.800 total unwillingness
00:28:39.300 to admit
00:28:40.020 that there's anything
00:28:41.540 consequentially wrong
00:28:43.680 with him.
00:28:44.420 Yeah.
00:28:44.920 That his lack of
00:28:46.480 understanding
00:28:47.620 of complex issues,
00:28:49.260 that his
00:28:49.800 bluster,
00:28:51.100 his dishonesty,
00:28:52.620 that any of this
00:28:53.860 is in any way
00:28:55.460 negative,
00:28:57.300 right?
00:28:57.980 What I feel like
00:28:59.040 I meet
00:28:59.480 in trying to convince
00:29:01.140 Trump supporters
00:29:02.240 of anything
00:29:02.800 is just an
00:29:03.920 absolute stonewalling
00:29:05.860 on points
00:29:07.520 that just seem
00:29:08.600 objectively true
00:29:10.140 and my noticing
00:29:11.240 them
00:29:11.620 is not at all
00:29:12.860 a sign of my own
00:29:13.860 partisanship.
00:29:14.600 Just to say that
00:29:15.580 Trump lies
00:29:16.660 more than is
00:29:18.280 normal
00:29:18.760 in a politician.
00:29:19.600 That is as
00:29:21.500 objectively true
00:29:22.560 as the
00:29:23.040 Pythagorean theorem.
00:29:24.600 There's just no
00:29:25.320 possibility
00:29:26.200 of debating that
00:29:27.660 and yet
00:29:28.200 even that
00:29:29.440 will not be
00:29:30.120 conceded
00:29:30.800 or if conceded
00:29:32.500 there'll be some
00:29:33.600 assertion that it
00:29:34.480 just doesn't matter.
00:29:36.060 All politicians
00:29:36.740 lie
00:29:37.280 is the mantra
00:29:38.520 you will reliably
00:29:39.720 hear at that
00:29:40.740 juncture in the
00:29:41.440 conversation
00:29:41.880 and there's
00:29:43.460 something like
00:29:43.880 a hundred points
00:29:44.740 like that.
00:29:45.940 It's hard to
00:29:46.440 understand
00:29:47.080 what is
00:29:48.860 at the root
00:29:49.980 of it
00:29:50.260 because
00:29:50.700 this is not
00:29:51.620 an ordinary
00:29:52.180 form of
00:29:52.700 tribalism.
00:29:53.460 This is not
00:29:54.020 like members
00:29:55.540 of the Christian
00:29:56.700 right who are
00:29:57.760 Christian fundamentalists
00:29:58.800 and they have a
00:29:59.400 whole worldview
00:30:00.160 organized around
00:30:01.180 there having
00:30:02.240 grown up
00:30:02.820 evangelical
00:30:03.380 or whatever
00:30:03.860 and now
00:30:04.240 they're voting
00:30:04.820 for whoever
00:30:06.120 it is
00:30:06.480 George Bush
00:30:07.040 because he's
00:30:08.320 on their side
00:30:09.240 and he's going
00:30:09.660 to put in
00:30:10.020 the right
00:30:10.280 conservative judges
00:30:11.120 and block
00:30:11.580 abortion.
00:30:12.200 It's not part
00:30:13.280 of a whole
00:30:13.780 system of belief
00:30:14.720 like that.
00:30:15.260 It's just
00:30:15.820 in many cases
00:30:17.080 the only thing
00:30:18.160 that seems
00:30:18.840 to be
00:30:19.320 organizing it
00:30:20.360 ideologically
00:30:20.940 is a
00:30:22.320 revulsion
00:30:23.040 at the
00:30:25.000 status quo
00:30:25.820 that was
00:30:27.220 repudiated
00:30:28.000 in the
00:30:28.420 2016 election.
00:30:30.520 The business
00:30:31.040 as usual
00:30:31.600 that Hillary
00:30:32.220 Clinton
00:30:32.580 represented
00:30:33.220 we don't
00:30:34.140 want any
00:30:34.480 more of
00:30:34.740 that
00:30:35.060 and also
00:30:36.060 we probably
00:30:36.440 don't want
00:30:36.680 to pay
00:30:36.880 any more
00:30:37.180 in taxes
00:30:37.680 and you
00:30:38.380 get those
00:30:38.760 two variables
00:30:39.660 clattering
00:30:40.920 around a
00:30:41.620 person's
00:30:42.000 brain
00:30:42.320 and it
00:30:43.280 has
00:30:44.240 summed
00:30:45.180 to something
00:30:45.820 like a
00:30:47.120 cultic
00:30:48.080 unwillingness
00:30:49.060 to admit
00:30:49.540 the obvious
00:30:50.240 just across
00:30:51.100 the board
00:30:51.620 whenever the
00:30:52.180 conversation
00:30:52.660 turns to
00:30:53.400 Trump.
00:30:54.160 Yeah, so let
00:30:55.160 me give you
00:30:56.640 a handy
00:30:57.240 little
00:30:57.500 psychological
00:30:58.000 tool that
00:30:59.880 can
00:31:00.440 explain this.
00:31:01.140 So there's
00:31:02.640 wonderful
00:31:03.660 research by
00:31:04.200 Tom Gilovich
00:31:04.940 at Cornell
00:31:05.580 who studied
00:31:06.760 motivated reasoning
00:31:07.540 and I got
00:31:08.300 this little
00:31:08.600 formula from
00:31:09.160 him. He says,
00:31:09.880 when we
00:31:11.260 want to
00:31:11.800 believe
00:31:12.100 something
00:31:12.580 we don't
00:31:13.460 look at
00:31:13.740 the evidence
00:31:14.180 and say
00:31:14.620 is the
00:31:14.940 evidence
00:31:15.260 mostly on
00:31:15.940 the side
00:31:16.720 that I
00:31:17.040 want to
00:31:17.300 believe
00:31:17.580 we just
00:31:18.360 say can
00:31:19.700 I believe
00:31:20.200 it do I
00:31:20.560 have permission
00:31:20.980 to believe
00:31:21.460 it meaning
00:31:21.820 can I find
00:31:22.760 one example
00:31:23.840 one argument
00:31:24.840 one piece
00:31:25.320 of evidence
00:31:25.800 and if I
00:31:26.620 can I'm
00:31:27.400 done I
00:31:27.840 stop thinking
00:31:28.300 because if
00:31:28.960 someone challenges
00:31:29.560 me I can
00:31:30.340 point to this
00:31:30.800 piece of
00:31:31.100 evidence
00:31:31.460 whereas if
00:31:32.660 you don't
00:31:32.840 want to
00:31:33.020 believe
00:31:33.160 something you
00:31:33.660 say must
00:31:34.420 I believe
00:31:35.000 it am I
00:31:35.340 forced to
00:31:36.020 believe it
00:31:36.500 so I've
00:31:37.020 had the same
00:31:37.340 experience as
00:31:38.060 you, you know,
00:31:39.880 with a lot
00:31:40.220 of people
00:31:40.560 on the
00:31:40.820 left and
00:31:41.140 the right
00:31:41.420 and I
00:31:41.600 have some
00:31:41.840 very smart
00:31:42.460 correspondents
00:31:43.520 who are
00:31:43.760 Trump supporters
00:31:44.460 and I've
00:31:45.300 had that
00:31:45.520 exact debate
00:31:46.340 with them
00:31:46.680 about whether
00:31:47.400 there's
00:31:47.820 something wrong
00:31:49.200 with him
00:31:49.660 and you
00:31:50.360 know the
00:31:50.700 psychologists
00:31:51.400 the psychiatrists
00:31:52.540 say the
00:31:52.940 most likely
00:31:53.480 most likely
00:31:55.660 narcissistic
00:31:56.260 personality
00:31:56.840 disorder
00:31:57.360 he makes
00:31:58.500 everything about
00:31:59.220 him and
00:32:00.160 you and I
00:32:00.620 think that
00:32:00.980 that's as
00:32:01.340 objective a
00:32:01.960 fact as
00:32:02.460 the sun
00:32:02.820 rises in
00:32:03.320 the east
00:32:03.620 that he
00:32:03.880 does that
00:32:04.220 more than
00:32:04.700 other people
00:32:05.180 but once
00:32:06.400 you understand
00:32:06.820 that everybody's
00:32:07.680 asking not
00:32:08.640 is it true
00:32:09.480 but must I
00:32:10.200 believe it
00:32:10.780 well the
00:32:11.440 answer is
00:32:11.980 always no
00:32:12.900 there's almost
00:32:13.820 nothing that you
00:32:14.480 have to believe
00:32:15.260 certainly not
00:32:16.720 anything about
00:32:17.260 politics or
00:32:18.040 anything that
00:32:18.380 can't be
00:32:18.700 measured
00:32:19.500 precisely, and
00:32:20.920 with no
00:32:22.620 questioning
00:32:23.100 about what the
00:32:23.100 rules are so
00:32:24.260 you and I can
00:32:24.940 point to well
00:32:25.860 look at the
00:32:26.200 fact checkers
00:32:26.860 they find you
00:32:27.400 know 10,000
00:32:28.300 errors well you
00:32:29.860 know the Trump
00:32:30.920 supporters will
00:32:31.400 simply point out
00:32:32.140 that those fact
00:32:33.520 checkers work
00:32:34.140 for, you
00:32:34.780 know, the
00:32:34.960 Washington
00:32:35.640 Post or
00:32:36.460 Snopes or
00:32:37.500 other places
00:32:37.920 that have
00:32:38.200 known left
00:32:38.680 wing biases
00:32:39.760 and they're
00:32:40.140 right so it's
00:32:41.400 very hard to
00:32:42.060 get at the
00:32:42.600 truth and you
00:32:43.600 know I think
00:32:44.400 of course there
00:32:44.820 is a truth but
00:32:46.220 when Trump
00:32:47.040 supporters ask
00:32:47.720 must I
00:32:48.300 believe it the
00:32:49.260 answer is
00:32:49.560 always no and
00:32:51.360 one of the
00:32:52.000 best ways to
00:32:52.620 get a little
00:32:53.200 bit more
00:32:53.560 humility here
00:32:54.300 and calm down
00:32:55.420 the anger a
00:32:55.940 little bit is
00:32:57.140 to say just
00:32:58.260 always turn it
00:32:58.780 around and
00:32:59.360 say is there
00:33:00.080 some different
00:33:00.820 issue on
00:33:01.940 which my side
00:33:02.760 is just as
00:33:03.340 obtuse and
00:33:04.940 you know I
00:33:05.820 think people
00:33:06.280 on the right
00:33:07.120 would point out
00:33:07.820 that well you
00:33:08.380 know people
00:33:09.180 on the left
00:33:09.740 pretty much
00:33:10.160 anything about
00:33:10.860 race and
00:33:11.320 gender and
00:33:12.300 LGBT and
00:33:13.480 you know
00:33:13.740 immigration I
00:33:14.460 mean there's
00:33:14.660 all these
00:33:15.280 issues that
00:33:16.680 are sacred
00:33:17.240 issues on
00:33:18.000 the left at
00:33:18.620 least in my
00:33:19.040 part of the
00:33:19.660 left and
00:33:20.020 you know
00:33:20.320 universities
00:33:20.900 but as you
00:33:22.020 know I
00:33:22.400 spend a lot
00:33:22.980 of time
00:33:23.240 hammering the
00:33:23.840 left for
00:33:24.340 its
00:33:24.600 yeah
00:33:24.840 yeah
00:33:25.440 I admire
00:33:25.740 your guts
00:33:27.560 yes yes
00:33:28.420 so I get
00:33:29.340 it from both
00:33:29.720 sides yet
00:33:30.940 I mean when
00:33:31.580 you just look
00:33:32.260 at at
00:33:32.760 the way in
00:33:33.580 which we
00:33:33.980 have shed
00:33:35.000 influence in
00:33:36.000 the world
00:33:36.520 in the last
00:33:37.620 few years
00:33:38.160 where we
00:33:38.920 have just
00:33:39.240 by turns
00:33:40.060 terrified
00:33:41.360 our allies
00:33:43.040 and gratified
00:33:44.340 our actual
00:33:45.740 adversaries
00:33:46.660 it's just
00:33:47.400 yeah it's
00:33:48.320 mind-boggling
00:33:48.920 that you
00:33:49.580 have a
00:33:49.900 something like
00:33:51.040 40 percent
00:33:52.020 of American
00:33:52.480 society that
00:33:53.640 sees absolutely
00:33:54.980 no problem
00:33:55.720 with this
00:33:56.060 I mean, worse,
00:33:56.480 they see this
00:33:57.480 as some
00:33:57.860 form of
00:33:58.220 progress
00:33:58.760 yeah
00:33:59.280 Okay, so
00:34:00.320 here's
00:34:00.760 the metaphor
00:34:01.120 that's helped
00:34:01.620 me understand
00:34:02.260 the otherwise
00:34:04.020 just unfathomable
00:34:05.220 state of our
00:34:06.300 country now
00:34:07.340 so I began
00:34:08.540 to feel
00:34:09.200 around 2014
00:34:10.120 2015
00:34:10.660 that something
00:34:11.500 was deeply
00:34:12.200 wrong
00:34:12.660 like
00:34:13.520 something has
00:34:15.320 changed about
00:34:15.980 the universe
00:34:16.520 and I played
00:34:18.440 with this
00:34:18.720 I just had
00:34:19.160 this uncomfortable
00:34:19.860 feeling for
00:34:20.780 a couple of
00:34:21.460 years
00:34:21.700 and finally
00:34:22.240 a year or two
00:34:22.780 ago I started
00:34:23.280 working this
00:34:24.220 metaphor into
00:34:24.980 my talks.
00:34:25.580 Suppose that
00:34:26.860 God one day
00:34:28.580 just doubled
00:34:29.560 the gravitational
00:34:30.200 constant
00:34:30.700 so you know
00:34:31.740 in our
00:34:32.000 universe
00:34:32.280 there's like
00:34:32.620 25 physical
00:34:33.940 constants
00:34:34.600 you know
00:34:35.140 the mass of
00:34:35.660 an electron
00:34:36.040 things like
00:34:36.540 that
00:34:36.820 and if
00:34:37.720 God just
00:34:38.160 said one
00:34:38.580 day let's
00:34:38.840 just double
00:34:39.280 the gravitational
00:34:39.920 constant
00:34:40.340 just for
00:34:40.720 fun
00:34:41.040 like everything
00:34:42.080 would go
00:34:42.420 totally haywire
00:34:43.300 in the physical
00:34:44.180 world and you
00:34:44.720 know planets
00:34:45.180 would move
00:34:45.600 in their orbits
00:34:46.180 and planes
00:34:46.860 would come
00:34:47.220 out of the sky
00:34:47.900 and it would
00:34:48.420 just be you
00:34:48.960 know bizarre
00:34:49.560 and disastrous
00:34:50.260 and I think
00:34:52.780 that what
00:34:53.120 happened is
00:34:54.000 basically that
00:34:55.080 but in the
00:34:55.540 social world
00:34:56.280 and that is
00:34:57.380 you know
00:34:58.080 connectivity
00:34:58.500 is generally
00:34:59.580 good but
00:35:00.580 we're now
00:35:00.920 hyperconnected
00:35:02.080 that's changing
00:35:03.060 a basic
00:35:03.440 parameter of
00:35:04.200 the universe
00:35:04.680 we're so
00:35:05.200 connected
00:35:05.660 but it's more
00:35:06.680 than that
00:35:06.980 it's not just
00:35:07.540 you know,
00:35:08.000 like
00:35:08.820 giving
00:35:09.980 us telephones
00:35:10.720 and email
00:35:11.220 I mean we've
00:35:11.640 been getting
00:35:11.920 more and more
00:35:12.280 connected for
00:35:12.940 centuries
00:35:13.500 and that's
00:35:14.200 generally been
00:35:14.500 a good thing
00:35:14.980 it's the nature
00:35:16.060 of the connectivity
00:35:16.820 it's connectivity
00:35:18.000 in which we
00:35:18.800 are communicating
00:35:19.580 not privately
00:35:20.980 but in front
00:35:22.140 of an audience
00:35:22.760 and the audience
00:35:23.760 rates the
00:35:24.560 communication
00:35:25.140 so this I
00:35:26.340 think is what
00:35:26.680 social media
00:35:27.180 has done to
00:35:27.960 us
00:35:28.300 that is when
00:35:29.400 Facebook and
00:35:30.180 MySpace came
00:35:31.500 out it was
00:35:32.720 just you know
00:35:33.660 look here's my
00:35:34.420 page here are
00:35:35.340 all my friends
00:35:36.080 here are the
00:35:36.440 bands I like
00:35:37.380 you know there's
00:35:38.040 some showing off
00:35:38.700 but it wasn't
00:35:39.260 toxic and it
00:35:39.880 was not bad
00:35:40.420 for democracy
00:35:41.180 I have an
00:35:42.580 article in the
00:35:42.980 Atlantic last
00:35:43.820 November with
00:35:44.300 Tobias Rose-Stockwell, where
00:35:45.480 we show how
00:35:46.160 beginning in
00:35:47.020 2009 when
00:35:48.400 Facebook added
00:35:49.120 the like button
00:35:50.040 and then Twitter
00:35:51.060 copied it and
00:35:51.720 Twitter added the
00:35:52.460 retweet button
00:35:53.220 and Facebook
00:35:53.720 copied it and
00:35:55.000 then they both
00:35:55.520 algorithmicized
00:35:56.360 their news feeds
00:35:57.320 much more so
00:35:58.260 between 2009
00:35:59.120 and 2012 the
00:36:00.960 nature of human
00:36:01.740 connectivity changed
00:36:03.320 radically in ways
00:36:05.040 that I think are
00:36:05.580 very very bad for
00:36:06.660 democracy that is
00:36:08.100 it wasn't just that
00:36:09.120 we could now talk
00:36:09.760 to each other
00:36:10.180 privately for free
00:36:11.120 it's that a lot
00:36:12.620 more of our
00:36:13.080 conversation was
00:36:13.840 now in public
00:36:14.860 being rated, which
00:36:15.840 means it was
00:36:16.300 inauthentic often
00:36:17.420 dishonest and with
00:36:19.000 a lot more
00:36:19.680 intimidation you
00:36:21.020 know I hate
00:36:21.580 Twitter I hate
00:36:22.200 going on Twitter
00:36:22.820 I'm also fascinated
00:36:24.520 by it, and, you
00:36:25.000 know,
00:36:25.000 it's like opening
00:36:25.800 a garbage can
00:36:26.520 and watching rats
00:36:27.240 and cockroaches
00:36:27.880 fighting and
00:36:28.460 there's something
00:36:28.800 fascinating about
00:36:29.600 it but things
00:36:30.620 really changed
00:36:31.380 after 2012 and
00:36:32.820 the Russians
00:36:33.220 noticed it and
00:36:34.100 they've been trying
00:36:34.600 to mess with our
00:36:35.020 democracy for 50
00:36:36.140 years. 2014 is
00:36:38.180 when they realized
00:36:38.960 hey there's this
00:36:40.280 great outrage
00:36:41.360 machine that the
00:36:42.080 Americans have
00:36:42.640 built for us, and
00:36:43.820 we don't have
00:36:44.420 to go over there
00:36:45.020 we don't have to
00:36:45.620 fly agents over to
00:36:46.540 mess them up we
00:36:47.160 can just sit here
00:36:47.660 in St.
00:36:47.980 Petersburg and do it.
00:36:48.840 So, you know, I
00:36:51.760 hear your
00:36:52.360 incomprehension, I
00:36:53.420 hear your
00:36:55.160 frustration. Things
00:36:57.120 are terribly wrong
00:36:58.240 and we could
00:37:00.520 blame those Trump
00:37:01.220 supporters we could
00:37:02.060 say they must be
00:37:03.120 insane they must be
00:37:04.420 badly motivated but
00:37:06.040 that's not likely to
00:37:06.840 be true they're
00:37:07.960 likely to be normal
00:37:08.760 human beings and
00:37:10.100 so I think we have
00:37:10.940 to look elsewhere
00:37:11.520 that's why I'm so
00:37:12.360 mystified because you
00:37:13.140 know the people I
00:37:14.100 have in my personal
00:37:14.900 life who are Trump
00:37:16.580 supporters I know
00:37:17.900 to be you know
00:37:18.820 smart and well
00:37:19.640 intentioned and it's
00:37:21.220 just that they're
00:37:22.280 completely aloof with
00:37:23.980 respect to all of
00:37:26.120 the downsides of his
00:37:28.780 personality, and
00:37:30.620 what to my eye are
00:37:31.700 the obvious risks
00:37:32.940 being magnified by
00:37:34.660 those downsides
00:37:35.480 yeah what an amazing
00:37:36.360 species we are that
00:37:37.240 we can believe such
00:37:38.400 obviously false things
00:37:39.760 I think
00:37:41.000 there's some people
00:37:41.580 who've done some
00:37:41.960 work on that
00:37:45.460 Yeah, no, I agree that
00:37:46.580 the style of
00:37:47.480 communication has changed, and
00:37:48.680 it's created an
00:37:50.220 information space
00:37:51.860 where it really is
00:37:53.040 just total war all
00:37:54.620 the time. That's
00:37:55.340 right, in information
00:37:55.840 terms, that's right.
00:37:56.980 Yeah, and that's no
00:37:58.140 way. And I
00:37:58.840 don't think our
00:37:59.180 democracy can survive
00:38:00.120 that I think that
00:38:01.140 if things keep going
00:38:02.300 the way they're
00:38:02.800 going our country is
00:38:04.000 going to fail
00:38:04.460 catastrophically I'm
00:38:05.760 not predicting that
00:38:06.480 it will because I
00:38:07.220 don't think things
00:38:07.860 will keep on going
00:38:08.620 the way they're
00:38:08.940 going but the
00:38:09.780 trends are really
00:38:10.560 bad and they've
00:38:11.120 been really bad
00:38:12.180 for at least 10
00:38:12.880 years, more than
00:38:13.440 that even. So what
00:38:15.040 would you change
00:38:16.240 I mean if you
00:38:16.580 could actually get
00:38:17.560 Jack Dorsey and
00:38:19.000 Mark Zuckerberg and
00:38:20.040 other people to just
00:38:20.980 take your advice
00:38:22.200 what would you
00:38:23.600 change
00:38:24.640 So, yeah, there's all
00:38:26.160 kinds of systems I'd
00:38:26.980 change, including, you
00:38:28.360 know, elections and
00:38:29.480 Congress and all that
00:38:30.220 but if we're going to
00:38:30.620 focus on social media
00:38:31.660 Tobias and I offered a
00:38:33.400 couple of suggestions
00:38:34.380 the most important
00:38:36.300 one the most important
00:38:37.160 single thing that we
00:38:38.220 think needs to change
00:38:39.720 is there has to be
00:38:41.200 some kind of identity
00:38:42.780 verification for our
00:38:44.140 major platforms.
00:38:45.860 We're not saying that
00:38:46.720 you have to post with
00:38:47.580 your real name;
00:38:48.820 we understand that
00:38:49.480 there's often a need
00:38:50.340 to use an
00:38:51.740 avatar or a fake name
00:38:53.200 but imagine but you
00:38:55.600 know if democracy is
00:38:56.600 moving into a virtual
00:38:57.740 public square if
00:38:59.380 what's fundamental to
00:39:00.180 our democracy is how
00:39:00.960 we engage with each
00:39:02.060 other and we're no
00:39:03.460 longer doing that in
00:39:04.280 newspapers and real
00:39:05.180 public squares we're
00:39:05.900 now doing it on
00:39:06.680 Facebook and Twitter
00:39:08.240 and Instagram places
00:39:09.040 like that I think
00:39:09.700 these places have an
00:39:11.000 obligation to create a
00:39:13.100 kind of public square
00:39:14.060 that fosters some
00:39:16.740 sort of understanding
00:39:17.500 some sort of working
00:39:18.400 out and that really
00:39:19.520 cracks down on
00:39:21.060 intimidation it is
00:39:22.440 stunning to me that
00:39:24.000 you can make death
00:39:25.240 threats rape threats
00:39:26.620 racist rants you can
00:39:28.040 say anything you want
00:39:29.360 and the worst that'll
00:39:30.960 happen to you is you
00:39:32.080 know eventually your
00:39:33.540 account will be closed
00:39:34.100 down and then you
00:39:34.600 could just make 10
00:39:35.240 others with no
00:39:36.260 verification and the
00:39:38.040 Russians figured this
00:39:38.760 out long ago and a
00:39:40.420 lot of Americans do it
00:39:41.380 too so if we're
00:39:43.300 serious about having
00:39:44.080 a democracy that has
00:39:45.020 a public square and
00:39:45.880 that public square
00:39:46.520 happens on these
00:39:47.080 platforms I think
00:39:48.380 there has to be at
00:39:49.220 least enough skin in
00:39:51.000 the game enough
00:39:52.720 accountability that
00:39:54.120 when people open an
00:39:55.060 account on
00:39:56.120 Facebook or Twitter
00:39:57.100 Instagram or any of
00:39:57.920 the major platforms
00:39:58.720 there
00:40:00.460 let's suppose it
00:40:01.140 worked like this the
00:40:02.820 platform would send
00:40:03.900 them out to some
00:40:04.680 other entity maybe
00:40:05.640 it's a non-governmental
00:40:06.660 entity a
00:40:08.280 non-profit the
00:40:09.480 internet has a number
00:40:10.160 of those and
00:40:11.300 that entity just
00:40:13.500 verifies that you
00:40:14.880 are a real person
00:40:15.940 associated with a
00:40:16.880 country and that you
00:40:18.620 are over 18 or
00:40:20.500 if you're under 18
00:40:21.680 you know there
00:40:22.360 might be another
00:40:22.960 cutoff like 13 because
00:40:24.320 right now any 11
00:40:25.960 year old can get on
00:40:26.840 any platform that she
00:40:27.980 wants to and that's a
00:40:29.200 whole nother set of
00:40:29.780 issues we can talk
00:40:30.500 about mental health
00:40:31.240 effects on girls and
00:40:32.800 all kinds of other
00:40:33.420 effects on teenagers
00:40:34.400 but I think that's the
00:40:35.720 most important thing is
00:40:36.640 that we have to reduce
00:40:37.500 trolling intimidation
00:40:38.900 you know I don't
00:40:39.740 want to go into a
00:40:40.420 public square where
00:40:41.140 anybody can like you
00:40:43.220 know hit me
00:40:43.220 over the head or throw
00:40:44.020 acid in my face and
00:40:44.900 run away laughing and
00:40:46.180 there's nothing I can
00:40:46.780 do about it that's
00:40:47.940 number one
00:40:48.360 yeah so is there a
00:40:50.240 tension between that
00:40:51.980 and our you know
00:40:53.980 broader concern about
00:40:55.280 free speech I mean
00:40:56.720 obviously these are
00:40:57.300 private platforms and
00:40:58.260 they can regulate speech
00:40:59.320 however they want but
00:41:00.220 given that they're
00:41:01.600 essentially becoming
00:41:03.120 internet infrastructure
00:41:04.480 and they are becoming a
00:41:06.540 kind of public square for
00:41:07.640 which there's no
00:41:08.380 alternative erring
00:41:10.200 on the side of just
00:41:11.920 basically defaulting to
00:41:13.300 the constitution has
00:41:14.980 seemed tempting
00:41:16.940 how do you
00:41:18.160 think about free speech
00:41:19.060 concerns sure so I
00:41:21.280 would hate to live in a
00:41:22.020 country in which if
00:41:22.960 somebody espoused an
00:41:24.720 opinion that somebody
00:41:26.340 else or the government
00:41:27.140 didn't like that that
00:41:28.000 person could be arrested
00:41:28.860 or punished so to me
00:41:30.420 that's the core of free
00:41:31.300 speech there
00:41:31.920 are no thought crimes
00:41:32.740 there are no speech
00:41:33.660 crimes other than
00:41:34.700 obviously intimidation
00:41:35.880 threats there are
00:41:36.420 certain categories that
00:41:37.220 are not constitutionally
00:41:38.120 protected so
00:41:40.220 I don't want
00:41:41.400 a solution in which
00:41:42.140 platforms have to look
00:41:43.600 at what you say and
00:41:44.620 judge each thing you
00:41:46.260 say what I'd rather have is
00:41:48.500 that it's not focused on
00:41:49.880 the thing you say it's
00:41:51.880 focused on the
00:41:54.200 features of the space
00:41:56.360 and so as long as we
00:41:59.220 allow anonymous trolls
00:42:00.720 in well do you have a
00:42:02.160 constitutional right to
00:42:03.200 say whatever you want
00:42:04.020 without anyone knowing
00:42:05.320 who you are I don't
00:42:06.800 think so do you have a
00:42:09.680 right to reach millions
00:42:10.540 of people no you have a
00:42:12.020 right to say what you
00:42:12.740 want without being
00:42:13.360 punished but you know
00:42:14.420 as it's sometimes said
00:42:15.080 freedom of speech does
00:42:16.080 not mean freedom of
00:42:16.960 reach the platforms are
00:42:18.600 under no obligation to
00:42:19.760 let you reach millions of
00:42:21.240 people with claims that
00:42:22.740 chloroquine is a
00:42:24.060 miracle cure
00:42:25.240 that's not free speech so
00:42:27.020 I think just you know
00:42:28.480 these platforms
00:42:30.320 they're not individuals
00:42:31.860 talking you know in the
00:42:33.460 public square and they're
00:42:34.560 not newspapers they're
00:42:36.240 somewhere in between and
00:42:38.020 our law doesn't
00:42:39.000 quite account for that
00:42:39.820 yet but I think just as we
00:42:41.020 have a lot of
00:42:41.700 responsibilities placed on
00:42:42.860 newspapers and magazines
00:42:44.040 I think we need some sort
00:42:45.820 of in-between thing for
00:42:46.940 these platforms and that
00:42:47.900 means no you can't just
00:42:49.300 open a hundred accounts and
00:42:50.640 say whatever you want all
00:42:51.620 day long and attack people
00:42:52.820 without anyone
00:42:54.100 knowing who you are
00:42:54.840 right so now what are your
00:42:57.300 thoughts about the 2020
00:42:59.400 election and you know now
00:43:02.080 the concern about the
00:43:03.940 Biden campaign and his
00:43:05.760 viability yeah really on two
00:43:07.660 fronts I mean so we
00:43:08.600 have the Tara Reid allegations
00:43:10.840 and surrounding those we have
00:43:14.300 this fairly credible charge of
00:43:18.320 hypocrisy against the left
00:43:20.200 because you know we're on the
00:43:21.500 left we're all about me too and
00:43:23.140 believe all women but then the
00:43:26.180 inconvenient woman shows up
00:43:27.800 making fairly shocking claims
00:43:30.600 about the only candidate
00:43:32.840 standing between us and four
00:43:34.360 more years of Trump and what we
00:43:36.180 see is either a massive
00:43:38.720 disinclination to even hear the
00:43:41.360 allegations and once that
00:43:43.720 becomes untenable what we've
00:43:45.200 now seen is an analysis of the
00:43:47.800 allegations that you know does
00:43:49.940 frankly suggest a kind of double
00:43:52.400 standard where you know
00:43:54.320 we could go hard against Brett
00:43:56.000 Kavanaugh when he was nominated
00:43:57.980 for the Supreme Court based on
00:43:59.800 more or less nothing but the
00:44:01.700 fairly dim memory of one person
00:44:04.740 and we're in a similar situation
00:44:07.200 here and behaving rather
00:44:09.880 differently I mean the way I
00:44:11.920 reconcile this you know is just
00:44:14.500 that I think Trump is so dangerous
00:44:18.500 I think four more years of him
00:44:19.860 would be so awful for many of the
00:44:22.220 reasons you mentioned and I do
00:44:24.720 think there's something
00:44:25.740 especially awful about doubling
00:44:27.600 down on Trump for a second term
00:44:30.120 I mean what that says about our
00:44:31.520 country that's right it would
00:44:32.300 validate that it wasn't a fluke we
00:44:34.000 really meant it yeah yeah we know
00:44:36.000 exactly what we're buying here and
00:44:37.680 we're going to buy it again for
00:44:38.700 four more years yeah I mean it's
00:44:40.700 just I don't know how America's
00:44:42.580 standing recovers I mean we
00:44:44.240 literally have to have the Messiah
00:44:46.280 come for 2024 to reboot so
00:44:49.580 given that you know I honestly
00:44:51.920 don't care what is true here I
00:44:55.480 mean it's like I can own that he
00:44:57.980 might have done something
00:44:58.760 absolutely awful which should in a
00:45:01.300 normal world disqualify him for
00:45:03.980 the presidency I don't feel like I
00:45:05.840 know that I don't feel like I don't
00:45:07.260 know that I just feel that whoever
00:45:09.380 Joe Biden is or has been he's better
00:45:12.780 than Trump yeah just his facade of
00:45:15.820 professionalism as a normal politician
00:45:18.240 and a normally empathic person is so
00:45:22.400 much better than what Trump manages
00:45:24.300 to muster as a person that there's
00:45:26.900 really nothing to decide here and for
00:45:28.720 me that seems to skirt hypocrisy I'm
00:45:32.000 not inclined to treat Tara Reid's
00:44:33.820 allegations differently than
00:44:37.080 Blasey Ford's and if that's the apt
00:45:39.240 comparison it's just that the context
00:45:42.140 is so different that in this case they
00:45:44.300 don't matter I consider this a
00:45:46.160 political emergency that only has one
00:45:48.200 adequate resolution which is somebody
00:45:50.120 other than Trump becomes president
00:45:51.820 yeah so without getting into the
00:45:54.040 details I have not been following the
00:45:55.220 story closely enough to have a view
00:45:56.500 about what might have actually
00:45:57.440 happened but the key thing that I
00:45:59.700 would want us to focus on here if
00:46:01.780 you're asking about the implications
00:46:03.180 for the election is enthusiasm passion
00:46:06.340 things like that so Trump won the
00:47:09.380 election in 2016 not because
00:46:12.380 people loved him and wanted him but
00:46:15.140 because we have negative partisanship in
00:47:17.160 this country that is since 2004 we vote
00:47:20.180 more negatively political scientists tell us that
00:47:21.920 you know the strategy for president
00:46:23.920 used to always be you run to the
00:46:25.360 outside to get your party's nomination
00:46:26.760 and then because America is a fairly
00:46:28.660 moderate country you have to run to
00:47:30.300 the center to win in the
00:47:32.080 general and in 2004 Karl Rove
00:46:34.860 correctly calculated that the center
00:46:36.920 had shrunk so much that the key was
00:46:39.200 turnout and so they went with gay
00:46:41.200 marriage to try to inflame the
00:46:43.000 evangelicals and it worked they got
00:46:45.060 higher turnout on the right so since
00:46:47.780 then that has been more of a winning
00:46:49.840 strategy and negative partisanship
00:46:51.880 voting against what you don't want is
00:46:55.400 more powerful than voting for what you
00:46:57.380 do want and that I think explains how
00:47:00.920 Donald Trump was able to win in 2016
00:47:03.260 when it seems as though he didn't even
00:47:05.480 want to win he made no preparations for
00:47:08.100 it he didn't spend any of his own money
00:47:09.420 he didn't campaign that hard so you
00:47:12.660 know Hillary Clinton ran a terrible
00:47:14.100 campaign against someone who
00:47:16.640 wasn't trying to win and was a complete
00:47:18.340 mess and had no ground game and didn't
00:47:20.320 play by the normal rules and even
00:47:24.680 though she won the popular vote he still
00:47:27.220 did win by the recognized rules of the
00:47:29.560 game and that's because her people
00:47:32.980 were not passionate and you know the
00:47:34.820 tone in your voice just before when you
00:47:37.100 were saying why of course you're going
00:47:38.140 to vote for Biden was similar to what a
00:47:40.260 lot of people were saying about Hillary
00:47:41.500 obviously there were very different
00:47:42.740 issues but people weren't passionate
00:47:44.240 about her but they would say well yeah
00:47:45.940 but I mean she's better than the
00:47:47.000 alternative so that is how Trump won
00:47:50.280 you know he should
00:47:51.760 have lost in a landslide but he
00:47:53.980 didn't my fear is that while Biden is
00:47:57.500 not an inspiring candidate I do believe
00:47:59.460 the people who have known him for a long
00:48:00.540 time who say that he's a fundamentally
00:48:02.480 decent man that doesn't mean that he
00:48:05.220 didn't do something inappropriate with a
00:48:06.760 young woman in the Senate I have
00:48:08.120 no idea but there's not a lot of
00:48:12.160 enthusiasm for him but people
00:48:14.360 generally like him Democrats I think
00:48:16.680 were okay with him but a lot of groups
00:48:20.680 were not and that was the big
00:48:20.680 question was you know will the people
00:48:22.920 who wanted Bernie Sanders or Elizabeth
00:48:24.920 Warren you know will they come back to
00:48:27.780 vote for him in the fall and now you add
00:48:30.480 in this which is going to alienate a lot
00:48:33.740 of people particularly women and
00:48:35.540 particularly young women for whom these
00:48:37.020 issues are much more salient these days
00:48:38.760 so I'm extremely concerned about the
00:48:42.840 fall election because
00:48:44.840 you know
00:48:47.620 I was fully expecting the Democrats to
00:48:49.800 win no matter who the nominee was unless
00:48:51.160 it was Sanders I was expecting the
00:48:52.540 Democrats were going to win because of
00:48:55.880 the passion issue but now I don't know I
00:48:58.940 don't know what's going to happen and if
00:49:00.320 a number of
00:49:02.220 constituencies are not enthusiastic
00:49:03.560 they're not going to turn out especially
00:49:05.800 if there are still risks to turning out
00:49:08.580 and especially if mail-in voting is not
00:49:10.460 easy and universal for God's sakes
00:49:13.000 you know during a pandemic of course we
00:49:15.700 should all be voting by mail or by
00:49:17.740 internet or by other remote
00:49:19.280 methods but everything's so politicized
00:49:21.440 and there's so much incompetence that
00:49:22.760 may not happen so I don't know what is
00:49:25.060 going to happen and it's another reason
00:49:26.720 for alarm what about the perception
00:49:29.380 this is the second thing that's dogging
00:49:31.660 Biden the perception of his senescence
00:49:34.800 essentially I mean he's obviously lost a
00:49:37.800 step with respect to his speech and
00:49:39.840 memory and again we're in an environment
00:49:42.740 where there is an asymmetry here with
00:49:46.720 respect to the way his glitches play to
00:49:50.260 the average audience and the way Trump's
00:49:52.400 glitches play I mean Trump is a producer
00:49:54.980 of word salad yeah much if not most of
00:49:58.180 the time and yet it doesn't make him
00:50:00.600 seem old now that's just more Trump
00:50:03.840 right it's like he's got the energy of a
00:50:05.700 20 year old on Adderall yeah so he's full
00:50:09.100 of life and he's just chaos whereas every
00:50:12.520 single glitch every hiccup in his speech
00:50:16.460 for Biden you're holding your breath
00:50:19.300 hoping he can get to the end of the
00:50:20.460 sentence yeah the optics are so different
00:50:22.800 it's surprising and I mean this is the
00:50:25.060 other thing that worries me no that's
00:50:26.520 that's right this is why I was
00:50:28.640 not a fan of Joe Biden I mean I
00:50:32.180 like him personally I agree
00:50:34.020 with you he's you know he's a reasonable
00:50:35.480 person but you know he ran for
00:50:38.360 president twice before he was a bad
00:50:39.740 candidate he's not a good
00:50:42.380 campaigner he's not eloquent and you know
00:50:45.360 as a psychologist what I can add is that
00:50:47.260 the research on cognitive aging is just
00:50:49.920 stunning people are at their peak in terms
00:50:52.420 of fluency and speed in their 20s and then
00:50:56.280 it's kind of downhill from there
00:50:58.460 until you get to your 50s or 60s and then
00:51:01.740 this downward slope accelerates so in your
00:51:03.840 70s it really accelerates so most 70 year
00:51:06.520 olds are still doing okay on cognitive
00:51:08.320 tests although they're not nearly as sharp
00:51:09.700 as they used to be but as you go beyond
00:51:11.780 70 by the time you get to 80 most 80
00:51:14.260 year olds are not doing so well obviously
00:51:15.600 you know some are but if Biden was not a
00:51:18.160 good candidate long ago when his brain
00:51:19.800 was much younger I think you know
00:51:22.380 there's not much reason to think that
00:51:23.600 he's gonna be much better now and I think
00:51:25.900 we're seeing the signs of that so as you
00:51:27.680 say it's also the issue of vitality and
00:51:29.520 that matters in politics people want a
00:51:31.900 vigorous leader not one who seems frail
00:51:34.240 or scattered so for a lot of reasons you
00:51:37.920 know I think that obviously most or many
00:51:40.440 Democrats wish they had a perfect candidate
00:51:42.760 many Democrats think that there were better
00:51:44.880 candidates and with the Tara
00:51:47.820 Reid allegations now you know the
00:51:49.920 candidacy is even weaker so my god is
00:51:52.420 this a drama I mean just when you think
00:51:53.960 it can't get more insane it gets more
00:51:55.880 insane so who do you think he should pick
00:51:58.560 for his VP that you know I don't
00:52:01.960 know I've not given any thought to I
00:52:03.640 imagine that he committed to well I don't
00:52:05.980 know why but you know he committed to
00:52:08.120 picking a woman I suppose knowing that
00:52:10.680 these allegations were coming but
00:52:13.560 once he's done that
00:52:15.680 I'm not a political prognosticator I
00:52:17.720 can't read you know the horse race
00:52:19.780 politics I don't have a view on that
00:52:21.720 part of your analysis of what social
00:52:25.320 media has done to us and the new kind of
00:52:28.940 balkanization of our epistemology
00:52:31.660 you've spent some time focusing
00:52:33.960 on the young I mean you know Gen Z and
00:52:40.080 below I mean now we're soon dealing
00:52:42.240 with a cohort of people for whom
00:52:42.240 social media is as common a fact of the
00:52:45.820 world as water which is to say there's
00:52:48.060 never been a time where they were without
00:52:49.460 it yeah and we're also having a younger
00:52:52.600 generation that seems destined to graduate
00:53:01.620 into an economic environment that is just
00:53:01.620 as objectively punishing as any in our
00:53:06.080 lifetime I mean when you think of
00:53:07.840 what it would be like to be looking for a
00:53:10.620 job in six months unless we reboot here in
00:53:15.100 some way that just ushers in a
00:53:18.440 renaissance of a sort that will be
00:53:20.560 fundamentally surprising it's hard to see
00:53:22.760 how we escape a fairly dismal economy for a
00:53:25.880 good long while how do you think about the
00:53:29.120 cohort you're currently teaching as
00:53:30.760 undergraduates what's the near future hold
00:53:33.100 yeah so paradoxically it
00:53:36.780 could end up in the long run being good for
00:53:39.400 them that is you know clearly it's going to be
00:53:41.720 devastating to their economic prospects in
00:53:44.040 the near term and research on previous
00:53:46.020 generations that graduate into bad economies
00:53:48.060 shows that it does hurt their earnings for the
00:53:49.820 rest of their life on average so I'm not
00:53:51.860 saying this is good overall but the trajectory
00:53:54.680 the outlook for Gen Z was horrific it was
00:53:57.660 terrible their rates of anxiety depression self-harm and
00:54:01.220 suicide have been spiking upwards since 2012
00:54:04.740 roughly especially for well suicide is up for
00:54:07.480 both genders but depression anxiety is
00:54:10.540 especially up for girls and so Greg Lukianoff and I
00:54:13.620 wrote this book the coddling of the American mind
00:54:15.560 and we think the two major causes there are many but
00:54:18.340 the two major causes are the vast overprotection
00:54:20.820 the safetyism that we put on kids in the 90s we stopped
00:54:24.580 letting them play outside we told them the world is
00:54:26.860 dangerous we let them just play with devices
00:54:29.480 inside and the normal risk-taking the normal
00:54:32.320 adventures the normal testing the limits of your
00:54:35.480 physical abilities that we denied them beginning in
00:54:39.620 the 90s and early 2000s and so this we think is the major
00:54:43.340 reason why Gen Z is coming out so much more fragile and
00:54:46.920 depressed and anxious than the Millennials were since we're
00:54:49.840 talking about kids born 1996 and later the other factor we
00:54:53.240 believe is too early exposure to social media and here I
00:54:56.420 actually have some news to report brand new news the
00:54:59.960 long-running debate over screen time I think is actually
00:55:03.400 nearing a resolution that is in in the coddling of the
00:55:06.800 American mind Greg and I focused on social media that's
00:55:08.920 what we thought was worst but we did sometimes refer to
00:55:11.860 screen time or say that parents should limit screen time and
00:55:15.480 some other researchers pushed back on us and said no look you
00:55:18.680 know our evidence is that the amount of hours spent
00:55:21.640 on screens that isn't related to mental illness and then Jean
00:55:25.100 Twenge and I reanalyzed data and are basically able to show
00:55:28.680 that consistently if you look at almost all the data sets that
00:55:31.840 show no overall effect of all screen time well if you dig in and
00:55:36.880 you say okay not all screens including TV but rather just social
00:55:41.000 media and not all kids but just girls then you consistently find a
00:55:47.080 relationship between heavy social media use and depression and
00:55:50.300 anxiety and it shows up in lots of data sets lots of different
00:55:53.340 studies and experiments back this up that when people go off of
00:55:56.580 social media they tend to get happier so anyway all I'm saying is I
00:55:59.900 don't think parents need to freak out about screens per se if what
00:56:03.960 they're concerned about is depression anxiety but they should
00:56:06.540 still look out if what they care about is that their kids actually
00:56:09.600 do other things like go outside or learn to climb trees or go out
00:56:13.200 with their friends in person which of course will happen again
00:56:15.780 someday but not this year yeah well so what do you do with the fact
00:56:18.540 that now a concern about the dangers even invisible dangers out in
00:56:24.780 the world seems all too warranted right so now we have a cohort of
00:56:28.580 kids I mean I've got two daughters six and eleven who are now
00:56:33.020 quarantined and having a fairly unusual experience I mean they're
00:56:37.060 happily our limitations on screen time have been impressively relaxed so
00:56:42.460 they're enjoying that but tell me about social
00:56:45.420 media is your 11 year old on Instagram no no good I mean I'm
00:56:50.580 going to be as conservative as can be achieved on that front but there are
00:56:56.820 elements of it that are starting to leak into her experience now just because
00:57:00.740 of yeah the classroom is on you know Zoom and they have common projects where
00:57:05.540 they're commenting on each other's work and so they're texting and so
00:57:08.860 there's communication in front of an audience happening you know a fair
00:57:12.420 amount and how that differs from social media really is just that it's not open
00:57:17.540 to the rest of society it's just her among her friends but even there it just
00:57:23.140 seems to me like a whole new module has been installed in her brain which is you
00:57:28.400 know her attention is being captured by somebody else's response to something she
00:57:33.040 put out there and you know that has many of the features that would
00:57:36.700 concern one about social media yeah that's right so to the extent that
00:57:40.400 screens foster direct face-to-face interaction talking on the phone by
00:57:45.940 FaceTime that's all great there's no problem at all there I actually bought my
00:57:50.160 son an Xbox when this all hit he wanted one for a long time and the research
00:57:56.340 doesn't seem to show that it's related to anxiety depression although it is very
00:57:59.940 addictive and it does tend to fill up all the available time so he has three hours a
00:58:03.840 day on Xbox but it's great that he you know he's really playing it with his
00:58:07.200 friends so to the extent that these devices facilitate real direct
00:58:11.480 interactions that's great but yes as you say the problem is a lot of these are
00:58:15.580 indirect interactions where people are rating and commenting and that seems to
00:58:18.520 be especially hard on girls so this could get worse but here's
00:58:23.700 where I think things could get reset there is actually danger out there now
00:58:27.780 because of the virus now not that much for kids but it's a physical thing whereas what
00:58:32.600 we were getting to before this hit was emotional safety we were treating kids
00:58:37.220 as though they were so fragile that if they were exposed to bad news they
00:58:41.940 would somehow be damaged and what I'm hoping is that this pandemic will
00:58:48.100 reset some of our safetyism and move us away from sort of the trivial things we've
00:58:52.180 been looking at the effort to protect kids' self-esteem the effort to
00:58:56.100 protect them from words and ideas so having more adversity in your childhood
00:59:02.440 could end up being beneficial and this is the idea of anti-fragility which is
00:59:06.460 really central to our book the word was coined by Nassim Taleb you know lots of
00:59:11.080 people have many views about him but I'll just say that
00:59:16.320 I think that idea is a good one yeah I should say he has views about many
00:59:21.260 people too so yes I've noticed but I don't want to miss this one
00:59:25.920 point what you just said suggests to me there's another trap to fall into here
00:59:29.740 which is yeah overprotection if I'm trying to curate just go back to what you
00:59:35.100 just asked me with respect to my allowing my daughter the social media experience I
00:59:41.760 mean one the impulse there is to protect her from the onslaught of negative or you
00:59:49.120 know destabilizing or anxiety producing information that I don't want her to have
00:59:53.780 and it seems to me there are two potential pitfalls there one is just this is
00:59:59.040 another form of coddling right I'm trying to protect her from something that she
01:00:02.700 should develop the tools to just assimilate or one could say that and then
01:00:06.940 there's just this other feature which I think is natural to worry about which is
01:00:10.560 if all of her friends are on Instagram and she's the one who isn't well then
01:00:15.880 there's just a social exclusion penalty that you would imagine a young teen would
01:00:20.640 pay yeah so to take your first point it does seem as though I might be
01:00:24.680 contradicting myself I'm saying that in general kids should be exposed to
01:00:28.540 adversity they should learn from experience and you should let them make
01:00:31.500 mistakes yes in general that is true but there are certain things such as let's
01:00:36.680 say alcohol heroin prostitution gambling where we say you know what my 11 year
01:00:42.600 old is not ready for that you know maybe when she's 16 or 18 well obviously
01:00:46.420 not prostitution but the point is that there are certain things that an
01:00:50.060 adolescent brain is just not ready for right and what I found from
01:00:54.340 speaking with a lot of middle school and high school kids is I ask them all right
01:00:58.820 so you know how many of you have been shamed on social media okay all hands go up
01:01:02.160 and I say now how many of you think that being shamed on social media
01:01:04.880 toughens you that is you go through it you say you know what I don't care what
01:01:08.760 people think of me you know I've been shamed so many times I don't care
01:01:11.280 anymore no hands go up how many of you think it makes you more cautious more
01:01:14.940 fearful you double check and triple check yourself you're not authentic most
01:01:19.100 hands go up so there's something about public shaming and exposure that is
01:01:23.880 especially unhealthy for middle school kids and especially for girls so I'm not
01:01:28.400 saying you know it's a losing battle to keep it out of high school but look the
01:01:31.800 minimum age you have to be 13 to get an account but by fifth or sixth grade most of
01:01:35.980 the girls have it in many schools and that is something that I'm really trying
01:01:39.720 to change especially as there is now evidence that social media is particularly
01:01:44.660 bad for girls now the Millennials weren't harmed by it they didn't get this
01:01:48.280 until they were in their 20s but I suspect that middle school is the place to
01:01:52.120 focus I think we really need to try to get social media out of middle school and
01:01:55.520 that would solve your second question because yes if it's only your kid you
01:01:59.320 know when I kept my son off of video games he did feel excluded because the other
01:02:03.000 boys were all on it all day long so it has to be done systemically and that's
01:02:06.580 why I think middle school is the place to focus if anybody's listening to this
01:02:10.020 has any influence over middle school try to get a school-wide policy that
01:02:14.460 discourages anyone from
01:02:17.980 having a social media account until they get to high school wait till they're
01:02:21.380 14 or so wait till they're in high school but you know middle school is so
01:02:23.980 hard already especially on girls so don't make it harder so now let's pivot to
01:02:30.000 topics which you know on their face may seem impressively detached from the
01:02:35.860 current concerns but not really I mean I want to talk about human well-being and
01:02:41.280 experiences of the positive end of the spectrum of human psychology and how we
01:02:47.300 conceptualize this terrain and this is if anything an interest in this has been
01:02:54.200 heightened by our current circumstance because so many people have been
01:02:58.700 forced into something that impressively resembles a kind of retreat right I
mean people are experiencing solitude to a degree that is not normal for
01:03:10.360 them and for most of us there's been a forced reprioritization of values we have
01:03:18.380 a vantage point from which to see how we've been living all these years and the
01:03:22.520 kinds of things that have captivated our attention and much of that has been
01:03:27.120 stripped away or at least shuffled to a degree that many people are experiencing
01:03:32.760 even a silver lining to this quarantine because they're experiencing better time
01:03:39.080 with their families in many cases or this heightened sense of uncertainty the sense
01:03:43.940 that really anything can happen at any time and that's always been true right but we
01:03:50.780 live most of our lives as though we take a lot for granted and taking those things for
01:03:55.460 granted amounts to a kind of death denial and a sense of control that has never really
been factual so there's a lot to motivate a conversation about things like
01:04:08.500 meditation and psychedelics and what they can reveal about the nature of the self and
experiences of self-transcendence so um let's dive into the deep end of the pool john
yep perhaps to start give me a sense of your background here i know you spent some time in
india at some point either in graduate school or as a postdoc but remind me how you came to be
01:04:32.200 interested in these topics sure so you know because i study morality i've been interested in moral
transformations you get that from religious experiences william james's book the varieties of religious
experience is full of all these sudden moral rebirths from an encounter with god so i've always
01:04:52.140 been interested in these self-transcendent experiences and their capacity to change
people's moral nature but actually there's a very personal reason for it and i've been
01:05:01.300 looking forward to talking about this with you because you've been out on this for a long time
01:05:05.800 if you'd like to continue listening to this podcast you'll need to subscribe at samharris.org
01:05:15.260 you'll get access to all full-length episodes of the making sense podcast and to other subscriber
01:05:20.340 only content including bonus episodes and amas and the conversations i've been having on the waking
01:05:25.920 up app the making sense podcast is ad-free and relies entirely on listener support and you can
01:05:32.040 subscribe now at samharris.org