Making Sense - Sam Harris - October 17, 2017


#101 — Defending the Republic


Episode Stats

Length

33 minutes

Words per Minute

151.0

Word Count

5,080

Sentence Count

299

Misogynist Sentences

1

Hate Speech Sentences

1


Summary

Cass Sunstein is the Robert Walmsley University Professor at Harvard Law School, where he is the founder and director of the Program on Behavioral Economics and Public Policy. He is by far the most cited law professor in the United States. From 2009 to 2012, Cass served in the Obama administration as administrator of the White House Office of Information and Regulatory Affairs. He has testified before congressional committees, been involved in constitution-making and law reform activities in a number of countries, and written many articles and books, including Nudge with Richard Thaler. He has also written books on his own, two of which are under discussion today: #Republic: Divided Democracy in the Age of Social Media and the forthcoming Impeachment: A Citizen's Guide. In this episode, we talk about the polarization and fragmentation of American society, choice architecture, the importance of face-to-face interactions for problem-solving, group polarization and identity politics, the much-vaunted wisdom of crowds, the possibility of ever having a direct democracy, the process of impeachment, and other topics. As I say at the end, I found this conversation truly educational, and I hope you find listening to Cass as valuable as I did.


Transcript

00:00:00.000 Welcome to the Making Sense Podcast.
00:00:08.820 This is Sam Harris.
00:00:10.880 Just a note to say that if you're hearing this, you are not currently on our subscriber
00:00:14.680 feed and will only be hearing the first part of this conversation.
00:00:18.420 In order to access full episodes of the Making Sense Podcast, you'll need to subscribe at
00:00:22.720 samharris.org.
00:00:24.060 There you'll find our private RSS feed to add to your favorite podcatcher, along with
00:00:28.360 other subscriber-only content.
00:00:30.520 We don't run ads on the podcast, and therefore it's made possible entirely through the support
00:00:34.640 of our subscribers.
00:00:35.880 So if you enjoy what we're doing here, please consider becoming one.
00:00:46.820 Today I'm speaking with Cass Sunstein.
00:00:50.080 Cass is the Robert Walmsley University Professor at Harvard Law School, where he's the founder
00:00:55.120 and director of the Program on Behavioral Economics and Public Policy.
00:00:59.760 He is by far the most cited law professor in the United States.
00:01:04.100 Amazing.
00:01:05.580 From 2009 to 2012, he served in the Obama administration as administrator of the White House Office
00:01:12.460 of Information and Regulatory Affairs.
00:01:15.140 He has testified before congressional committees, been involved in constitution-making and law
00:01:20.400 reform activities in a number of countries, and written many articles and books, including
00:01:26.180 Nudge with Richard Thaler.
00:01:29.240 And Thaler has actually won the Nobel Prize in Economics since we recorded this podcast.
00:01:34.320 And he's written other books on his own, two of which are under discussion today.
00:01:37.560 The first is #Republic: Divided Democracy in the Age of Social Media, and the second is
00:01:45.420 the forthcoming Impeachment: A Citizen's Guide.
00:01:48.880 And Cass and I talk about the polarization and fragmentation of American society.
00:01:55.320 We talk about choice architecture, the importance of face-to-face interactions for problem-solving,
00:02:02.860 group polarization and identity politics, virtuous forms of extremism, the much-vaunted wisdom
00:02:09.880 of crowds, the possibility of ever having a direct democracy, rational limits on free
00:02:16.720 speech, the process of presidential impeachment, and other topics.
00:02:22.920 As I say at the end, I found this conversation truly educational.
00:02:27.600 There was a lot I didn't understand about impeachment in particular until today.
00:02:33.340 And I hope you find listening to Cass as valuable as I did.
00:02:36.020 And I now bring you, Cass Sunstein.
00:02:45.380 I am here with Cass Sunstein.
00:02:47.500 Cass, thanks for coming on the podcast.
00:02:49.660 Thank you so much.
00:02:50.360 Great to be here.
00:02:51.640 Before we jump into your book, there are actually two books I want to discuss.
00:02:57.880 I'll start with #Republic.
00:03:00.600 But perhaps you can just summarize your background.
00:03:03.480 You have a very diverse background.
00:03:05.960 You've touched a lot of topics.
00:03:07.580 You've served in government.
00:03:08.880 How do you view your decades of work at this point?
00:03:13.320 Work in progress.
00:03:14.940 So I started as a law professor at the University of Chicago after a short stint at the Justice
00:03:22.200 Department.
00:03:23.020 And I guess before that, I had clerked for a couple of judges, including Thurgood Marshall.
00:03:27.520 I was at Chicago for many years.
00:03:30.760 I did some work connected with governments, including our own.
00:03:35.100 But I didn't leave the academy until Barack Obama became president.
00:03:39.980 And then I worked in the White House, helping to oversee government regulation for about four
00:03:45.820 years.
00:03:46.580 After that, I left to be an academic again at Harvard.
00:03:52.360 The president asked me to be on his group on surveillance and national security.
00:03:59.960 I did that for approximately a year.
00:04:02.580 That was a part-time job as I was teaching.
00:04:05.680 And after that, the defense department asked me to be on the Defense Innovation Board, which
00:04:11.600 I was on up until quite recently, which works on the subject of national defense and innovation.
00:04:18.440 Since I left government as a full-time job, that was in late 2012, I've worked with our
00:04:26.120 government and various other governments on strategies that can promote health and increase
00:04:32.000 safety and maybe help employment go up and poverty go down, environmental protection and
00:04:39.400 issues of that kind.
00:04:40.460 And you and I have never met, but we have an editor in common.
00:04:44.260 We've got Thomas Labine linking us.
00:04:46.580 He's edited two of my books.
00:04:48.460 I think he's edited maybe more of yours.
00:04:50.860 He is fantastic.
00:04:52.220 Yeah, he really is.
00:04:53.240 Thomas is an amazing person and a kind person and also a brilliant editor with real creativity
00:05:00.700 about how to make things go, at least in my experience, in better directions.
00:05:04.860 Yeah, yeah.
00:05:05.800 Well, it's great to meet you virtually here.
00:05:07.840 I want to focus on Republic first and mostly, because it's so timely.
00:05:15.660 I mean, the other book I want to touch on at the end, Impeachment, is also super timely.
00:05:20.300 But when did you actually write Republic?
00:05:22.800 It came out earlier this year, but when were you actually writing it?
00:05:26.800 I was working on it for approximately, I'd say, 18 months before the election.
00:05:34.040 So I actually finished it right about the time of President Trump's victory.
00:05:40.900 But I'd been thinking hard about it for probably two years before then.
00:05:47.160 It actually builds on earlier work on the question of polarization.
00:05:50.920 But my own experience in Washington and my own observation, I guess, since leaving Washington kind of fixed me on the issue of polarization and mutual misunderstanding.
00:06:06.740 And that was kind of a 2014-2015 obsession.
00:06:12.240 So you analyze the forces that are leading to increased fragmentation and polarization in our society.
00:06:21.940 And it's fragmentation and polarization of all kinds.
00:06:25.940 It's political, which I'm sure we'll focus on, but it's also moral and intellectual and religious.
00:06:31.100 I mean, every way in which belief systems can be segregated, there's a similar dynamic to what's happening there.
00:06:37.220 The most natural place to start here is with a few of the concepts you introduced very early in the book.
00:06:43.360 And the first is the concept of choice architecture.
00:06:47.120 And this can be summarized with Nicholas Negroponte's rather dystopian idea of what's called the daily me.
00:06:56.600 And many of our listeners will not have heard of either of these phrases while actually living with their implications every day.
00:07:02.640 So perhaps you can explain choice architecture and its consequences.
00:07:08.680 Dick Thaler, an economist, and I worked on a book called Nudge a few years ago.
00:07:14.060 And one of the driving ideas is if you're in a grocery store or in a cafeteria or on a website, there's actually an architecture for choosing.
00:07:23.460 So a grocery store might have Pepsi there, or it might instead have Diet Pepsi there, or it might have carrots, or it might have chocolate bars.
00:07:36.680 And what's at the checkout counter actually matters in terms of what people purchase.
00:07:43.140 And what's proximate and visible, what's at eye level, actually matters in terms of what people end up stocking in their refrigerator.
00:07:51.040 So too, in a cafeteria, it could have a choice architecture that favors meat, or fish, or vegetables, or brownies.
00:08:01.920 And if the designer is smart, they will create an architecture that's good for you, or good for them, or hopefully good for both you and them.
00:08:13.260 A website designer who is alert to the importance of particular choices will know that if you put things in big font and colorful letters, and you make it really simple, you can attract the eye.
00:08:28.780 And the thing that's first on a list is likely to be what people will purchase.
00:08:34.460 And that is a clue for someone who's trying to sell clothes or books.
00:08:39.480 And so choice architecture is really everywhere.
00:08:43.700 A rental car company is engaged in choice architecture.
00:08:47.120 The U.S. government is engaged in choice architecture.
00:08:49.980 So is the state where you live, and so are some of the great forces in society, and so are some of the less great forces in society.
00:08:58.700 Sometimes we're choice architects when we don't really even think about it.
00:09:02.080 So a teacher of kids is doing choice architecture all the time.
00:09:06.560 So is a doctor who is saying, you know, we have five options for you.
00:09:10.040 Here's the first.
00:09:11.160 And saying that will often bias the decision.
00:09:15.000 Parents are definitely choice architects.
00:09:17.380 I have two small kids, and one's five, the other is eight.
00:09:20.820 And they are extremely effective choice architects with respect to their father.
00:09:25.200 They know how to design situations
00:09:27.240 so that I end up doing what they want.
00:09:28.060 They're running in the wrong direction.
00:09:30.100 Completely.
00:09:30.840 And I'll run after them, and then we're going in the direction they want.
00:09:34.240 So for media, a social media platform like Facebook, a newspaper like, let's say, the Wall Street Journal, and a network are all choice architects.
00:09:50.300 Facebook, let's take as an example, can have a news feed that has one kind of default choices.
00:09:58.960 So it can say, you know, we know from your choices and from our algorithm, this is the kind of thing that you look at.
00:10:05.340 Or we know that this is the kind of thing that most people look at, and it can feed you stuff that fits with your own political convictions, or it can feed you stuff that fits kind of with what the median person in your state likes, or it can feed you stuff that is serendipitous and diverse.
00:10:25.180 And that's also true of the local newspaper, which can say, I'll provide you, you know, an assortment of things that it's probably good for someone in your area to know about, or which can think, you know, most of our readers, they are right of center, they are left of center, or we are right of center and left of center, and we're going to provide that perspective.
00:10:46.180 So that's choice architecture, too.
00:10:48.940 So choice architecture is everywhere.
00:10:50.840 The term kind of suggests, I hope, helpfully, that the architecture that we often just take as kind of a background fact or furniture of life often will have big effects on what we end up selecting.
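Sunstein's point that what comes first on a list tends to be what people buy can be made concrete with a toy simulation. This is not from the book; the click probabilities below are invented for illustration, with attention halving at each lower slot:

```python
import random

def simulate_purchases(items, trials=10_000, seed=42):
    """Toy position-bias model: shoppers scan the list top to bottom
    and buy the first item they engage with. Engagement probability is
    0.5 at slot 1 and halves at each later slot, so slot order alone
    drives sales."""
    rng = random.Random(seed)
    sales = {item: 0 for item in items}
    for _ in range(trials):
        for slot, item in enumerate(items):
            if rng.random() < 0.5 ** (slot + 1):
                sales[item] += 1
                break  # one purchase per shopper at most
    return sales

# Same products, two different choice architectures:
print(simulate_purchases(["carrots", "chocolate", "soda"]))
print(simulate_purchases(["soda", "chocolate", "carrots"]))
```

Whichever item sits in the top slot outsells the rest by a wide margin, even though nothing about the products themselves changed.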
00:11:05.800 The idea of Daily Me, which comes from the farsighted Nicholas Negroponte, actually comes from a pretty long time ago, the 1990s.
00:11:15.000 And he said, you know, we're not going to have TV stations and general interest newspapers so much in the future.
00:11:23.920 We're going to have a system where every person can design his or her Daily Me.
00:11:30.080 So if your name is Bob, you can create the Daily Bob.
00:11:34.260 And suppose what you're most interested in is Star Wars, that can highlight Star Wars on your screen.
00:11:41.360 And suppose what you really think is the problem of unlawful immigration is spiraling out of control.
00:11:49.120 Those are your political focus.
00:11:51.740 Your Daily Me can tell you both about the immigration problem, and it can talk about it in a way that you find congenial rather than silly or unhelpful.
00:12:02.880 And so the Daily Bob can look one way, the Daily Mary can look a completely different way.
00:12:08.680 And in a way, Silicon Valley has, for a long time, seen this power of personalized choice architecture, let's call it, as the ideal.
00:12:18.580 That's heaven.
00:12:19.900 That's democracy.
00:12:21.220 That's freedom.
00:12:22.360 So that each of us gets a news diet that isn't what anyone else thinks we need, but is specifically based on our own values and tastes.
00:12:30.780 And obviously, the incentives for these companies run in the direction of more perfectly fulfilling this formula.
00:12:40.940 And so what we have essentially is an arms race for people's attention.
00:12:45.900 And the Daily Me version of things one would expect would be stickier.
00:12:52.340 You know, if you can deliver me information which I find captivating, and the algorithm keeps prioritizing that, and I can be captivated by outrage, I can be captivated by desire, you know, the ads get better and better at actually delivering me things that I will be tempted to buy.
00:13:12.760 You know, the incentives seem aligned to fulfill Negroponte's idea.
00:13:19.120 It's a very, what is it, promising or seductive business model, either for a startup or for an established company.
00:13:31.680 A few years ago, I was traveling a lot, and I found myself getting on my Facebook page a lot of luggage ads.
00:13:39.720 And how'd they know?
00:13:41.100 I was in the market for luggage.
00:13:42.760 They figured it out.
00:13:44.000 They were right.
00:13:45.200 And I was much more likely to click on luggage ads than I would be to click on ads for, let's say, sneakers.
00:13:53.280 And they knew that.
00:13:54.880 And that is in their economic interest.
00:13:57.000 So if you're a political candidate, let's say, who wants to win a particular state, if you know these people will like me better if I suggest that I'm with them on this issue, or those people are likely to give me money if they see my face over the following five words, then you can get very, very precise.
00:14:22.300 And as you say, that's the incentive of someone who wants to win an election.
00:14:27.880 So both for companies and for political aspirants, and it looks like the Russian government, by the way, knew about this in their role in the U.S. 2016 election, there is a business model that suggests that building on the idea of the daily me is a good one.
00:14:48.740 Now, there are some reasons to think that people are a little more complicated than that, and there may be other business models that are as good or better, but certainly we can call it the business model of the aughts, meaning the first decade of the 21st century.
00:15:06.260 The daily me model has been all the rage.
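The feedback loop behind the Daily Me can be sketched in a few lines. This is a hypothetical feed ranker, not any platform's actual algorithm: it simply ranks items by how often their topic appears in the user's click history, and the user clicks within what they are shown.

```python
import random
from collections import Counter

def daily_me_feed(catalog, click_history, k=3):
    """Rank the catalog by how often each item's topic appears in the
    user's click history -- a crude stand-in for an engagement-driven
    recommender."""
    topic_counts = Counter(item["topic"] for item in click_history)
    return sorted(catalog, key=lambda item: -topic_counts[item["topic"]])[:k]

random.seed(0)
topics = ["politics", "sports", "soccer", "immigration", "food"]
catalog = [{"id": i, "topic": t} for i, t in enumerate(topics * 4)]

# A reader who starts with a single political click...
history = [{"id": 0, "topic": "politics"}]
for _ in range(10):
    feed = daily_me_feed(catalog, history)
    history.append(random.choice(feed))  # the reader clicks within the feed

# ...ends up with a history, and hence a feed, of nothing but politics.
print(Counter(item["topic"] for item in history))
```

One early click tilts the ranking, the tilted ranking constrains the next click, and the loop closes: the catalog's other four topics never surface again.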
00:15:08.780 So in opposition to this, you discuss the value of what you call serendipity, you've already used the word, and also irritation and shared experience.
00:15:21.240 Describe what you mean by those three things.
00:15:24.120 Okay, so there are three very different ideas.
00:15:26.720 The idea of serendipity means that in a great city, large or small, it may be that when you go around the corner, you'll see a Lebanese restaurant or a sports event that you didn't know you had any interest in.
00:15:44.460 Maybe the sports event is soccer, and you thought you were bored by soccer, but whoa, those kids are pretty good.
00:15:52.100 Or the Lebanese restaurant you never tried, but looks interesting.
00:15:57.900 And you might see, some of the times, a political protest about something.
00:16:04.500 You know, it might be Black Lives Matter, or it might be abortion is murder.
00:16:07.940 And you might learn from that both what your fellow citizens think, and you might also think to yourself for a moment, maybe more than a moment, gosh, I didn't know that that was on the view screen of my neighbors, and maybe they have a point.
00:16:25.860 And that can change your mind, it can change your life, and can certainly broaden your horizon.
00:16:30.360 So the idea of serendipity is that a good choice architecture, let's call it, for communications has a lot of surprises in it.
00:16:40.120 And, you know, some of these words aren't the most familiar.
00:16:43.720 Choice architecture probably has too many vowels or something.
00:16:50.020 The word architecture has a lot of them.
00:16:51.500 But it's pretty familiar that if you go on a news station or a talk show that interests you, you'll see stuff that you never would have specifically said you want to hear about that.
00:17:08.900 But it could be stuff that will be the most important thing you see that week or hear that week.
00:17:14.840 So you might hear something about a problem or an initiative in India, and while you thought you had no interest in India, the problem is something that alarms you, or the initiative is something that you think, gosh, the whole world can learn from that, maybe benefit from that.
00:17:33.960 And that can be, you know, super important for people.
00:17:37.780 And I think in individual lives, if you think about what your job is, or what you're reading, or who you're married to, or who your friends are, or how you got to do the thing you're doing a few hours from now, chances are there's some serendipity that played a huge role.
00:17:52.700 And what I'm urging is that in a democracy, serendipity is a frequent and great force for, what is it, broadening, and also togetherness in a non-cornball, non-Hallmark card way.
00:18:14.020 So that's serendipity.
00:18:15.540 Now let's talk about irritation.
00:18:16.920 It may be that if you were reading, let's say, a good daily newspaper, you'll encounter an editorial or a column that thinks exactly the opposite of what you think.
00:18:30.640 So you might be inclined to think, you know, I like a $15 minimum wage.
00:18:34.900 I think Senator Sanders is on the right track in calling for that.
00:18:38.720 But then you might read something that says a $15 minimum wage is actually going to be terrible for the economy, and it's going to be very harmful for people at the bottom of the economic ladder, because employers aren't going to hire as many people: they'd hire someone at $10 an hour, but not at $15, and so the minimum wage actually hurts the poor.
00:19:00.100 If you like the minimum wage, reading that is very irritating.
00:19:02.860 It's not congenial, but you might think, oh, maybe that's true.
00:19:09.300 Or if you think that, you know, President Trump is on the right track on issue A, B, or C, and then you read something that suggests he's all wrong, it might be very irritating, but it might move you.
00:19:24.460 So irritation can be good.
00:19:26.400 Also, meaning it's productive of learning and kind of understanding what our fellow citizens think.
00:19:31.760 And you might also learn something that will produce a shared experience.
00:19:37.200 So the Super Bowl, or the response to something great, like a celebration of something very good that's happened in a community, or July 4th, which I would actually single out because of how it brings Americans together.
00:19:52.640 It's a shared experience, and that can create some social glue for people who might otherwise think of one another as, you know, occupants of the same country, but kind of enemies.
00:20:06.020 To be clear, it's not just the information bubble we're talking about.
00:20:10.040 We're talking about a lack of a face-to-face interaction in many cases.
00:20:14.200 So there's the whole phenomenon of telecommuting.
00:20:17.360 In fact, you and I are telecommuting right now to do this interview.
00:20:20.700 And so this interview is made possible by this great technology, but this technology is enabling me, in this case, to do the vast majority of my interviews remotely in this way.
00:20:31.420 And that enables me to make it very easy for someone like you to come on the show, but it also comes at a kind of cost.
00:20:40.340 And I remember I recently met a CEO of a very big company, and it's now, you know, a multibillion-dollar company with thousands of employees.
00:20:50.540 And something like 99% of them work from home or work from a Starbucks.
00:20:56.560 I mean, there's like, you know, 15 people in an office in San Francisco, and everyone else is elsewhere.
00:21:02.900 So there's more to it than just the media streams we will emphasize here.
00:21:08.360 I think you're making a great point, and it's not emphasized enough in my book.
00:21:12.860 And I learned it, actually, very concretely in the government, where sometimes if you're trying to solve a problem and, you know, you're in one building and people are half an hour away, you send them an email.
00:21:27.780 And the email doesn't have the right tone or will not be read as having the right tone.
00:21:33.540 And in any case, problem-solving is harder if you're using text sometimes than if you're actually looking at someone's face.
00:21:44.400 And a phone, or, you know, any technology that involves voices, can be better than email, as you're saying, because it's more personal.
00:21:57.700 But I found in government and after, if you really want to solve a problem, often the best thing is to get a group of people in the room.
00:22:08.540 And we decreasingly rely on that.
00:22:11.480 And one reason it's good is that you can understand different perspectives better if you see people's faces, and they'll understand yours.
00:22:19.800 And another thing is that it softens some of the interactions.
00:22:24.160 And another thing is that, and I don't know of data on this, but I bet there either is some or there will be some, that creativity grows because the sparks fly.
00:22:36.800 And we've all seen that where, you know, in an office or in a family even, if people are all talking to each other face-to-face, something new will come out that couldn't have been produced by email or even by phone.
00:22:50.720 Yeah, and all the communication at that point isn't merely transactional in the way that it is when you're sending an email.
00:22:57.900 I think this is just now a ubiquitous experience that everyone will be very familiar with, where you're having some communication that's growing increasingly fraught by email.
00:23:07.300 And if you're wise, you will realize that the medium is very likely contributing to the problem.
00:23:15.600 I mean, the tones are getting misread and everything is sounding sharper than you intend.
00:23:20.860 Or, in fact, you just become a slightly different person behind the keyboard.
00:23:24.900 I mean, the maniac comes out a little bit and your response to another person isn't being modulated by a face-to-face encounter or even by being able to hear the humanity and their tone of voice on the phone.
00:23:40.240 I'm sure other people have had this experience, but I've had exchanges with people by email that just either, you know, aren't working or they're just kind of strangely unpleasant.
00:23:50.220 And then you get on the phone with the person or you see them face-to-face and there's this sort of shock of recognition that, oh, okay, this is that person, right?
00:24:00.700 I mean, this person has a different shape in your experience than the shape they had acquired in the back and forth by email.
00:24:09.860 And it does kind of break a spell, which was defining the communication and defining it almost invariably in an unproductive way.
00:24:21.220 Completely.
00:24:22.080 No, I think that's a deep point.
00:24:24.320 And it has implications for a zillion different things, including political polarization.
00:24:30.120 So I had a friend in the government with whom I was frequently at odds on what to do, and she just developed a habit of saying, call me.
00:24:41.000 Right, yeah.
00:24:41.560 So I'd write her an email and she'd say, call me, and we became great friends.
00:24:45.080 And in D.C., you know, I was working for a Democratic president, and the Republicans were often not happy with what President Obama was doing.
00:24:55.420 But I learned so much from Republicans in the House and the Senate from face-to-face conversations where not seeing some email, they didn't write a lot of emails, but they would write a lot of letters, which would be harsh or accusatory or something.
00:25:12.620 But they'd actually have often very good ideas.
00:25:16.480 And when it's in person, then you're not, you know, fighting over who's bad.
00:25:21.940 You're thinking, what's the best way to solve a problem?
00:25:24.380 And that, what you're saying about, you know, human interactions, either in business or in families or whatever, it's basically a national thing.
00:25:36.960 So what I was thinking as you were talking is the harsher or misread stuff that we all experience, in a way our political process is facing that.
00:25:49.280 And that's a challenge for, let's say, infrastructure.
00:25:54.380 Yeah.
00:25:55.240 It's just a random problem.
00:25:57.240 Right.
00:25:57.680 But to be clear, you're not exactly nostalgic for some former time where you think communication was better across the board.
00:26:06.300 You actually believe, I think you say this in the book, that things are just simply better now than they were in the past.
00:26:13.620 It's a matter of fixing problems now, but there is no, there's no place you would point the Wayback Machine if you could.
00:26:20.240 Completely.
00:26:21.240 So if you ask me when communications in America was the best it's ever been, I would say today.
00:26:29.760 And if it's next week, probably next week's going to be a little better than today.
00:26:33.740 Just because, you know, if you can talk to each other across, you know, very long distances, that's a huge improvement.
00:26:43.440 And if you can learn stuff just at a click about something you care about that may really affect your life, that can be, you know, incalculably great.
00:26:43.440 But one way to think of it is the cell phone is a fantastic advance, but if you're using it while you're driving, the chance of you being in an accident is higher than it would be if you weren't using it while driving.
00:27:08.180 And the technology is great, but to optimize, let's say, the, you know, the benefits of its existence would be a good idea.
00:27:18.020 And we're not nearly there yet with respect to social media.
00:27:23.380 Well, I want to talk about social media and how Twitter and Facebook have been behaving themselves.
00:27:28.520 But before we get there, I think we should talk about the phenomenon of hyperpolarization in groups.
00:27:35.700 And this is a general phenomenon that you describe in the book where like-minded people become more extreme once they begin associating with one another.
00:27:46.460 And this is, it may sound paradoxical on its face, but it really functions by dynamics that are fairly easy to understand.
00:27:57.460 Perhaps you should explain, maybe the Colorado study is the place to start here, but talk about what happens in groups among the like-minded.
00:28:05.580 Okay, so what we did in Colorado was to get a bunch of people in Boulder, which is left of center, together to talk about climate change, affirmative action, and same-sex unions.
00:28:19.580 We asked them for their views privately and anonymously.
00:28:24.020 Then we had them discuss the issues together and come to a verdict.
00:28:27.840 And then we asked them to record their views privately and anonymously.
00:28:31.680 And there was reason to expect that if you got a group of people together, they'd end up coming to the middle of what the group members privately thought, and that would be their verdict, and then they'd all be in the middle.
00:28:46.620 But that's not what happened.
00:28:48.360 They were kind of to the left on all three issues.
00:28:51.640 They went way to the left on all three issues as a result of talking to each other.
00:28:56.560 So the left of center people in Boulder had some diversity on climate change and affirmative action before they talked to each other.
00:29:05.160 After they talked to each other, they were more extreme, they were more confident, and they were pretty well unified on all of those issues.
00:29:13.180 This isn't just a left of center phenomenon.
00:29:15.520 We did the same thing in Colorado Springs, which is right of center.
00:29:18.720 And as the people in Boulder went whoosh to the left, the people in Colorado Springs went whoosh to the right.
00:29:25.940 And it's just because they were talking with like-minded others.
00:29:29.200 So the basic rule is that usually people who are inclined in a certain direction end up, after talking to each other, thinking a more extreme version of what they thought before they started to talk.
00:29:42.700 And we can explain, I think, why sometimes in primaries, both of our political parties go left and go right, has something to do with this.
00:29:52.560 Why within cults, people end up sometimes getting all extreme.
00:29:58.020 That's often the phenomenon of group polarization, as it's called.
00:30:02.040 Why terrorists often get radicalized.
00:30:06.020 And also why people who, you know, do great things like attack extreme injustice, why they get radicalized, because they're all talking to each other.
00:30:17.580 And you say that the mechanisms are pretty intuitive.
00:30:21.640 I think you're completely right that the leading one is if you have a group of people who think, let's say, that the minimum wage should be raised.
00:30:29.540 That's what they tend to think.
00:30:30.700 Some of them aren't sure.
00:30:31.980 They're talking with each other.
00:30:33.100 They'll hear a lot of arguments about why the minimum wage should be raised, because that's what most of them think.
00:30:39.620 And they won't hear a lot of the arguments the other way.
00:30:42.220 And the arguments that they hear will be kind of tentative, as well as few.
00:30:45.980 And then if they're listening to each other, after they've heard all the arguments, they'll think, oh, minimum wage really should be raised a lot.
00:30:53.540 And it's just because of the arguments they're hearing.
00:30:55.740 And if you have a group of people who tend to think the minimum wage should not be raised, exactly the mirror image of what I've described will happen.
00:31:04.020 And I'm smiling as I talk, because we actually taped our conversations in Colorado.
00:31:10.540 And so I've seen them.
00:31:11.680 And in real time, you can completely see the process where the people on the right are going more right because they're talking to people who think conservative thoughts.
00:31:24.120 And the conservative thoughts are going to look numerous and excellent.
00:31:27.280 And the disagreement will seem rare and kind of stupid.
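The argument-pool mechanism Sunstein describes here can be sketched as a toy model. The update rule and numbers below are illustrative assumptions, not the Colorado study's methodology: each round of discussion pulls members toward the group mean, amplified by how one-sided the pool of arguments is.

```python
def discuss(opinions, gain=0.2, rounds=3):
    """Toy persuasive-arguments model of group polarization: each round,
    members converge on the group mean, amplified by the fraction of
    members on the majority side (the one-sidedness of the argument
    pool). Opinions are clamped to the -1..1 scale."""
    for _ in range(rounds):
        mean = sum(opinions) / len(opinions)
        one_sided = sum(1 for o in opinions if o * mean > 0) / len(opinions)
        opinions = [max(-1.0, min(1.0, mean * (1 + gain * one_sided)))
                    for _ in opinions]
    return opinions

# A mildly left-leaning group (positive = further left, on a -1..1 scale)
boulder = [0.2, 0.4, 0.1, 0.5, 0.3]
after = discuss(boulder)
print(after)  # every member ends near 0.52, past the starting mean of 0.3
```

Because everyone starts on the same side, the amplification factor stays at its maximum, and the group ends up unified, more confident, and more extreme than its own pre-discussion average, which is the pattern observed in Boulder and, mirrored, in Colorado Springs.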
00:31:32.840 There's also this phenomenon of reputation management within the group, where you have your concern for how you're appearing in this group that's now getting constellated around a consensus.
00:31:45.180 And that will tend to filter out any expressions of doubt about this forming consensus.
00:31:51.680 And it functions as a kind of attractor state for convergence.
00:31:55.640 You're completely right.
00:31:57.740 So in Boulder, our left of center groups, you can see them talking about climate change, whether the U.S. should sign an international agreement.
00:32:06.360 And some of the people who are left of center were a little nervous about that.
00:32:10.660 They thought, you know, I don't know what would happen to American sovereignty if we yielded to an international agreement.
00:32:17.800 They just weren't sure.
00:32:18.900 But you could see them looking at their fellow citizens of Boulder and thinking, oh, man, if I say that, I'm going to look really bad in their eyes.
00:32:27.980 So I'm just going to agree with them.
00:32:30.500 This strikes me as yet another argument against identity politics.
00:32:34.940 Do you see a connection there?
00:32:36.780 I think exactly right.
00:32:38.680 With identity politics, we're getting the same thing.
00:32:42.680 If you'd like to continue listening to this conversation, you'll need to subscribe at SamHarris.org.
00:32:48.560 Once you do, you'll get access to all full-length episodes of the Making Sense podcast, along with other subscriber-only content, including bonus episodes and AMAs and the conversations I've been having on the Waking Up app.
00:33:00.780 The Making Sense podcast is ad-free and relies entirely on listener support.
00:33:05.160 And you can subscribe now at SamHarris.org.