Making Sense - Sam Harris - August 16, 2017


#92 — The Limits of Persuasion


Episode Stats

Length

59 minutes

Words per Minute

163.48497

Word Count

9,797

Sentence Count

486

Misogynist Sentences

5

Hate Speech Sentences

10


Summary

In this episode, Dr. David Pizarro and Dr. Tamler Sommers, co-hosts of the Very Bad Wizards podcast, join me to take questions submitted on Twitter. David is a professor of psychology at Cornell University who studies moral judgment and the emotion of disgust; Tamler is a professor of philosophy at the University of Houston who works on ethics, free will, moral responsibility, punishment, and honor. We discuss whether the "moral panic" over free speech on American college campuses is as widespread as it is often portrayed, their own experiences as professors with a fairly edgy podcast, the "conceptual penis" hoax, a post-mortem of my podcast with Scott Adams, the limits of mockery and moral persuasion, and meditation and the sense in which the self may or may not be an illusion. We don't run ads on the podcast, and therefore it's made possible entirely through the support of our subscribers. So if you enjoy what we're doing here, please consider becoming one; subscribers get access to full-length episodes of the Making Sense Podcast at samharris.org.


Transcript

00:00:00.000 Welcome to the Making Sense Podcast.
00:00:08.820 This is Sam Harris.
00:00:10.880 Just a note to say that if you're hearing this, you are not currently on our subscriber
00:00:14.680 feed and will only be hearing the first part of this conversation.
00:00:18.420 In order to access full episodes of the Making Sense Podcast, you'll need to subscribe at
00:00:22.720 samharris.org.
00:00:24.060 There you'll find our private RSS feed to add to your favorite podcatcher, along with
00:00:28.360 other subscriber-only content.
00:00:30.520 We don't run ads on the podcast, and therefore it's made possible entirely through the support
00:00:34.640 of our subscribers.
00:00:35.880 So if you enjoy what we're doing here, please consider becoming one.
00:00:46.380 Today I am speaking with the Very Bad Wizards, David Pizarro and Tamler Sommers.
00:00:52.100 They have a podcast by that name, which I've been on, I think, twice.
00:00:57.700 We debated free will at great length, so if you're interested in that topic, you can listen
00:01:03.820 to us there, and I recommend you listen to their podcast.
00:01:07.020 They touch fascinating subjects and in quite the irreverent way, and they do fantastic movie
00:01:14.480 reviews as well.
00:01:16.680 David Pizarro is a professor of psychology at Cornell.
00:01:20.060 He focuses on morality and moral judgment and the emotion of disgust, and needless to say,
00:01:29.520 all of that is incredibly relevant to this time and any other.
00:01:34.880 And his partner in crime, Tamler Sommers, is a professor of philosophy at the University
00:01:39.400 of Houston.
00:01:40.340 And he focuses primarily on ethics and political philosophy and the philosophy of law.
00:01:45.860 And he specializes in topics like free will and moral responsibility, punishment, revenge,
00:01:54.020 honor.
00:01:55.020 Again, fascinating and all too relevant.
00:01:58.780 In this podcast, we essentially took questions from Twitter.
00:02:03.760 People had heard us on the Very Bad Wizards podcast and had topics they wanted us to address.
00:02:09.980 We talk about free speech on campus.
00:02:13.080 We do a fairly long post-mortem on my podcast with Scott Adams.
00:02:18.200 So if you haven't heard that, you might listen to that first.
00:02:22.400 Otherwise, feel free to skip ahead, especially if you're sick to death of hearing me talk about
00:02:26.920 Trump.
00:02:28.400 We talk about moral persuasion.
00:02:31.360 And then we get into things like meditation and the sense in which the self may or may not
00:02:38.460 be an illusion.
00:02:39.960 Again, I encourage you to subscribe to their podcast because they are quite good.
00:02:46.140 And now I bring you the Very Bad Wizards.
00:02:54.860 I am here with the Very Bad Wizards.
00:02:57.040 David, Tamler, thanks for coming on the podcast.
00:03:00.080 Thanks for having us.
00:03:01.240 Thank you, Sam.
00:03:01.820 I will have introduced you, and people may have heard our previous interviews on your
00:03:07.240 show, but remind everyone where you are and what you guys tend to focus on when you're
00:03:13.240 not causing trouble on your podcast.
00:03:16.200 Well, I am a professor of philosophy at the University of Houston.
00:03:21.740 You are Tamler.
00:03:22.740 And I am Tamler Sommers, right.
00:03:24.340 And when I'm not podcasting on Very Bad Wizards with David, I am working on this book, which
00:03:33.200 I've been working on for quite a while, for the last few years, that's coming out in the
00:03:39.340 spring, in the early spring, called In Defense of Honor.
00:03:43.140 And it's about honor and morality.
00:03:47.940 Yeah, you like honor.
00:03:48.880 That's something we could talk about.
00:03:50.340 We can add that to the list of things.
00:03:51.980 Yeah.
00:03:52.500 Yeah.
00:03:52.720 I look forward to that.
00:03:54.340 And I'm David Pizarro from Cornell University.
00:03:57.440 When I'm not podcasting with Tamler, I'm losing my cool on occasion.
00:04:01.580 I do research on moral judgment, and especially on the effects of emotion on judgment.
00:04:09.340 So the emotion of disgust is something that maybe for the last 10 years I've been researching
00:04:14.520 and how that can influence judgment, political judgment, and moral and social judgment.
00:04:19.220 And then just trying to teach the young minds, trying to sucker them into getting PhDs.
00:04:26.040 Our listeners want us to talk about the moral panic on campuses as one of the items.
00:04:31.080 We went out on Twitter asking for topics.
00:04:33.860 And I know you guys disagree with some people who think that it's a huge problem.
00:04:39.280 And so I want to get into that because you guys are also on the front lines as professors.
00:04:43.680 But first, let's just start with your podcast.
00:04:46.340 Your podcast is fantastic.
00:04:48.240 I'm a huge fan, and I'm a fan, even though it seems every other time I tune in, you've
00:04:55.620 said something disparaging about me.
00:04:58.020 That's Tamler trolling you.
00:05:00.520 I wipe my hands clean of this one.
00:05:03.320 I think early on, I was disparaging of certain remarks from your book, The Moral Landscape
00:05:12.220 on moral relativism.
00:05:14.340 Since then, I think we've been very even-handed and balanced.
00:05:18.620 And we don't even say anything about it.
00:05:20.020 You would think that.
00:05:22.420 I believe Tamler's watching a different movie.
00:05:25.520 It's an emotional truth, what I just said.
00:05:28.280 It's not a fact-based truth, maybe.
00:05:30.940 Persuasive to somebody, nonetheless.
00:05:33.220 Your podcast is great, and people should check it out, and we will provide a link, or all
00:05:38.080 the relevant links on my blog.
00:05:40.200 But I'm just wondering, so your podcast, you're both professors full-time, and you have a fairly
00:05:47.020 edgy podcast.
00:05:48.520 I mean, you guys, you get into topics, and you express opinions that I would think could
00:05:56.360 conceivably get you in trouble.
00:05:58.760 And this does actually connect with this first topic that has been suggested to us, this
00:06:03.100 idea of a fundamental and spreading intolerance to free speech that's taking hold at the universities.
00:06:11.160 Do you guys ever worry about what you're doing on the podcast with respect to your jobs?
00:06:16.320 I mean, do you both have tenure?
00:06:17.540 How do you think about your life at this point?
00:06:19.760 Okay, well, I'll start by saying, I think that at first it was what some people refer
00:06:27.280 to, to use an analogy if I may, refer to as security through obscurity.
00:06:30.820 I was sort of convinced at first that nobody would be listening, and therefore it would
00:06:36.600 be perfectly okay.
00:06:38.380 But I've been actually quite surprised.
00:06:40.160 So as our listenership has grown, thanks to the many wonderful guests, including Sam,
00:06:45.760 and as our audience has grown, I do not think, and Tamler, you can correct me, I think one
00:06:53.920 of the things that is so nice about the long-form podcast sort of discussion format is that people
00:07:01.840 can hear, they get to know you in a way that the things that you say are in a context of
00:07:09.980 conversations.
00:07:10.900 And for lack of a better word, I think they get to know your character a little bit.
00:07:13.980 And some of the crazy things we say, people really are good at taking it in context.
00:07:18.300 And I don't, maybe one or two emails.
00:07:21.180 Specifically devoted to taking us out of context.
00:07:23.700 That's right.
00:07:24.540 One time I expressed the fear that we'd be taken out of context, and that Twitter account
00:07:28.980 started up.
00:07:30.740 And I don't know, you know, I think maybe one or two times we've had somebody email us
00:07:36.220 with maybe some anger about what we've said.
00:07:40.220 But you mean from your own institutions?
00:07:42.460 No.
00:07:42.760 No, no, no, no.
00:07:43.380 From our listenership.
00:07:44.520 From our own institutions, I genuinely think, I mean, part of it is I haven't, I haven't made
00:07:49.820 it sort of anything that I talk about too much in my own institution, in part because
00:07:55.920 of that worry.
00:07:57.240 Honestly, to connect it to the topic, this is one of my points of evidence when I say
00:08:04.480 that I think people exaggerate the degree to which there's a chilling effect or that
00:08:09.640 people can't express their views if they don't toe the line with, you know, the progressive
00:08:16.540 agenda or whatever.
00:08:18.360 It's, you know, I think neither of us do that.
00:08:21.560 I think, you know, maybe me even less than Dave.
00:08:27.360 And I haven't heard one single, not a single complaint from any colleague who listens to
00:08:34.600 it, from any person at my institution who listens to it.
00:08:37.300 And there are, there are a bunch.
00:08:39.580 Nobody has taken umbrage by a single thing that we've said.
00:08:45.680 And we've said some repugnant shit as, you know, that's part of our, that's part of our
00:08:50.900 trademark.
00:08:51.340 And I think it's for the reason that Dave says is, you know, people get to know us and
00:08:57.420 they know, I think that our hearts are in the right place.
00:09:01.440 And so as long as they know that they're going to allow you to be a little edgier or more
00:09:09.940 inappropriate and not try to shut you down.
00:09:12.820 And so this is one of the things that makes me think that these incidents are not as, it's
00:09:20.780 not as widespread a phenomenon as it's portrayed by some in the media.
00:09:27.120 But there's a relevant part there that we didn't answer, which is we both have tenure, but we,
00:09:30.680 we, I think we got them, we got tenure after maybe like, what were you, a year of doing
00:09:37.240 the podcast?
00:09:38.020 When we started, I don't think we had tenure, but we do have tenure, just to add that.
00:09:41.460 Right.
00:09:42.000 Okay.
00:09:42.820 Are you guys as irreverent or edgy in the classroom, or is there a very big difference
00:09:48.920 between your podcast persona and your, your professor hat?
00:09:54.400 I teach a course intro psychology, which is largely freshmen, uh, with about 800 students
00:10:01.000 enrolled. For many of them, it's their first experience in a lecture course in college.
00:10:04.960 And while I probably tone it down, um, I don't purposefully, I mean, part of it is you're
00:10:12.740 a person that kind of changes depending on the situation.
00:10:14.740 So we, we, we, it's more like we tone it.
00:10:16.920 We, we raise it up a notch on the podcast sometimes, but, but largely I say crazy things in my class
00:10:24.560 all the time.
00:10:25.260 Um, and I've had students who take delight in writing it down.
00:10:30.360 Um, the, there was once a, uh, somebody on Facebook who would, who would quote me, um,
00:10:36.340 extensively.
00:10:37.080 Well, I got a word document at the end of one semester from a student with a list of all
00:10:41.380 the crazy things I had said, but usually again, I think not on the first day, sort of you,
00:10:46.600 you build, you build yourself up and always, I think at least I try in the, in an attempt
00:10:53.320 to communicate something well.
00:10:55.360 So if I drop an F bomb, it's usually because I want somebody to remember something.
00:10:58.920 I'll give an example.
00:10:59.680 When I, when I talk about evolutionary psychology, for instance, um, I remind students that if
00:11:03.960 a claim is made that natural selection caused something, it has to be directly tied to the
00:11:09.980 mechanism of survival and reproduction, um, or else, or else it doesn't work through natural
00:11:16.520 selection.
00:11:17.160 So I just remind people, unless it leads to more fucking, um, it's the, it's not an evolutionary
00:11:22.420 argument, like adaptiveness, and I say that in an, in an attempted song.
00:11:28.380 Well, it's an attempt to, much to the chagrin of my mother, it's an attempt to solidify a
00:11:33.480 principle.
00:11:34.560 Maybe I'm just making you.
00:11:35.620 It sounds a little post hoc to me.
00:11:37.520 You just want to laugh.
00:11:38.600 You've got 800 18-year-olds in front of you.
00:11:41.040 It's your one moment of standup for the day.
00:11:43.720 And Tamler, are you, do you tone it down?
00:11:45.640 Because I'm not drunk usually when I teach.
00:11:49.120 So, so that's one difference, but every once in a while for the podcast, we, uh, we put
00:11:55.400 down a few, um, probably me again, a little more frequently.
00:11:59.140 I've done that once.
00:12:00.440 Plus some other things, which, but, um, but anyway, so, uh, I think it's exactly what Dave
00:12:08.280 said.
00:12:08.540 You build up a little trust over the course of the semester and they sort of get you and
00:12:14.460 you're, you know, like I, I'm somebody that likes to go up and approach the line.
00:12:20.000 I get bored when everybody is talking and it's a little too, everyone's being too polite
00:12:25.520 or dancing around certain topics.
00:12:27.800 And I think that students like that.
00:12:31.020 And especially now when I think a lot of these students, at least at my institution, which
00:12:35.940 is a public institution and they're, and they're working jobs and they're, uh, stressed out
00:12:42.560 taking five classes and a lot of them have family issues that they're dealing with and
00:12:47.980 anxiety issues that they're dealing with.
00:12:50.140 It is nice to just have a place where people can, you know, not watch what they say and not
00:12:57.680 feel like they have to walk on eggshells.
00:12:59.220 So that's at least the kind of environment that I try to build.
00:13:02.880 And again, in classes, I have yet to find that, uh, to be a problem, even remotely, like
00:13:11.100 not one single complaint, at least one that has reached me.
00:13:16.180 Now we have to reconcile our worldviews because, and you know many of these principal experts,
00:13:21.860 really.
00:13:22.080 How do I square what you guys have just said with what Jonathan Haidt is saying and, and
00:13:30.640 really canonizing in the, the heterodox academy, you know, worrying about this creeping moral
00:13:38.180 panic that is fundamentally antithetical to the, the core values of a university.
00:13:43.960 I'm sure David knows Jonathan, but perhaps you do too, Tamler.
00:13:47.820 You guys really should have him on your podcast to talk about these things because I'd like to
00:13:51.480 hear what he would say, but he's really worried about this.
00:13:54.880 And then you have the cases of like Nicholas Christakis, who I'm sure at least David knows,
00:14:01.340 at Yale.
00:14:02.560 You have Bret Weinstein, who's at Evergreen State College, which has gotten a lot of attention.
00:14:09.160 And that just went fully off the rails.
00:14:12.760 Uh, as far as I know, he's, he's, I'm not even sure he, his family is back in town yet
00:14:17.680 based on safety concerns.
00:14:20.080 And then you have the, the, the Rebecca Tuvel incident.
00:14:23.840 And I actually had lunch with her to talk about her experience not that long ago.
00:14:27.380 So it's totally possible that you guys are right.
00:14:31.000 And that these are, are individual cases that suggest very little about the rest of what's
00:14:37.120 going on on campuses.
00:14:38.760 But take the first part.
00:14:40.160 How do you think about how Haidt is describing this?
00:14:44.980 It's a tough question because I think this is one of those cases where two things can
00:14:49.600 be true.
00:14:50.640 And one other thing, Tamler, I should say that you see your, your step-mom is Christina
00:14:55.520 Hoff Sommers, who is just this, basically as far as I can tell, she has a cult following
00:15:01.220 on the right, you know, or center right for the way she's brought attention to, to this
00:15:07.480 sort of issue.
00:15:08.720 Yes.
00:15:09.560 Especially as it relates to gender and, uh, yeah.
00:15:13.720 And so, yes, this is a debate I have often and certainly every Thanksgiving, you know, I'm
00:15:20.880 pretty close to my stepmother.
00:15:22.240 So we go back and forth.
00:15:24.300 You know, it's funny, like if you listen to us talk about it, I think we can both concede
00:15:30.280 a little bit of, and this is how I feel about Haidt too.
00:15:33.000 You know, I thought The Coddling of the American Mind was, you know, one of those first sort
00:15:39.520 of overhyped pieces that captured the attention and the imagination of everybody.
00:15:45.800 And I think people aren't good at, at looking at a video like the Christakis video or the
00:15:51.680 Evergreen State video and, and, and, and they're bad cases.
00:15:57.120 They're really bad.
00:15:58.100 I mean, there's no denying it if that was going on in every, or the, or the Charles Murray
00:16:03.560 thing, right.
00:16:04.180 If that was going on in, in the universities, then people would be right that, that, to,
00:16:09.360 to panic about this.
00:16:10.760 But what's I think difficult for people to process is day in and day out, how many things
00:16:20.720 happen at the thousands and thousands of universities, uh, across the country where there's no stifling
00:16:29.480 of speech, there's no chilling, there's no, there's none of that.
00:16:32.680 You know, Charles Murray successfully gave that same talk at a hundred universities probably
00:16:40.360 before Middlebury.
00:16:42.120 And, you know, Evergreen State is a little bit of a whack job, uh, liberal arts college
00:16:49.820 to begin with, you know, and for a while, this isn't true anymore, but for a while, anytime
00:16:55.120 there was an article written about this, they, it was Oberlin, like something happened in Oberlin
00:17:00.640 because that's just what Oberlin is.
00:17:02.720 It's been like that for 50 years and it'll probably be like that for another 50 years.
00:17:08.580 So I think it's important to separate the.
00:17:12.120 What's wrong, what's legitimately wrong, uh, that's going on at, at, at these particular
00:17:18.280 institutions for what is going on in quote unquote, the American university.
00:17:24.720 Cause I think those two things are different, but, you know, I understand, like, Haidt would
00:17:29.360 kind of concede some of that and say it is at these more privileged private institutions
00:17:35.920 that this is occurring, but that's still a significant worry.
00:17:40.400 And, you know, I have some sympathy with that.
00:17:43.260 Yeah.
00:17:43.620 And, and just to, to make clear, I think that, that, um, there, Tamler and I disagree about
00:17:50.520 this often.
00:17:51.160 Um, although, although we share a lot of the sentiment, uh, you know, I think that it's
00:17:56.280 important to separate arguments about frequency with arguments about importance.
00:18:01.320 And, and I, I do think that there is a probably measurable chilling effect in that, um, that
00:18:11.040 some professors are less willing to say some of the things that they used to say, um, or
00:18:16.740 they think twice about it.
00:18:18.380 And I do think there's probably a measurable difference in the average undergrad, um, in
00:18:23.980 the way that they think about a lot of these things.
00:18:25.900 And then we can separate whether the reaction of panic, which I think with Tamler is, is
00:18:30.520 responding to is, is the right, the right sort of reaction to, to the problem as it currently
00:18:37.000 stands, which I, I agreed is, is probably not, it, it does get overblown and it captures
00:18:42.400 attention.
00:18:42.840 Um, but, but I nonetheless do worry about it.
00:18:47.220 And I do think that, um, that we are creating, um, an environment in which people pause before
00:18:53.680 they say some things.
00:18:54.660 But I, I always try to emphasize that there's, there's a way in which a lot of this is actually
00:19:00.060 progress.
00:19:00.900 I do want people to pause before they say some things.
00:19:03.340 And so if that's, what's called chilling, then, then good.
00:19:06.500 I, I, I think I mentioned this on one of our podcasts.
00:19:09.000 I don't know if it made the final edit, but, um, I did have a professor once tell me that
00:19:15.340 he, he really felt like he couldn't tell the same jokes that he used to.
00:19:18.380 And I said, like, what kind of jokes?
00:19:20.160 And then he, he gave me an example and it was a pretty racist joke.
00:19:23.560 And I was like, good.
00:19:26.720 In his defense, he wasn't from the U S and he didn't think it was a racist joke.
00:19:30.760 Right.
00:19:31.280 You know, it hasn't stopped Dave from his, you know, constant stream of antisemitism.
00:19:37.460 So, so, you know, it's, I feel like, I feel like that's the canary in the coal mine.
00:19:43.360 The minute, you know, that gets squashed, I will, I will announce to the world.
00:19:48.020 Right.
00:19:48.680 First they came for the antisemites.
00:19:50.420 That's right.
00:19:51.120 And I did nothing.
00:19:51.880 I just want to add that I think sometimes, like, I think Dave's right that sometimes professors
00:19:57.400 feel like they have to watch what they say, but sometimes that's their fault, not the environment's
00:20:03.560 fault.
00:20:03.820 Like they've been reading too much of the Atlantic and too much, you know, whatever the latest
00:20:11.500 column on the heterodox blog.
00:20:14.560 And now they've convinced themselves that they can't say anything that might border on
00:20:19.900 inappropriate.
00:20:20.920 Sometimes you just have to man up and just say the thing that you want to say.
00:20:26.060 And if there's any blowback from that, then you'll deal with it, you know, or, or woman
00:20:31.200 up or woman up.
00:20:32.240 Yes.
00:20:32.460 Sorry.
00:20:33.020 Oh, or woman.
00:20:33.980 Can you cut that?
00:20:34.920 I'm going to get a big go of that.
00:20:35.680 I can't believe you.
00:20:36.520 That's a keeper.
00:20:38.020 So, uh, no.
00:20:39.800 So, so I do think I was having this talk with a professor, um, at a conference and he was,
00:20:45.860 he said, you know, I was in this faculty meeting and then, you know, an hour later, this faculty
00:20:51.920 member tweeted out something.
00:20:54.160 She didn't use my name, but something that I had said in the faculty meeting.
00:20:58.480 And I was like, so who, so who cares?
00:21:00.340 So what?
00:21:01.160 So maybe she'll tweet out something that you said at a faculty meeting.
00:21:04.540 That doesn't mean you shouldn't say it.
00:21:06.740 That's just life.
00:21:08.160 It's life that when you say something, sometimes people will react in a certain way and you
00:21:13.360 deal with it then.
00:21:15.040 Yeah.
00:21:15.160 I mean, the problem is that there, we have these cases, which may certainly on your, your
00:21:19.580 account are outlier cases where this stuff just goes completely haywire and you have someone's
00:21:26.300 career destroyed, or there's at least a, just a massive public shaming experience that
00:21:32.680 follows in precisely that pattern.
00:21:35.180 If you have a tweet sent from an otherwise private meeting or what was that incident where
00:21:39.880 the guy, a guy wore a shirt to a conference and he was just vilified endlessly for the
00:21:45.360 insensitivity of his shirt.
00:21:47.320 Again, we have these cases that get media attention and at minimum advertise how haywire this can
00:21:57.120 go.
00:21:57.460 So it's easy to see how this would propagate back and cause everyone to choose their words
00:22:02.740 more carefully.
00:22:03.800 I guess it's partly, it's easy, but it's not, it's not an excuse.
00:22:08.340 It's not a full excuse.
00:22:09.760 You know, professors generally are smart enough to understand the difference between a widespread
00:22:16.680 phenomenon and some cases that still, I think, can reasonably be called isolated.
00:22:23.200 And, you know, like anything, like a terrorist attack, you don't want to overreact to it.
00:22:28.300 You don't want to completely take away everybody's freedoms just because there was this one terrorist
00:22:34.720 attack in Orlando.
00:22:36.720 So, you know, that's...
00:22:38.940 I will say that I think it's important to say that in, in the, many of the incidents that
00:22:42.720 we've described, these people were treated horribly and unfairly and, and there's no lack
00:22:46.860 of assholes who, who are, who are causing people grief.
00:22:52.940 But I, I always think that this is, the response to me is more important than, than the, whatever
00:23:02.580 growing number of, of undergraduates who are easily offended.
00:23:06.640 I think that this is actually, what, what do we make of this?
00:23:09.800 What do we do with this?
00:23:10.800 And if it is anything like a trend, if it's not isolated incidents and it is the beginnings
00:23:15.360 of a, you know, some zeitgeist changing, um, more so than ever, I think that the, the role
00:23:21.620 of the professor is, I think we've failed our students.
00:23:25.980 If, if by the end of our classes, for instance, um, they, they still, uh, don't, I think part
00:23:35.220 of the training of, say, a seminar, in my mind, is for students to come out of there comfortable
00:23:39.100 with expressing opinions and not vilifying others who they disagree with.
00:23:42.740 And I think that the response to any claims of alarm and, and, and these trends or whatever
00:23:49.940 being dangerous ought to be met with open and clear conversation with our students and not
00:23:56.980 with a response that it's just these, these students who are like completely progressive
00:24:01.320 liberals on the left who are ruining things because of post-modernism.
00:24:04.500 You know, I would want to talk to that student to, you know, bring them in, let them teach
00:24:10.520 by example, what it means to have a respectful disagreement.
00:24:14.720 The issue with post-modernism connects us to another item that many have suggested we talk
00:24:19.640 about.
00:24:19.920 And I think this is something that you slammed me for on one of your podcasts, the conceptual
00:24:25.560 penis hoax.
00:24:26.560 Is there a mess we need to clean up there?
00:24:28.860 I don't think we slammed you on the podcast about that.
00:24:32.100 Well, what happened is I was among the people who forwarded this hoax.
00:24:35.960 I think I read a piece of their paper on my podcast and then retweeted it.
00:24:42.760 And then many people have now judged it to have been a false hoax or at least a misfired
00:24:48.880 hoax.
00:24:49.620 I mean, we don't have to spend a lot of time on it, but I think you guys saw it as an example
00:24:53.240 of skeptics not being nearly skeptical enough because they just practice their own version
00:24:59.580 of confirmation bias by spreading this thing, which in the end wasn't what it seemed to
00:25:04.880 be.
00:25:05.100 Is that still how you think about it?
00:25:06.200 Because I think the authors both defended themselves, right?
00:25:09.540 And I think even Alan Sokol wrote a fairly appreciative piece about it, or at least partially
00:25:16.820 appreciative piece about it.
00:25:18.500 I think what was like, and we had James Lindsay on our podcast and we talked at length about
00:25:24.580 it and I think that, not that I'm encouraging you to listen to it, but at the end of that,
00:25:29.140 I was more disappointed with his response than ever.
00:25:34.400 And I think it is a case where, yeah, we were taking to task many in the, you know, whatever
00:25:41.600 skeptic community, if you want to call it that.
00:25:43.660 I don't know how you feel about the label, for falling prey to confirmation bias.
00:25:48.000 And I think our point was just generally that this was, you know, published in a really
00:25:52.320 low tier journal after being rejected from a mid tier journal.
00:25:55.360 And I thought, well, what would be evidence of a good scholarship if not being rejected
00:26:00.240 from journals?
00:26:02.280 From an unranked journal.
00:26:03.820 They were rejected from an unranked gender studies journal and got it published in a pay-to-publish,
00:26:09.100 non-gender-studies journal.
00:26:11.980 It requires no defense of gender studies.
00:26:14.540 I mean, I think we're all on record as saying this is like spectacular bullshit coming out
00:26:19.600 of some of these fields, but there's something about the arrogance and the quickness of mockery.
00:26:26.040 And this is something I want to talk to you.
00:26:27.520 This is your podcast, so you can direct us.
00:26:29.380 But I did want to talk to you about the, in this broader context of moral persuasion about
00:26:33.920 the role of this mockery and, and I don't think I've been struck maybe, especially in
00:26:43.100 the last few, few weeks or few months as, as our audience has grown and we get more and
00:26:51.080 more people interacting with us on Twitter.
00:26:54.700 I don't know if it's just some belief that this is an effective way of convincing others
00:27:00.120 of the truth.
00:27:01.260 But I've, I found the authors or at least the one author we talked to of the hoax to
00:27:04.740 be very dismissive and, and, and quite, quite arrogant about the way that he presented his,
00:27:10.780 his case in a way that Sokol himself was not right.
00:27:14.460 And I find, for instance, you to be very reasonable when you talk, but you have a wide
00:27:21.420 army of people who aren't that way.
00:27:23.800 And so I don't know how you feel about when you see, you probably get so many tweets that
00:27:28.100 it's hard to keep up, but, but when you see people who sort of on your behalf are acting
00:27:33.320 in ways that I don't think that you would ever act.
00:27:35.860 There are really two topics here.
00:27:37.100 One, one is whether mockery is ever useful and, and, and persuasive to the people you're
00:27:43.120 mocking or whether, I think, I think you guys have even more global doubts about whether
00:27:49.160 just hard criticism is ever persuasive to the people you're criticizing.
00:27:53.200 Whether a frontal assault atheist style on religious faith ever wins hearts and minds.
00:28:00.680 I think that's something that at least Tamler has doubted in the past.
00:28:03.320 Well, I mean, it depends what you mean by frontal assault, but.
00:28:06.140 Then there's the issue of, of how one's fans or, or listeners or readers in, in my case,
00:28:13.920 represent me in how they respond to, to people who criticize me or, or, or my podcast guests.
00:28:20.080 And on that second point, I, for me, it's very clear.
00:28:24.040 And I, I've, with some frequency, I mean, I can't keep doing this, but with some frequency,
00:28:30.000 I admonish my listeners not to be jerks.
00:28:34.080 And I've said on a few podcasts, listen, you, you're doing me no favors, no matter how much
00:28:40.540 you hate what someone said on my podcast, no matter how wrong you think they are, you're
00:28:45.960 not doing me any favors if you now just flame them on social media.
00:28:51.440 I don't want a person's experience coming on the podcast to be that that was the worst
00:28:56.200 thing they ever did in their lives because of how they were treated by, by a fairly large
00:29:01.300 audience.
00:29:02.140 In fact, I want it to be the opposite.
00:29:03.800 I want everything that comes their way to be really smart and civil, no matter how hard
00:29:10.000 hitting it actually is, or no matter how critical it is of their position, it has to be civil
00:29:15.700 and relevant.
00:29:17.200 And so, yeah, I'm fairly clear about how I wish people would represent my audience.
00:29:24.680 But I, you know, I have very little control over what people actually do apart from saying
00:29:28.440 things like that periodically.
00:29:29.560 I guess the, so, I mean, and there's, right, you don't have control over what the people
00:29:37.420 who are fans of yours do, and all you can do is model good behavior, you know, which I think
00:29:44.400 you did.
00:29:44.780 I mean, you did with Scott Adams, you know, almost to the point where it was heroic, the
00:29:52.820 degree to which.
00:29:54.320 We'll see if I can still model it, and now that we talk about it.
00:29:56.860 Yeah, but, but so.
00:29:59.880 There's some Christakis-level patience.
00:30:01.500 But the question that Dave alluded to before about whether mockery is an effective tactic
00:30:09.840 to change people's minds, I think is a, you know, it's something that I think skeptics and
00:30:19.080 sometimes atheists, I guess maybe I just disagree with them because I don't have any great
00:30:26.480 evidence on whether mockery changes minds or not.
00:30:30.000 Certainly in my experience, mocking somebody, calling them stupid, calling them, you know,
00:30:36.740 obviously irrational or whatever is not a, it just makes people more defensive.
00:30:42.240 It makes people dig their heels in more, and the, the way I think to, to change minds is
00:30:51.540 to be respectful of their opinion and to really try to, you know, see the best side of it
00:30:59.220 as, and, and, and to engage with it, even if you find it indefensible on, on some level,
00:31:08.440 just as a purely practical, instrumental goal of changing somebody's mind, you know, in my
00:31:17.460 experience as someone who's no stranger to mockery, that's not what I want to trot it
00:31:22.260 out for.
00:31:23.320 Mockery is fun, can be funny.
00:31:25.660 It can get the people who already agree with you to agree with you more and to be more proud
00:31:34.060 of themselves for being on the right side of the view, but it doesn't change the minds
00:31:39.200 of the people that you're mocking.
00:31:41.780 I would just say that that assumption is pretty readily disconfirmable.
00:31:48.400 I mean, it doesn't change some people's minds, I'll grant you that.
00:31:51.720 It might not, it might not even change most minds and, and most minds, depending on what
00:31:56.760 the belief system is, might just not be available for change, right?
00:32:01.420 So there's nothing you're going to say on a podcast or in a book, however well-tempered,
00:32:08.280 that's going to change the mind of a, you know, a real jihadist or get him to question
00:32:13.140 his, his faith.
00:32:14.360 But, you know, I, I've been amazed to learn that some of the most hard-hitting stuff I've
00:32:19.580 put out there, you know, the stuff I've said about Islam and the end of faith or in various
00:32:24.080 YouTube videos has actually penetrated and reached even totally devout conservative people
00:32:31.540 in communities in Pakistan, right?
00:32:33.600 Where the people are, you know, are now closet atheists, right?
00:32:37.020 Based on what I or Richard Dawkins or Christopher Hitchens have said about their religion.
00:32:42.180 And obviously that's not, those people themselves must be outliers.
00:32:46.360 But you have to picture people at every point on the spectrum of credulity with respect to
00:32:53.240 any ideology.
00:32:54.340 And so there's, there are the people who are, you know, fundamentalists and have never questioned
00:32:58.260 the faith.
00:32:58.840 And there are people who are halfway between that and being, you know, fairly just nominal
00:33:05.000 adherents of the faith.
00:33:06.580 And they can be tipped in either direction.
00:33:09.120 And if they see something very hard-hitting, but also obviously well thought out, directed
00:33:16.880 at this thing that they have been told is so important and so beyond doubting, you don't
00:33:23.500 know how many of those people you capture.
00:33:24.920 And I can just say that, you know, having done this for more than a decade, there's personally
00:33:30.060 a kind of an endless stream of confirmation that minds get changed through confrontation with
00:33:38.220 evidence and argument, however, actually disrespectful and hard-hitting.
00:33:43.640 And I, I, maybe some, there were some distinctions that came to mind as, as we continue to talk
00:33:49.520 about this.
00:33:49.940 And, and one is that, that, that I don't, at least what I know of the discussions that
00:33:55.960 you've had have struck me as mockery.
00:33:57.540 Um, and I find even, even in instances of strong disagreement, I don't think that you are
00:34:06.180 disrespectful, but, but I think that the question of, of whether mockery is effective may be
00:34:11.280 just the wrong way for me to think about it, because it may very well be that you change
00:34:16.700 some minds through mockery, but that, that isn't the way that I want to, to do it.
00:34:21.320 And maybe there are some tactics that just are so, I mean, there are some issues that
00:34:24.620 are so important that you, you might adopt a by any means necessary approach, but I, I
00:34:28.700 find it distasteful and disrespectful.
00:34:31.940 Yeah.
00:34:32.940 Yeah.
00:34:33.940 I don't know how we define mockery, but so for instance, the way I speak about Trump,
00:34:37.900 right?
00:34:38.400 Well, this is not everyone's cup of tea.
00:34:41.440 I mean, obviously Trump supporters who are totally incorrigible hate what I say about Trump
00:34:48.060 and they, they must be unreachable, but I think, I gotta think even there it reaches somebody.
00:34:56.280 And, and on certain points, there is just no other way to say it.
00:34:59.680 I mean, to fail to convey the feeling of moral opprobrium that, that seems to me just central
00:35:08.140 to the response I'm having to Trump.
00:35:10.520 Yeah.
00:35:10.900 I mean, to delete, to leave that off the table is to actually not communicate what I think
00:35:15.240 about Trump and what I, what I feel everyone has good reason to believe about him.
00:35:19.840 And so I guess the, the, the respect side comes in where I can give a sympathetic construal
00:35:26.840 of why someone didn't see it that way at first, or maybe even doesn't see it that way now.
00:35:33.680 And I can certainly sympathize with someone who hated Clinton and felt for their own reasons
00:35:39.640 that Trump was probably a better choice.
00:35:41.460 There's, there's definitely a, a, a discussion to be had that can dignify the other side.
00:35:46.140 And I, you know, I spent a whole podcast running down Clinton with, with Andrew Sullivan.
00:35:50.020 So I'm, I'm sympathetic with the other side, but to actually just focus on a specific example,
00:35:56.380 like Trump and Trump university, as I did with Scott Adams and to not express just how despicable
00:36:04.980 that was and how despicable it is not to find it despicable.
00:36:10.380 Now, I was somewhat hamstrung in my conversation with Scott because I have to play host and debate
00:36:16.100 partner, but kind of the host has to win.
00:36:18.720 At least I'm using it as a heuristic now that the host has to win in those moments and,
00:36:23.900 and keep it, keep it civil at, at, at all costs.
00:36:26.800 But to give him a pass on that, I feel is a moral failing in itself and an intellectual
00:36:33.860 one. So, and to not communicate that is, is dishonest.
00:36:38.120 I guess, um, what you did with Scott Adams is, as I see it different, you weren't mocking him.
00:36:46.020 You weren't, I, I'm not saying you shouldn't express your feelings or you should sugarcoat
00:36:51.820 how you feel and what you believe about Donald Trump.
00:36:55.820 But when you look at what you did with Scott Adams, you were very deliberately trying to see
00:37:01.680 his perspective, trying to understand why he was defending the positions that he was defending.
00:37:08.880 And, um, I don't know, like, I see that more as an example, even though he wasn't going to be
00:37:14.960 persuaded either way. I see that as an example of more what I'm talking about than what you're
00:37:23.200 talking about. And I think this is what doesn't happen with Trump, with liberals and Trump voters
00:37:28.680 is they are dismissed in like the basket of deplorables. They're just dismissed as this
00:37:34.920 monolithic group of racist idiots who vote against their own interests constantly. And
00:37:41.560 just to be clear, I'm highlighting not what I said to Scott or about Scott, but what I say about Trump.
00:37:49.460 Trump, there's no way to sugarcoat it. I am being as disrespectful as you can possibly be
00:37:55.420 about Trump. So imagine what I would have to say to Trump to his face if I ever met him to square
00:38:01.780 with what I've said.
00:38:02.920 I'm talking about a Trump voter and trying to convince a Trump voter to change their mind. Say,
00:38:09.240 we get to the next election time and you're canvassing with the Trump voter,
00:38:14.280 the way to change their mind both as a party and as an individual person isn't going to be,
00:38:22.340 I don't think, to make fun of them because that's what was tried. And that's what seemed like almost
00:38:31.820 a galvanizing, it had a kind of a galvanizing effect to the voters.
00:38:37.840 But what do you think of something like the SNL sketches against Trump and Sean Spicer?
00:38:42.700 Yeah. So I was going to get to another distinction about humor because there's not a clear line.
00:38:54.020 And all I can do, I think, is point to the sort of attitude that somebody holds toward another human
00:39:03.720 being where humor is actually a great way to satirize and to condemn. And by the way,
00:39:10.700 I also agree with you that what I'm not saying is that there aren't cases of just sheer moral
00:39:16.240 condemnation that we shouldn't pull our punches. We should be very, very comfortable to say,
00:39:21.480 I agree with you. I think Trump is somebody who I wouldn't have anything good to say about him.
00:39:27.040 And I think so much of what he's doing is wrong and setting the wrong example. And with humor,
00:39:32.220 I think humor, there is often a line there. And I find that I can distinguish the kind of humor that I
00:39:41.800 think is good satire for me in my reaction from stuff that just gets nasty in some way in the tone
00:39:49.800 with which it's being done. And I think the power of humor is that it tells a truth in a way that
00:40:00.440 disarms people. It doesn't bring their walls up. Not always, but it has the power to do that.
00:40:07.540 I think I've gotten so much more insight from people like Dave Chappelle and Louis C.K.
00:40:12.120 because they tell some pretty difficult truths in a funny way. I think, though, that it can get to a
00:40:21.880 mean spirit. And then I just don't like it as much. I don't like that feeling that somebody's
00:40:28.420 disrespecting. And I think when I said mockery, for instance, what I meant was somebody who is
00:40:32.500 unwilling to engage. And I found, I think, in our James Lindsay interview about the hoax,
00:40:40.040 I found an unwillingness to engage or just a stopping point at their willingness to talk about
00:40:48.140 opposing views that is what distressed me or what bothered me, I guess.
00:40:53.020 I haven't listened to that, so I'll have to do that. So let's open it up to this larger
00:40:58.140 issue of moral persuasion. And this follows rather directly from what Scott Adams was claiming on my
00:41:06.400 podcast, that Trump is this brilliant persuader and that persuasion is really not about facts and
00:41:16.040 needn't be about facts. I mean, it's not a bad thing that it's not about facts. This is one thing that
00:41:23.020 again, in my role as host, I couldn't fully communicate how reprehensible I feel this position
00:41:31.420 is. And I'm not saying anything about Scott that I wouldn't say to him. It's just hard to kind of
00:41:37.980 split the baby in real time when you're on your own show. And I say this now fully aware that it will
00:41:43.500 get back to Scott. But I just feel like this, he seemed totally comfortable. In fact, he seemed
00:41:49.560 fairly jubilant about caring not about what is true, but about what people can be led to believe.
00:41:59.240 It just matters what people can be led to believe. Don't you understand, Sam? That's the game we're
00:42:04.140 all playing. That's what this life is about. It's about persuading people to get what you want out of
00:42:09.540 life. And Trump is great at that. And that, as a kind of the linchpin of an ethical worldview,
00:42:16.300 there's so much, where do I start? Everything is wrong with that. As a scientist, as a philosopher,
00:42:22.920 as a journalist, as a compassionate person who just wants to have his or her beliefs track
00:42:28.520 reality. I mean, whoever you are attempting to build a better society, I don't see how you can
00:42:35.880 be comfortable with that as your starting point. And yet, he does have a point. I mean, the fact that,
00:42:42.660 one thing that was astonishing after our podcast was to see how differently our two respective
00:42:49.020 audiences perceived it. I mean, my audience vilified him and his audience vilified me. And I mean, it was
00:42:56.480 clear that they thought he had destroyed me. What an embarrassment. You know, it was like career
00:43:02.760 suicide for me to have someone as brilliant and as persuasive as Scott on my podcast to just,
00:43:08.900 you know, do the Jedi mind trick on me. By the way, we've had some of your followers listen to our
00:43:14.440 long podcast on free will and say, Sam destroyed you guys. And I always sort of laugh because I'm
00:43:19.680 like, you know, I don't think that the destruction of it. I did destroy you guys. I was like, you know,
00:43:26.580 I don't, I think that they, that that was, that was me. I have another account.
00:43:29.800 You have like an account with six followers. The Scott Adams interview, it's a, it's a, it's,
00:43:37.920 it's a funny thing to listen to. You get kind of disoriented and, and, and there was a kind of
00:43:43.220 postmodern feel to it. There was a kind of postmodern critical theory kind of perspective that he seemed
00:43:52.260 to be inhabiting with facts and, and, and reason-based arguments or at least sort of, you know,
00:44:00.600 objective reason-based arguments that could be independently evaluated just didn't play the role
00:44:07.500 for him that it played, that it plays for you and that it's, you know, mostly we think plays for,
00:44:14.600 for all of us. And there was a meta level as trying to, when, you know, when you two would
00:44:22.140 debate, say the Russia investigation or climate change, and he would say, well, you know, the
00:44:28.600 Paris deal was a hoax, and you went, but Trump said climate science was a hoax. And, you know,
00:44:33.580 all of a sudden you were on shifting terrain.
00:44:40.600 this very debate as something to be like a vehicle for persuasion? Not of you. He probably knew that
00:44:51.220 you weren't going to be persuaded, but, so he's not trying to win the argument in the, or the debate
00:44:57.620 in the sense that we understand that. He's trying to do what he says Trump is a master at doing,
00:45:03.280 which is persuade people to appreciate Trump or to find something in him that they haven't found
00:45:13.140 before. And then it was like, now I don't, it's like, how do you assess this, uh, this argument
00:45:20.140 at all if he's not even trying to win the argument as I understand winning arguments, you know?
00:45:25.300 No, I think that's true. I think he's very sincere about his insincerity. I think he's got,
00:45:30.440 he's got this bad faith structure to his game and he's fine with that. And I feel that there is an
00:45:38.940 immense number of, of intellectual and ethical problems that follow from that. And, and we
00:45:45.140 couldn't fully get into it, but it's a, I do find it very frustrating, but in his defense,
00:45:51.080 the aftermath and just everything we see around us proves at least one part of his thesis,
00:45:58.400 the two movies analogy, our audiences, my audience and Adam's audience are, were clearly watching
00:46:05.780 different movies of that podcast and perceived it totally differently. And the question of,
00:46:13.120 of moral persuasion, how do you bridge that gulf? Honestly, I'm at a loss when you can't get
00:46:20.920 facts that would be morally salient in another context to matter to someone for the purpose of
00:46:31.340 a political discussion. I mean, like when, when I, when one point I made with him, which, to which he
00:46:35.460 didn't have a rebuttal. I mean, I think he basically agreed with me. You know, I said, listen, if I did
00:46:40.720 any one of these things that I just named that you're not disputing Trump has done, if I did any
00:46:46.940 one of these things, it would be the end of me. And for good reason, I mean, you would not come
00:46:51.820 on this podcast if you had heard that I have a, had a Trump university in my backstory, or if I had
00:46:57.220 been, you know, barging into the dressing rooms of, of the beauty pageant contestants under my sway,
00:47:02.740 or I mean, any, any of these things. And, you know, you would rightly recognize that I'm a schmuck
00:47:08.080 who shouldn't be taken seriously. He does sort of split the difference here. And in other moments,
00:47:13.200 he says, well, who am I to judge any of that? And I'm not the Pope. And I, you know, when he's
00:47:16.800 talking about Trump, he's, or he says, oh, he's lived more publicly than you sort of implying
00:47:21.800 that who knows. But, and I do wonder about someone who feels that he is in no position to judge
00:47:30.700 the litany of abuses to morality and reason we see just pouring out of Trump's life.
00:47:38.200 I think his better argument was that you shouldn't, like, we're not hiring him to model,
00:47:46.620 to be a model citizen, good behavior, where it's like, you want that dirty lawyer, or as
00:47:52.980 Dave would say, the Jew lawyer to win your case for you.
00:47:57.380 God, it's character assassination.
00:47:59.260 You don't want the lawyer that's the most upstanding citizen when you're in a battle,
00:48:06.100 you know, for your, you know, whether you're going to go to prison or not, or for a lot of
00:48:11.320 money.
00:48:12.280 There's so much to disagree with him about. And, but I'll tell you what I found the most
00:48:17.880 distressing. And, and, and again, I actually found him to be like an interesting, respectful
00:48:26.100 dude when he was discussing. So this isn't this, but, but I, but I get, I reserve the right,
00:48:31.320 as Sam, you were saying before, to just fundamentally disagree with him. And what I found the most
00:48:34.840 distressing in his whole, in the whole interview was, as you point out, the amorality of his,
00:48:43.100 of his arguments, but another one, just the insistence on praising Trump for his persuasive
00:48:48.320 powers and unwillingness to talk about what he was persuading people about, that he was avoiding
00:48:54.300 any discussion of content. So, so it's fine if you want to get in what he wants. And that's
00:49:00.860 intrinsic good. Intrinsic good. And it made me think, you know, for some people, this is an insult.
00:49:06.240 Some people, it might be a compliment, but, but I, it was very Ayn Rand-ish. And I was, I, I was struck
00:49:13.140 by that being a good in and of itself that, that sort of, you know, we've reached 33rd level persuasive
00:49:20.460 powers. And so you got to admire the guy, but if your persuasive powers are being used to not care
00:49:27.540 about the, the, the future of the environment, um, or, or to, to discriminate against people
00:49:34.400 or whatever, um, how, how is that a good, I mean, but you couldn't get him to discuss that.
00:49:41.580 And it was always bringing it back to, well, this is just part of his masterful game.
00:49:46.920 Um, which is like, great. You might be a really, really great marksman, but if you're shooting
00:49:51.820 people, I don't like you. And, and I, at this point he would tell me, well, the, I've failed
00:49:57.380 because my use of analogy, um, but, but, but I think I, I, I found it when it's all said
00:50:07.340 and done, I found it almost monstrous to, to think of a president and endorsing him for,
00:50:13.380 for doing that, for, for being good at that.
00:50:16.180 Yeah. Well, yeah. Also not to see the cost or forget about what he's persuading people
00:50:20.240 toward the fact of just having this style of communication that is so, so dishonest that
00:50:28.760 more or less there's just every assumption now is that there's something false in what he said.
00:50:36.180 Even if you're his fan, you have to bracket everything he says with this basic uncertainty
00:50:41.800 about whether he means it and the cost of, of, of that to our society and to our politics.
00:50:46.900 The downside of that is, is so obvious, but, you know, he clearly doesn't care about it.
00:50:53.980 Your question about, you know, there are these two movies and the movies seem to be operating
00:51:01.540 according to different principles too, just in terms of what counts, you know, if the whole,
00:51:09.100 the media takes Trump literally, but not seriously, people take Trump seriously, not literally.
00:51:19.380 And it's like, and this, and I guess that serious part on the Trump voters is that idea of kind of
00:51:24.680 emotional trust or the, you know, they, they trust him emotionally. And so when they, when he goes off
00:51:33.260 on some bullshit tweet storm, they know it's bullshit. They know he's lying, but he has their
00:51:39.840 emotional trust. I mean, I think that there is something right about that, at least as a
00:51:46.360 descriptive explanation for what's going on. And.
00:51:51.020 I actually think that's mostly untrue. I mean, I want to call bullshit on that claim too.
00:51:57.860 For instance, when Trump gets up there and says, you know, my inauguration crowd was bigger
00:52:02.300 than any that had ever been seen, I think most of his fans think that's true when he says it.
00:52:10.160 And they think it's the fake news media, out to get him, that is disputing it. And if they ever come
00:52:16.040 around to being convinced by the photos, which, you know, half of them probably think are doctored,
00:52:22.300 they think, well, who gives a shit? He's great anyway. And so it's like...
00:52:26.800 But why do they say he's great anyway? Because they trust him. They trust him. He's a fighter.
00:52:31.520 He's a businessman. He's going to fight for their...
00:52:33.460 The way Scott views him is a very unusual way of viewing him. I think people,
00:52:39.300 they think everyone's out to get him, so that most of the criticism of him and most of the fact-
00:52:45.020 checking has to be purely malicious, and most of it is just a tissue of lies and conspiracy theories,
00:52:53.000 and there's probably nothing untoward happening with Russia, and, you know, he's almost
00:52:58.960 certainly this really good guy who's just getting hammered by the left-wing elite. But then when
00:53:05.760 any one piece of this shifts into the certainty column, where, okay, no, Trump clearly was lying there,
00:53:14.380 then they have a piece of the Scott Adams view, which is, well, who cares? You know,
00:53:20.180 that's just for effect, or he did it because it works. Get used to it.
00:53:24.400 But for the most part, I don't think they're there. That's not their first perception. The first
00:53:28.820 perception is that he's just under attack, there's a siege, and it's driven not by how far from normal
00:53:37.440 and ethical and professional and competent he is; it's driven by pure partisan
00:53:44.120 rancor. I mean, people like me are just unhappy to have lost an election.
00:53:48.300 Yeah, no, I think you're right about that. I guess I didn't want to build too much on
00:53:54.680 the psychology of the Trump voter so much as on getting people in that movie to
00:54:04.800 be able to talk and debate. There is something in this idea of building emotional trust. And one of
00:54:12.900 the reasons why the charges of fake news and a liberal, skewed, biased media, all of those
00:54:23.540 charges, seem so effective, why they're so effective at convincing Trump voters that he's being treated
00:54:30.820 unfairly, as he loves to say, is because there is no trust right now for those kinds of institutions:
00:54:39.120 the establishment Republicans, the establishment Democrats, and the news media
00:54:44.420 in general. And so, you know, I think the work that has to be done is building some of that
00:54:53.660 trust back, because without that there's no terrain on which to persuade people to revise their opinion of a man
00:55:02.040 they've put a lot of stake in. A lot of these voters are really motivated to
00:55:09.640 not look like they got played for a sucker, to not look like they've been conned. And so only somebody
00:55:16.880 whom they have a tremendous amount of trust in, and also, I think, some degree of respect for,
00:55:25.720 is going to be able to make progress in changing their minds about that, because there are a lot of
00:55:31.680 biases. I don't think that the liberal media
00:55:39.160 has eroded trust and that this is why people went for Trump. I think it's a much simpler story,
00:55:45.040 which is that he was saying shit a lot of people wanted to hear. They were voting in their self-
00:55:49.580 interest for Trump because they really believed it. And one way to take Scott Adams's view (and I
00:55:55.000 agree with both of you, I don't think Scott Adams represents in any way the average Trump
00:56:00.740 supporter), one way in which I think he's right, is that Trump has persuaded a substantial portion of
00:56:08.480 people that he is to be trusted. And that is despite all of the evidence that he is
00:56:14.680 not to be trusted. And so you say to yourself, well, how can people trust him despite all of
00:56:19.460 this evidence that he's a liar, that he makes decisions based on self-interest, not even on
00:56:24.580 principle? And I think it's because he has said a few things that people really, really wanted to
00:56:29.740 hear. I don't think it's that the liberal media has eroded trust and needs to build it back up.
00:56:35.320 I think it's just totally directional bias.
00:56:38.360 Well, the thing is, though, it has. I mean, I can attest to the failings of the liberal media, or
00:56:44.480 the mainstream media, on certain topics that are so reliable that I do have a window into how a right-
00:56:53.920 wing Fox and Breitbart fan could view the editorial page of the New York Times, or even just the news
00:57:03.020 pages, because I've seen them commit errors of fact, or shade their discussion of facts, so
00:57:11.040 reliably on certain topics. I mean, the topic of the link between Islam and terrorism
00:57:16.520 is one where I can just guarantee you I will find, in an article, some way in which political correctness
00:57:23.280 is distorting the presentation of stark facts. There are whole articles in places like the New York
00:57:29.980 Times talking about terrorist suicide bombings as though the motive were a mystery that is bound
00:57:38.760 to remain impenetrable until the end of time. There's no mention of Islam. There's no mention
00:57:44.160 of religion. There are just generic words like "extremism," and all of this, to someone
00:57:49.520 who's been paying attention to this problem and is worried about the spread of specific ideas
00:57:55.240 related to jihadism, is a very fishy way to describe what's going on. And so it is with something like gun
00:58:02.140 control and gun safety. There'll be a shooting at a school, and you'll have the response in the New York
00:58:09.220 Times, and you'll see positions being articulated by people who know nothing about guns,
00:58:17.020 who have never shot a gun, who get everything wrong. I mean, the names are wrong.
00:58:21.660 We hear them on CNN talking about guns, and they pronounce the names of gun manufacturers wrong.
00:58:26.980 The level of cluelessness is so obvious. And so I can see that it's possible that
00:58:34.200 even in the valid reaction to Trump, there's something demeaning about having to respond, or
00:58:43.080 feeling that you have to respond, again and again to Trump's dishonesty and indiscretions,
00:58:48.860 because every time you do it, you're running the risk of making an error yourself, however small,
00:58:58.980 which seems to put you on all fours with Breitbart or with Trump himself. Or it's just that there's
00:59:07.280 something that erodes your credibility in taking the time to endlessly criticize
00:59:13.400 someone like this on the same points. And so when you look at the New York Times now,
00:59:19.200 there are days when the whole paper looks like the opinion page, because they have to take a
00:59:23.100 position against this guy.
00:59:24.200 If you'd like to continue listening to this conversation, you'll need to subscribe at
00:59:34.760 samharris.org. Once you do, you'll get access to all full-length episodes of the Making Sense
00:59:39.440 podcast, along with other subscriber-only content, including bonus episodes and AMAs and the conversations
00:59:46.180 I've been having on the Waking Up app. The Making Sense podcast is ad-free and relies entirely on listener
00:59:51.960 support. And you can subscribe now at samharris.org.