Making Sense - Sam Harris - December 14, 2017


#108 — Defending the Experts


Episode Stats

Length: 43 minutes
Words per Minute: 174.04
Word Count: 7,489
Sentence Count: 385
Misogynist Sentences: 3
Hate Speech Sentences: 4


Summary

Tom Nichols is a professor of National Security Affairs at the U.S. Naval War College and an adjunct professor at the Harvard Extension School. He's also a five-time undefeated Jeopardy! champion, and as one of the all-time top players in the game, he was invited to the Ultimate Tournament of Champions in 2005. He is the author of several works on foreign policy and international security, including The Sacred Cause; No Use: Nuclear Weapons and U.S. National Security; and Eve of Destruction: The Coming Age of Preventive War. His most recent book, and the focus of this conversation, is The Death of Expertise: The Campaign Against Established Knowledge and Why It Matters. In this episode, Sam Harris and Nichols discuss the death of expertise, the Dunning-Kruger effect, the growth of knowledge and our reliance on authority, how to think about the failure of expertise in areas such as medicine and foreign policy, conspiracy theories, North Korea, Trump, and related matters.


Transcript

00:00:00.000 Welcome to the Making Sense Podcast.
00:00:08.820 This is Sam Harris.
00:00:10.880 Just a note to say that if you're hearing this, you are not currently on our subscriber
00:00:14.680 feed and will only be hearing the first part of this conversation.
00:00:18.420 In order to access full episodes of the Making Sense Podcast, you'll need to subscribe at
00:00:22.720 samharris.org.
00:00:24.060 There you'll find our private RSS feed to add to your favorite podcatcher, along with
00:00:28.360 other subscriber-only content.
00:00:30.520 We don't run ads on the podcast, and therefore it's made possible entirely through the support
00:00:34.640 of our subscribers.
00:00:35.880 So if you enjoy what we're doing here, please consider becoming one.
00:00:46.620 So today I'm speaking with Tom Nichols.
00:00:49.560 Tom is a professor of national security affairs at the U.S. Naval War College and an adjunct
00:00:56.020 professor at the Harvard Extension School.
00:00:58.780 He's a former aide in the U.S. Senate.
00:01:02.600 He's also a five-time undefeated Jeopardy!
00:01:05.200 champion.
00:01:06.520 And as one of the all-time top players in the game, he was invited to the Ultimate Tournament
00:01:11.640 of Champions in 2005.
00:01:13.640 He's the author of several works on foreign policy and international security, including
00:01:20.100 The Sacred Cause; No Use: Nuclear Weapons and U.S. National Security; and Eve of Destruction:
00:01:27.580 The Coming Age of Preventive War.
00:01:29.260 And his most recent book, which is the focus of our conversation, is The Death of Expertise,
00:01:36.040 The Campaign Against Established Knowledge and Why It Matters.
00:01:38.960 And we talk about the death of expertise, talk about the Dunning-Kruger effect, which many
00:01:46.240 of you have probably heard about, talk about the growth of knowledge and our inevitable
00:01:51.120 reliance on authority, all the while superseding it, and we talk about what to do when experts
00:01:58.200 fail or how to think about the failure of expertise in various areas, medicine in particular.
00:02:04.000 Now, we talk about the repudiation of expertise that we now see all around us in politics.
00:02:11.420 We get into conspiracy thinking a little bit.
00:02:14.800 Then we hit topics that are very much in Tom's area of expertise, North Korea, politics, Trump,
00:02:21.840 and related matters.
00:02:23.680 Tom is a lifelong Republican, but you will find that he is also among the never-Trumpers and
00:02:31.720 has a few things to say on that topic.
00:02:34.400 So, without further delay, I bring you Tom Nichols.
00:02:43.420 Hi, I'm here with Tom Nichols.
00:02:45.100 Tom, thanks for coming on the podcast.
00:02:46.920 Thanks for having me, Sam.
00:02:47.720 I appreciate it.
00:02:48.840 So, I've had several guests on the show who have written books without any awareness of
00:02:54.940 what was coming, and what was coming was the Trump presidency, but their books have been
00:03:00.260 almost perfectly timed for the moment.
00:03:02.900 The book we're going to discuss here is your book, The Death of Expertise.
00:03:07.660 When did you write the book?
00:03:09.000 When were you working on it?
00:03:10.500 Actually, I started writing the book about three years ago.
00:03:14.880 It was originally a kind of a blog rant that then got picked up as an article that ran in
00:03:19.960 late 2013, early 2014.
00:03:22.060 So, I ran it well in advance of the election.
00:03:26.640 Actually, I had no idea that Trump was going to run.
00:03:31.520 And to include some stuff about the election, I actually had to pull the galleys at the last
00:03:35.480 minute and include some discussion of that and Brexit.
00:03:38.440 Yeah, well, so we're not going to focus on Trump per se.
00:03:42.820 I mean, we'll talk about him, but he really is the walking distillation of much of what you
00:03:47.920 write in the book, and it could not come at a better time.
00:03:50.740 So, before we get into your argument and the issues you discuss, just tell us for a couple
00:03:57.040 of minutes about your background as an academic, a person who has served in government in the
00:04:03.520 Navy, what are the kinds of problems you have focused on up until now?
00:04:07.080 Sure.
00:04:07.400 Well, I actually began my life, I'm going to date myself here by admitting this, but I
00:04:12.260 was actually a Soviet specialist back in the day.
00:04:15.900 That could be pretty relevant now as well.
00:04:18.360 Yeah, unfortunately, it's a skill that's coming back into vogue.
00:04:23.240 So, I began my academic career as a Russian-speaking Kremlinologist type.
00:04:30.560 I worked on Soviet foreign and defense policy.
00:04:34.360 I kind of went through a standard policy and academic track.
00:04:39.100 I taught at Dartmouth for a lot of years.
00:04:40.840 I taught at Georgetown.
00:04:42.640 I worked in the United States Senate for the late Senator John Heinz of Pennsylvania.
00:04:47.300 I did a lot of consulting in Washington, which, you know, back during the Cold War, if you
00:04:51.240 could speak Russian, there was a lot to do there.
00:04:54.200 And then I, over time, I kind of moved on to broader international security stuff.
00:04:58.460 And I ended up at the Naval War College, which I should add, I don't represent the government
00:05:03.300 or the Navy or anybody in this discussion, where I teach military officers during the
00:05:08.260 day.
00:05:08.540 And I go up to the Harvard Extension School at night, where I teach national security affairs
00:05:13.480 at both places, international relations, nuclear weapons, international humanitarian stuff.
00:05:20.260 So, I kind of moved away from the Russia thing.
00:05:23.900 And, you know, ironically enough, like everybody else, I sort of thought the Russia thing was
00:05:27.700 not going to be a, you know, kind of a lasting skill set.
00:05:31.740 But yet, here we are.
00:05:33.620 Yeah, really.
00:05:34.560 It really seemed completely gone.
00:05:37.040 And all of a sudden, we're back in something like the Cold War.
00:05:41.260 All right.
00:05:41.780 So, there's so much I want to talk to you about here.
00:05:43.400 But let's focus on the book for the moment.
00:05:46.840 There's one topic you raise in the book, which many people will have heard of, and it's the
00:05:52.060 Dunning-Kruger effect.
00:05:53.900 Describe that effect.
00:05:55.500 Well, the Dunning-Kruger effect, as I always like to tell people, it's a frustrating thing
00:06:00.180 that you've experienced at, you know, Thanksgiving dinner that finally has a scientific name,
00:06:04.820 which is that the less competent you are at something, that the dumber you are, the less
00:06:10.860 likely you are to realize that you're dumb, which is why kind of the least informed person
00:06:16.740 at dinner sort of spools off the longest.
00:06:19.640 Or the other analogy I always use is like the guy who goes up and butchers a song during
00:06:27.080 karaoke night, steps off the stage and says, nailed it, because he just doesn't get it.
00:06:33.020 He can't hear it.
00:06:33.760 And so the Dunning-Kruger, these two social psychologists, Dunning and Kruger, did a series
00:06:38.820 of tests where they figured out that the people who are least competent at something tend to
00:06:43.720 be the most likely to overestimate their competence at whatever they're doing.
00:06:47.980 So, you know, people that are bad writers think that they're terrific writers, and that's
00:06:51.800 why they're bad writers, because they can't recognize it.
00:06:54.280 They can't mobilize this skill called metacognition, which is the ability to step back from what you're
00:06:59.780 doing and evaluate it kind of outside of yourself a bit.
00:07:03.760 Yes.
00:07:05.100 Unfortunately, the Dunning-Kruger effect as a meme has spread so widely online that now
00:07:11.300 I've begun to notice that mentioning the Dunning-Kruger effect is often a symptom that one is suffering
00:07:17.000 from it.
00:07:17.820 I don't know if you've noticed this, that people are throwing this around and they do
00:07:21.720 it more or less in the direction of any ideas they don't like.
00:07:25.960 Right.
00:07:26.580 Well, and it's become a synonym for stupid, which it isn't.
00:07:30.240 Right.
00:07:30.440 The Dunning-Kruger effect is a very specific thing of thinking you're good at something
00:07:34.760 when you're not good at something, and that the worse you are at it, the less likely you
00:07:40.540 are to be able to recognize it.
00:07:41.980 Yeah.
00:07:42.200 It should be obvious why that would be the case, at least in one respect, because it's
00:07:48.220 not until you really know a lot about a discipline that you come to recognize how much more there
00:07:56.480 is to know the gradations of expertise.
00:07:58.800 It takes a mathematician of some level to appreciate the most brilliant products of mathematics,
00:08:08.080 and therefore the feats of mathematicians better than himself or herself.
00:08:13.500 If you don't have all the tools necessary to have the conversation, you can't even appreciate
00:08:19.880 the high-wire act that's going on over your head.
00:08:22.200 Um, I think too, you know, that, um, the other word I use a lot in the book that I
00:08:27.420 think creates a synergy with the Dunning-Kruger effect is narcissism.
00:08:31.660 Um, because people, as you say, you know, when you become an expert at something, uh, and
00:08:39.040 I, and this is, this is kind of ironic because of course I don't exactly have a reputation for
00:08:42.700 being a self-effacing, humble guy, but it's a very humbling thing to become an expert because
00:08:47.100 you start to realize that what you thought might be interesting and relatively, you know,
00:08:52.420 something you could get your arms around turns out to be immensely complex.
00:08:56.020 Um, it's sort of like, uh, I think C.S. Lewis has a great metaphor
00:09:02.480 for it: you know, you love the stories of, uh, Homer as a boy, and then you
00:09:07.520 start studying ancient Greek, uh, and say, wow, this is really difficult.
00:09:11.420 There are a few paradoxes here, however. There's really this paradox of knowledge acquisition
00:09:18.900 that cuts against this thesis of honoring expertise because the advancement of our knowledge really
00:09:27.120 is the result of distrusting and defying received opinions.
00:09:31.400 You have scientists who find that there's something wrong with the consensus on any given
00:09:38.280 topic and they begin to defy it. And you need to have the tools of your discipline in order to do
00:09:44.560 that. But it is just a fact that the growth of knowledge is a process where experts are continually
00:09:53.140 unhorsed by a new generation of experts.
00:09:57.560 And that's key. That's a key thing that I think lay people don't understand. They say, well, you know,
00:10:03.340 experts have to be challenged all the time because they get things wrong. Yes, they do have to be
00:10:07.820 challenged, but by other experts who understand that field and who understand the rules of evidence
00:10:14.340 in that field and who understand what's already been accomplished in that field. Um, you know,
00:10:19.620 an example that people often, um, bring up when I talk about this, they say, well, you know,
00:10:24.000 doctors, what do they know? They got it wrong about eggs. And I talk about this in the book
00:10:28.380 because I happen to love eggs. Uh, but you know, who figured out that eggs aren't so bad for you?
00:10:33.540 Well, other doctors did by peer reviewing and testing the assertions of an earlier generation
00:10:40.000 of medical specialists. It wasn't, uh, you know, the, the guy next to you in the diner who says,
00:10:45.680 you know, I, I ate eggs all my life and I feel great. And I think that's where people make that
00:10:49.740 mistake. Another variable here is that there's the problem of specialization. There's just too much
00:10:54.640 to know. It's just impossible to know everything about everything or really even anything about
00:10:59.460 everything. And so we all, no matter how well-educated we become, we all rely on authority
00:11:06.380 in general because there's just not enough time to gather all the tools you would need to verify
00:11:12.540 every claim to propositional knowledge that you want to make. And so you have even the most
00:11:18.140 accomplished scientists, say, to speak of one area, who can't help but rely on the authority of their
00:11:24.200 peers in areas where they're not competent to investigate. And yet the algorithm of knowledge
00:11:32.700 acquisition is to, when the time is right, or when given sufficient reason, distrust authority
00:11:40.100 and move the boundary of our knowledge slightly further in one direction. And there's also this
00:11:46.400 issue with respect to authority where you can't argue on the basis of your authority. You can't cite
00:11:52.180 your credentials as a reason that you should be taken seriously. I mean, either your argument
00:11:56.620 and your data survive scrutiny or they don't. This reliance on authority is a little fishy. Once you
00:12:04.200 shine the light on it, it seems to disappear. But then when you're not looking at it, it's there and
00:12:09.000 it's actually constraining, and rightfully so, it's constraining how the conversation should run and who
00:12:14.280 should be listened to. Well, let me give you an example, because I think it depends on who's doing
00:12:20.020 the challenging. One of the worst stories I ever heard from my own field in the study of politics,
00:12:27.400 I don't even want to say political science, the study of government. Years ago, a colleague of mine
00:12:32.040 wrote a piece where he thinks he found a kind of mistake or a misinterpretation in a body of work
00:12:39.020 done by a very famous scholar. And the journal sent the piece back to him saying, look, that scholar
00:12:47.100 doesn't make mistakes like this. Now, that is exactly the kind of fishy appeal to authority
00:12:54.340 that you're talking about. I mean, here was a young man. He's a professor. He had the credentials to
00:12:59.860 enter the debate. He'd put the work in. He'd written up his findings. And the answer was,
00:13:06.540 this person is a giant of our field. It is a priori impossible that he could have made that kind
00:13:13.100 of mistake. And I think that's where peer review fails. I think, though, the notion of being skeptical
00:13:20.740 of authority, and I say this as someone trained in science myself, I actually began in the natural
00:13:27.280 sciences and moved on to the social sciences, is really important to the furtherance of
00:13:32.040 knowledge. But I don't believe in skepticism for its own sake. One of my
00:13:37.600 friends, I wish I could claim this quote, but he came up with a great quote. He said,
00:13:41.400 the answer to an appeal to authority is not an appeal to ignorance. And when people say, well,
00:13:47.160 I distrust eggheads merely by the fact that they are eggheads, that solves nothing. Take the kind
00:13:54.360 of research, you know, that I was talking about with this other article, where somebody said, huh.
00:13:58.020 Isaac Asimov always said the greatest discoveries in science are not attended by words like Eureka.
00:14:03.660 They're attended by words like, gee, that's funny. One of my colleagues looked at this piece and said,
00:14:08.540 gee, that's funny. I don't think that's right. And he brought all the skills and tools to bear.
00:14:13.100 Now, as it turns out, over time, his argument has in fact won the day. But 25 years ago, while this
00:14:19.520 major scholar was still alive, yeah, there was a closing of ranks, a circling of the wagons.
00:14:25.320 And that can happen. And science and knowledge fail when that happens. But I would argue that the
00:14:29.740 daily successes of scholarly interaction, expert, you know, cross-checking, peer review, that those
00:14:37.940 successes are far more numerous than the failures. And I think people concentrate on the failures
00:14:42.080 in the same way that they concentrate on spectacular plane crashes, that they think that these
00:14:48.680 magnificent expert failures on occasion kind of negate, it's just like people being afraid of a plane
00:14:54.320 crash, thinking that it negates the safety of air travel. I think people don't realize, and you
00:14:58.660 pointed this out when you talk about the division of labor, I think people don't realize how much
00:15:04.300 around them goes right every single day because of expert knowledge.
00:15:09.820 Right. Well, let's talk a little bit about when experts fail and how to think about that.
00:15:14.620 As your friend suggested, the answer to bad science and failed science or even scientific fraud
00:15:21.600 is just more science and better science. It's never the promotion of ignorance or
00:15:28.560 superstition or conspiracy theory. Burn down the library.
00:15:33.080 Yeah. I mean, or, you know, witch doctors. I mean, it's like the movements away
00:15:39.180 from scientific orthodoxy are almost never improvements. You know, take the realm of health. There is
00:15:47.720 just this galling fact of medicine that there are differences of opinion about what
00:15:54.280 is healthy to eat or what treatments are appropriate for various conditions.
00:15:58.920 You can get doctors that disagree. You can get failed, you know, protocols that frustrate everyone.
00:16:05.080 All of this is a domain where we are groping in the dark for the facts, you know, to keep death away
00:16:11.560 in this case. But the appropriate response to that uncertainty is not to just, you know, start giving
00:16:18.560 your kids unpasteurized milk because your chiropractor told you to do it. Not every departure from received
00:16:25.520 opinion is getting you closer to the goal. But so how should we think about some of these glaring
00:16:32.780 failures? What would you have people be running in the background on their hard drive to kind of help
00:16:40.180 them emotionally respond when there is this sort of, you know, plane crash of knowledge that happens
00:16:46.780 on a fairly regular basis? I think that's a great question. And, you know, the first thing I'll say
00:16:51.500 is that, look, I share that same distrust. I mean, look, I go to a doctor, I take things.
00:16:56.700 I still remember when I was a younger guy and I was prescribed something, and I made the
00:17:00.860 drastic mistake of reading, you know, that thing that comes inside the
00:17:05.060 box and opens up into, you know, a big 16-page thing. And I started reading.
00:17:10.360 The notice from hell.
00:17:11.180 Right. You know, that this, you know, this has been known to turn people into wolverines. And
00:17:16.200 so I started reading it and I still remember the phrase that stuck out. The action of this drug on
00:17:22.140 this issue is not well understood. Right. You know, and they just said it point blank. They said, look,
00:17:26.940 this drug, we think it works. We're not quite sure why it does. And I found it both alarming and
00:17:33.680 refreshing at the same time to say the action of this drug is not well understood, but we've done
00:17:37.880 enough clinical tests that it seems to solve the problem and it doesn't cause any other problems.
00:17:42.740 And I think that image that you're talking about of what should be running on the hard disk in the
00:17:47.380 background is a basic level of trust that you would extend to most other people. I mean,
00:17:53.120 you don't get on a bus and breathalyze the driver. You know,
00:17:59.340 you don't assume that your letter carrier is stealing your packages. You assume
00:18:06.780 that he or she is a professional who has been delivering packages for a long time, knows how
00:18:10.360 to do it. You don't, you know, walk into your children's school assuming
00:18:16.440 that everybody, you know, faked their teaching credentials. And I think what's really struck me
00:18:22.040 about these attacks on expertise is both how, again, I'm going to use that word again,
00:18:27.060 narcissistic and cynical they are that, um, that has really led people to they're going in position
00:18:35.180 with certain classes of experts is I know you're lying and I know you're incompetent. So let me
00:18:40.480 just take charge of this right now. A big constituency for the book, although again, I, I write a lot about
00:18:46.300 foreign policy and we've had some major failures in foreign policy because of expertise, but a big
00:18:52.740 constituency for the book were, was medical doctors who kept reaching out to me while I was writing it.
00:18:57.060 And telling me stories of people literally walking in and saying, look, I don't want to hear your
00:19:01.300 mumbo jumbo. Here's what I have. And here's what you're going to do. Um, which is really, you know,
00:19:07.220 not, uh, I, I consider myself a very lucky man. I have a great relationship with a doctor who takes
00:19:12.400 good care of me and answers all my questions, but I also make sure to show him that I trust him and
00:19:16.400 that I ask him those questions and that I'll listen when he talks to me. Um, I think with the
00:19:22.260 larger issue of policy failure, there's a somewhat different thing that I think people should
00:19:29.000 bear in mind, which is if, if your immediate reaction is that a policy is going wrong, whether
00:19:35.060 it's the war in Iraq or an economic downturn or whatever it is, I always turn this question back
00:19:41.420 to people to say, how much of what you're objecting to is something you wanted? Because experts don't
00:19:50.340 dispose; experts propose. I mean, I was, um, an advisor both to the Defense
00:19:57.480 Department, um, to the CIA, and, um, I did some work, uh, talking with people at State. Uh, I've,
00:20:04.760 you know, I've done a lot in the executive branch, and I advised, um, a state representative.
00:20:10.380 I worked, uh, in state politics for two years and at the federal level in the Senate for a year.
00:20:15.520 And you'd be surprised at how much of the policy outputs that experts work on are on problems that
00:20:23.020 the, the people, the voters want done. Um, and I, while I will certainly grant, you know, that
00:20:30.720 George Tenet walking out there and saying, hey, WMDs in Iraq, slam dunk. You know, he should have
00:20:37.060 been held accountable for that. That was just, that was a lousy call, um, by, you know, by the
00:20:42.900 politicization of expert opinion. Uh, on the other hand, you know, people always talk about things like
00:20:51.280 Vietnam or the Iraq war or the housing crisis. And I always point out, you know, these were all things
00:20:58.040 that were popular with the public, that experts were told to go fix. And, you know,
00:21:04.280 some of the less expert opinions on Vietnam I talk about briefly in the book, but you know,
00:21:10.700 it's important to remember the popular answer to Vietnam in 1964, when Barry Goldwater was running,
00:21:15.740 was: use nuclear weapons. You know, one of the reviews of my book said, what have you experts
00:21:20.440 done for us in the last 50 years? And my answer immediately was, well, you're not rooting around
00:21:25.100 in radioactive ashes looking for canned goods. So I think we'll take that one as a win.
00:21:30.140 Yeah. Well, I mean, everything that didn't go wrong was also secured by some form of expertise,
00:21:35.620 right? So every plane that didn't crash is a triumph of engineering.
00:21:39.760 If I could just add with every plane that doesn't crash, it's not just a triumph of engineering. It's a
00:21:43.300 triumph of diplomacy. It's a triumph of public policy about, you know, managing the
00:21:48.680 airways, making sure, you know, flights are de-conflicted. I mean, there's a million things that go
00:21:53.480 right every time you take a successful airplane flight. It's not just the pilot being skillful.
00:21:59.540 And I think people just don't think about that. No one, unless you've really trained in this
00:22:04.620 area, has great intuitions for probability and risk. And let me just take the election of Donald Trump
00:22:13.680 as an example. You know, the polls, you know, on the eve of the election, I think he had a 28% chance
00:22:22.320 of winning. And many people assumed that he was more or less guaranteed not to win with a 28% chance.
00:22:32.180 I was somewhat guilty of that, in that I really just could not imagine him winning and was
00:22:37.260 certainly going to relish the moment when he didn't. But looking at those polls, I was always
00:22:43.660 worried given, you know, what I understand about probability. I understand how often a 28% chance
00:22:49.060 comes up in one's life. You know, it's a very high probability of something happening that you
00:22:54.360 think could be a kind of civilizational catastrophe. Well, I think it was Nate Silver who said something
00:22:59.260 when people were jumping all over the pollsters. He said, look, I said that, you know, Hillary had a
00:23:05.160 two in three chance of winning. He said, people remember, that means every third time you run the
00:23:09.300 election, Donald Trump wins. And I likened it to weather forecasters. You know, when a weather
00:23:16.040 forecaster says there's a 25% chance of rain and then people don't bring an umbrella and it rains
00:23:21.840 on them, they say stupid forecasters, they don't know anything, which is poor understanding of
00:23:27.840 probability, as you point out.
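
To make the forecasters' point concrete, here is a minimal simulation sketch (an editorial illustration, not from the episode; the 0.28 value is the approximate pre-election probability quoted above, and the variable names are illustrative):

```python
import random

# Simulate many "elections," each of which the underdog wins with
# probability 0.28, roughly the pre-election chance quoted above.
random.seed(0)           # make the sketch reproducible
p_underdog = 0.28        # illustrative figure from the conversation
trials = 100_000

wins = sum(random.random() < p_underdog for _ in range(trials))
print(f"Underdog wins in {wins / trials:.1%} of simulated elections")
# Prints roughly 28%, a bit worse than one in three, in line with
# Nate Silver's "every third time you run the election" framing.
```

A 28% event is about as likely as rolling a one or a two on a single die, which is the sense in which such a forecast was never a prediction that Trump would lose.

Well, so before we dive into politics and war and foreign policy and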
00:23:33.960 all of these other issues where you are an expert, I guess there's just a couple of other points about
00:23:39.680 medicine because this obviously affects people's lives continuously. I've begun to feel that this
00:23:48.460 is one of these areas where having more information is very often a bad thing. And it can be a bad thing
00:23:57.240 even for someone who is fairly well-educated in the area. I mean, so I'm not a doctor, but
00:24:04.560 I have a PhD in neuroscience. I understand a lot of the relevant biology. I can work my way through
00:24:11.820 more or less any medical document. But I find that when I get sick or one of my kids gets sick and
00:24:19.760 there's something on the menu that seems potentially terrible, the answer to that problem for me is
00:24:29.060 less and less my getting onto Google or into scientific journals and doing more research on
00:24:36.060 my own. And if this is true for me, it has to be doubly true for someone who does
00:24:41.460 not have a scientific background. I mean, now, you know, when something goes wrong, I want to know that I
00:24:47.240 have a good doctor. I want to know that I have another good doctor for a second opinion. But at the end of
00:24:54.480 the day, I have to find somebody who I can ask the question, what would you do if you were me? And
00:25:02.000 trust that behind that answer is much more expertise in this area than I have or than I'm going to get
00:25:09.980 by an endless number of Google searches. I think this issue of Googling symptoms is really creating a
00:25:17.360 kind of global wave of hypochondria. And I've often said to people, look, because they always come
00:25:24.600 back to me as, well, this is about the democratization of knowledge. Look at all these
00:25:28.580 medical journals. I can go to JSTOR. I can go to, you know, Medscape or whatever it is. And I say, yes,
00:25:33.660 but you can't understand them. And people get very offended by this. I said, look, these journal articles
00:25:39.600 in medicine, they're not written for you. They're written for people who already have a deep knowledge
00:25:45.960 of the foundational issues, who understand what it means to say, you know, this is the,
00:25:51.940 you know, N equals this, and therefore the lethality is that. You're not going to understand
00:25:57.180 that. And it's probably going to do more harm than good. And again, I sympathize, I empathize with
00:26:03.120 people about this. I had to have an emergency appendectomy. And, you know, after a night of tests
00:26:08.540 and pain and all that stuff, about five in the morning, a surgeon comes to me and she says,
00:26:12.520 we're, we really have to get, you could die. Um, we need to do this. And I said, well,
00:26:16.900 well, let me get my smartphone. Yeah. Well, this was, this was before,
00:26:20.880 now that this was before smartphones, but I was a young guy with a PhD. And I said, well,
00:26:25.640 uh, is there stuff I need to know? What are the risks here? And she kind of sighed and said, well,
00:26:29.360 here's all the things that could happen. And I started to literally feel panic. And I said,
00:26:33.560 is this really, I literally said, is this something we need to do? And my wife just kind of looked at me
00:26:38.260 and the doctor kind of looked at me. And of course, by then I'm just not making a rational
00:26:42.240 decision. And, you know, um, I went through this with my father
00:26:49.280 as well. He was a gambler and he needed heart surgery. And, um, I put
00:26:56.880 it in gambler's terms, you know, because they said, well, here's what happens if you don't have the
00:27:00.240 surgery. And instead of bombarding him with all this information, I said, dad, if you're holding this
00:27:04.920 kind of a hand, you know, and you've got these kinds of odds, what would you do? And he kind of
00:27:08.620 nodded and he got it. And I think it was a, a great case of taking a very intelligent, but older
00:27:13.560 man and just explaining it as a matter of probability. If you don't do this, here's your
00:27:17.860 chance of dying. If you do do this, here's your chance of not dying. And, uh, I think people need
00:27:23.640 to do that more often and say, look, I don't need that level of detail because it takes a certain
00:27:28.080 humility to say to yourself, because if you give it to me, I won't understand it.
00:27:32.200 But that can be a fairly subtle problem, because you can understand it in the context
00:27:38.980 of reading it online. I mean, so for instance, if I read a paper in a medical journal that
00:27:44.280 details, you know, all of the recent research on a condition and, you know, the probability
00:27:48.540 that it's X, Y, or Z in severity and all the rest, I will understand all of it. But given
00:27:55.860 that I have no clinical experience, I still am not receiving that information the way
00:28:01.900 someone who's been treating patients with this range of conditions for decades would. And
00:28:07.560 there's just so much more information available to that person than I have. And again, it's
00:28:13.800 not to say that your doctor can never be wrong, hence the reliance on further opinions, but
00:28:18.900 it's just, you don't...
00:28:20.680 She's more likely to be right than you are.
00:28:22.580 Yes, exactly.
00:28:23.580 That's the problem.
00:28:24.340 That's crucial, yeah.
00:28:25.240 The doctor is going to be more likely to be right. You know, and I think people phrase
00:28:29.740 this as a binary and foolish choice. Well, either I'm right or the doctor's right. Well,
00:28:34.560 the doctor could be wrong, but the doctor is just going to be more likely to be right than
00:28:40.880 you are. And again, I think partly people have gotten spoiled by living in a world where they
00:28:45.020 can get a lot of definite answers very quickly. And I think they comfort themselves. One of
00:28:50.220 the things I think you're getting at with, you know, being able to read something as opposed
00:28:54.320 to be able to intuitively understand it is the kind of magic dust of experience where, you
00:29:01.100 know, people really believe that there are shortcuts to knowledge that make them equal to experienced
00:29:08.860 practitioners of various things. That if they just... You know, Scott Adams, the guy who...
00:29:15.960 The Dilbert guy.
00:29:16.800 I know him well.
00:29:17.980 Well, you know, Adams said...
00:29:19.380 He's been on the podcast.
00:29:20.700 Tell me any problem I can't understand, you know, in an hour of discussion with an expert,
00:29:25.240 as though it's just a matter of... The way I put it in the book, it's as if it's just a matter of
00:29:29.200 copying from one hard disk to another and transferring the data. And expertise doesn't work that way.
00:29:35.780 It's like, you know, it's almost like exercise. I mean, you can't go on a crash diet and develop
00:29:42.260 a six-pack overnight by talking to, you know, a personal trainer.
00:29:46.840 Yeah. I'm glad you brought up Scott Adams. We won't talk about him further, but, you know,
00:29:52.120 he was on this podcast for two hours defending Trump in some form. And you'll understand how
00:29:59.380 frustrating I found that conversation. Politics does offer its kind of unique case where
00:30:05.220 people have been led to believe that they actually don't want experts of any kind. It's like if
00:30:14.520 you're talking about medicine, say, very few people will tell you that they don't want their doctor
00:30:21.460 to be extraordinarily well-trained or, you know, the best doctor in the hospital or, you know,
00:30:26.620 if they have to have brain surgery, they don't want their uncle just kind of winging it with them.
00:30:32.560 They want somebody who is fully qualified to get into their head. And in politics, this breaks down
00:30:40.700 just spectacularly. And the first moment where I realized this, this has probably been obvious for
00:30:48.240 much longer than this, but it wasn't until the sudden appearance of Sarah Palin on the scene
00:30:54.800 and her appearance at the Republican National Convention as McCain's running mate, where I
00:31:02.360 just for the first moment realized how horribly backwards all of this was in politics. I wrote an
00:31:11.060 article titled In Defense of Elitism, which then got retitled, I think, by Jon Meacham when he was at
00:31:17.900 Newsweek as When Atheists Attack. So my point was completely lost and buried. But I made a point
00:31:24.640 there that, you know, many people have made elsewhere, which is, again, to the most basic
00:31:31.200 case of, you know, the pilot flying the plane, no one's first criterion is whether they would want
00:31:37.260 to have a beer with that person or whether the person is just like you in being completely
00:31:42.240 uncontaminated by any kind of experience with that skill set. You want someone who's truly qualified.
00:31:49.680 How has this broken down in politics where there's a kind of credibility that comes from having no
00:31:57.020 credibility? Well, I think there's several sources of this. One is we have come, and I usually trace
00:32:04.820 this back to the 1992 election with the stunning triumph of Bill Clinton, who, however you may feel about
00:32:11.600 him, is clearly one of the more gifted natural politicians of the age. And what happened in the
00:32:19.880 early 90s once the Cold War receded and, you know, again, we were the sole superpower. We were living
00:32:25.700 very affluent lives. Authenticity became the end all and be all of American politics that, you know,
00:32:33.820 it didn't really matter if the guy was any good. Do you like him? As you said, do you want to have a
00:32:38.180 beer with him? You look back, nobody wanted to have a beer with Richard Nixon, you know, or LBJ for that
00:32:46.260 matter. And I suppose you could trace this even earlier to Reagan, where, you know, Reagan kind
00:32:50.660 of just emanates this charisma and people just kind of love him. But I think this notion of being
00:32:57.540 empathetic, because Reagan was a lot of things, but he didn't demonstrate a lot of empathy.
00:33:02.580 He was kind of bigger than life. But Clinton really cornered the market on this notion that in order to
00:33:09.040 govern you effectively, I have to be just like you. I have to feel just like you. I used to use,
00:33:15.820 when I would give talks in the 90s, I would always seize on Clinton's statement that I want a cabinet
00:33:21.580 that looks just like America. And I would always push back on this and say, no, America watches,
00:33:26.580 you know, talk shows. I want a cabinet much smarter than America, much better than America. And that,
00:33:33.760 you know, like you, I've been accused of being a defender of elitism. Well, so be it. I don't want
00:33:39.780 the cabinet to be people just like me. I want the cabinet to be people much more competent, smarter,
00:33:45.600 and, you know, of higher character and steady-mindedness than most of the rest of us.
00:33:50.520 And I think, you know, when we get into the early 21st century and talking about people like Sarah
00:33:56.260 Palin, one of the things that I think has become really pernicious and cynical has been the flogging
00:34:04.520 of ignorant populism by people who are smart enough to know better. It's one thing to have Sarah Palin
00:34:11.420 up there blathering, because Sarah Palin is just dumb, and that's just the way it is. And, you know,
00:34:17.940 that was a disastrous, I mean, John McCain's public record of wonderful public service will
00:34:24.340 always be marred by choosing Sarah Palin. But surrounding her and going into the kind of Tea
00:34:31.440 Party period and the early 21st century, and even around Obama as well, there were people pushing
00:34:37.920 simplistic populist slogans who knew better. You could always argue that people like Sarah Palin don't
00:34:45.520 know any better. Now we are, we're dealing, I mean, if you look at the current administration,
00:34:49.780 you have a bunch of people that are, you know, the elite of the elite. I mean, this is Hollywood
00:34:54.260 and Wall Street, you know, pretty much running the government saying, we're here to do the bidding of
00:34:59.920 the people of, you know, rural Louisiana. Well, that's a lie. That's nonsense. And I think that has
00:35:06.480 really become part of the attack on expertise, is that it's being led by people who actually have
00:35:11.980 quite a lot of knowledge and education themselves, and are just cynically mobilizing this for political
00:35:17.280 purposes. And I think that is, that's, that's new. That isn't even like Huey Long, or, you know,
00:35:22.760 the populists of the 30s, responding to the depression. This is just a cynical attempt to
00:35:28.460 basically tell people that the world is a simple place, nothing is your fault, bad people hurt you,
00:35:34.960 all problems can be solved with hats and banners. And yet, deep down, when they close
00:35:42.760 the door, I'm sure they shrug and say, well, you know, that went over well, knowing that, that what
00:35:48.600 they're putting forward really is nonsense. And that, that scares me more than anything.
00:35:53.680 Yeah. Yeah. And with Trump, you have a kind of lack of kind of moral core that seems to be new to,
00:36:01.420 at least, is new to me here, which is, with someone like Palin, you know, I don't think anyone could
00:36:06.920 pretend to believe that she was a genius or incredibly well-informed on the issues. But I don't
00:36:14.700 think anyone was celebrating her rise while also thinking that she's the most dishonest person anyone
00:36:22.740 had ever seen, or actually just not a good person, right? And reveling in that lack of any kind of
00:36:30.120 commitment to ethics. And so with Trump, I mean, what we have here, which is, I think, genuinely new,
00:36:36.000 is that we have this kind of monster of incompetence and self-regard, which has been made possible
00:36:43.240 by the fact that tens of millions of people in this country seem to revel in his incompetence and
00:36:49.720 self-regard. It's not a bug, it's a feature for them. And then when he, you know, winds people like me
00:36:55.580 up, when I complain or members of the press complain about just how uncanny all this is,
00:37:02.520 all of these departures from normalcy are, that is just to the delight of all of these people who
00:37:08.420 love Trump. It's kind of like, it's the character of the internet troll, really, that has become
00:37:14.260 ascendant here. We have a troll as president, and that has enabled what is essentially a
00:37:19.800 Dunning-Kruger presidency of a sort that I don't think we've ever had. And like your reference to
00:37:25.460 singing, I mean, the karaoke reminded me of those ghastly performances in the audition phase of
00:37:32.400 American Idol, where you have these people come out who literally cannot sing a note, but for whatever
00:37:39.900 reason and whatever has conspired in their lives to make them think they can, they go out there and
00:37:45.280 humiliate themselves. And they're genuinely astonished that they have failed. They thought
00:37:51.440 they were great singers somehow. I mean, often this is just, seems to be selecting for mentally ill
00:37:57.060 people. But essentially what's happened here is we had an election that was like that, where we have a
00:38:02.860 candidate who could not sing a presidential note, and yet 60 million people leapt to their feet and
00:38:11.340 applauded after he finished. And that's where we are. It's astonishing.
00:38:16.680 I think there's a couple of things going on here. First, I think you, I think you have to separate
00:38:20.120 Trump out from his enablers around him. You know, I've sort of gone back and forth
00:38:25.820 for about a year on how much I think this is a political strategy and how much of it I think that
00:38:30.400 Trump is just genuinely, that the president is just genuinely clueless. And I think, you know,
00:38:36.480 he just lives in a, Senator Burr said it the other day, I mean, you know, where he basically admitted
00:38:42.300 the president sort of constructs his own reality and lives there. And that's, you know, terrifying in
00:38:47.820 and of itself. But then there's that added question of, as you say, 62 million people jumping to their
00:38:53.780 feet and applauding for it. And I think there we have to bring in another word, which is resentment.
00:39:00.080 Um, that, that, that we're now in a politics of, uh, resentment because I, because we live in the age
00:39:08.020 of information where the people that are most privileged, the people that do the best are people
00:39:11.820 who can comprehend the world around them. They can gain an understanding of a certain amount of
00:39:16.940 complexity. They can manipulate information and work in that environment. And the people who can't,
00:39:22.060 who feel left behind by it, um, who are not necessarily poor, by the way.
00:39:27.620 Uh, this is a big myth that this is just, you know, Appalachian, desperate Appalachian opioid addicts
00:39:34.840 praying that Trump will help them. There, there's a lot of people who are doing perfectly well in
00:39:39.160 America who are, you know, cackling over the complete implosion of the government, um, who aren't
00:39:45.680 doing poorly. And I think it's this sense that, you know, the smarty-pantses are finally getting
00:39:50.760 theirs somehow. Um, because this age of information has meant that the world has changed so fast
00:39:58.700 that people feel bewildered and angry by it. And rather than saying, you know, as my parents'
00:40:04.760 generation did, my mom and dad were Depression-era people, um, not educated. My
00:40:10.180 parents were high school dropouts. Uh, but they said, wow, you know, my dad once said, I lived from the
00:40:15.120 model T to the space station. And his assumption was, I will never understand the space station,
00:40:21.700 but I'm glad I live in a country where there are people who do. That's now lost, where, you know,
00:40:27.780 you say, uh, as long as Trump triggers the libtards or angers the college
00:40:35.080 professors or, you know, ticks off, um, the smart people, then, um, then I'm good with it.
00:40:41.900 Then I don't really care if everything burns because then we're all kind of back at the same
00:40:45.820 level. And there's really an ugly social resentment under that, that has been spearheaded by an attack
00:40:52.660 on experts because the, the cynical group of enablers around Trump have convinced ordinary
00:40:58.800 people that anything they don't like in their lives. And I don't mean just, you know, hollowed
00:41:04.620 out towns from globalization. I mean, anything they don't like is the result of some expert
00:41:10.500 giving advice to some elite, uh, and Trump and others use those terms interchangeably, by the
00:41:17.120 way, experts and elites to convince them that it's just a big conspiracy and everyone's out to get
00:41:22.560 them. I mean, it's amazing to me that people who voted and who gained control of all
00:41:29.100 three branches of government and three-fifths of the state houses in America still think they're an
00:41:34.400 embattled minority suffering at the hands of the man somehow. And so because they can't
00:41:41.860 demonstrate that politically, they assume that it's because of secret knowledge that the rest of us have
00:41:47.720 that we're somehow conspiring to control their lives around them, even though they have the political
00:41:53.880 power that they've craved. And, you know, this is why we're also living
00:41:58.800 through a revival of conspiracy theories in America today, unlike any in my lifetime, um, because
00:42:05.440 that's comforting to people. Yeah. Conspiracy theories are fascinating because most of them
00:42:10.680 are structured around a very different sense of expertise. There's this adage, I don't know,
00:42:17.500 maybe you know where this came from. You never described a conspiracy.
00:42:20.960 If you'd like to continue listening to this conversation, you'll need to subscribe at
00:42:27.040 samharris.org. Once you do, you'll get access to all full length episodes of the Making Sense
00:42:31.700 podcast, along with other subscriber only content, including bonus episodes and AMAs and the conversations
00:42:38.440 I've been having on the Waking Up app. The Making Sense podcast is ad free and relies entirely on
00:42:44.000 listener support. And you can subscribe now at samharris.org.