The Peter Attia Drive - September 28, 2020


#130 - Carol Tavris, Ph.D. & Elliot Aronson, Ph.D.: Recognizing and overcoming cognitive dissonance


Episode Stats

Length: 1 hour and 59 minutes
Words per Minute: 150.552
Word Count: 18,028
Sentence Count: 929
Misogynist Sentences: 10
Hate Speech Sentences: 16


Summary

In this episode, Dr. Carol Tavris and Dr. Elliot Aronson discuss their book Mistakes Were Made (But Not by Me): how they came to work together on the project, what led them to write the book, and why they believe that no one is truly immune from the pain of cognitive dissonance.


Transcript

00:00:00.000 Hey everyone, welcome to The Drive podcast. I'm your host, Peter Attia. This podcast,
00:00:15.480 my website and my weekly newsletter all focus on the goal of translating the science of longevity
00:00:19.800 into something accessible for everyone. Our goal is to provide the best content in health
00:00:24.600 and wellness full stop. And we've assembled a great team of analysts to make this happen.
00:00:28.880 If you enjoy this podcast, we've created a membership program that brings you far more
00:00:33.280 in-depth content. If you want to take your knowledge of the space to the next level at
00:00:37.320 the end of this episode, I'll explain what those benefits are. Or if you want to learn more now,
00:00:41.720 head over to peterattiamd.com/subscribe. Now, without further delay, here's
00:00:48.080 today's episode. My guests this week are Carol Tavris and Elliot Aronson. Carol's name may sound
00:00:55.660 familiar to some of you because she was actually a guest on the podcast back in early 2017, along
00:01:01.120 with Avrum Bluming when she and Avrum were on to talk about hormone replacement therapy.
00:01:06.700 In this podcast with Elliot, we talk about something very different, which is a book that they co-authored
00:01:11.800 in 2007, Mistakes Were Made (But Not by Me): Why We Justify Foolish Beliefs, Bad Decisions,
00:01:19.320 and Hurtful Acts. Now, if you've listened to this podcast much, you've probably heard me talk about
00:01:25.500 this book at least once. It's certainly one of my favorite books and one of the books I recommend
00:01:29.960 most to other people. It's important to understand the background of Carol and Elliot to understand how
00:01:35.640 they came together to do this. So Carol received her PhD in social psychology from the University of
00:01:40.460 Michigan. She's a fellow of the Association for Psychological Science. She's received numerous awards
00:01:46.500 for her efforts to promote gender equality, science, and skepticism. And that's something
00:01:51.560 that Carol and I bonded over early. Elliot received his PhD from Stanford in the late 1950s. And it was
00:01:59.160 during his time at Stanford in the 1950s that he trained with the great Leon Festinger, the father
00:02:06.660 of the theory of cognitive dissonance. He went on to teach at Harvard University, the University of
00:02:11.060 Minnesota, University of Texas, and UC Santa Cruz, and eventually back at Stanford. Elliot is the only
00:02:18.600 psychologist to have won the American Psychological Association's highest awards in all three major
00:02:25.660 academic categories: Distinguished Writing in 1973, Distinguished Teaching in 1980,
00:02:32.640 and Distinguished Research in 1999. And in 2002, he was listed among the 100 most eminent psychologists
00:02:39.720 of the 20th century. In this episode, we talk about how Carol and Elliot began to work together on this
00:02:46.780 project, what prompted them to ultimately write the book. And we talk a little bit about how cognitive
00:02:51.940 dissonance shows up in many aspects of our lives, not just in science, but also in politics, in criminal
00:02:58.740 justice, how it has shown up historically, and perhaps most importantly, how we can train ourselves
00:03:05.820 to not be victims of some of the worst aspects of cognitive dissonance and dissonant behavior.
00:03:13.980 And instead, how we can use our understanding of why our minds naturally try to reduce the pain
00:03:21.080 of cognitive dissonance to actually hack it a little bit and try to instead be as intellectually
00:03:26.820 honest as possible. By the end of this episode, you'll understand that there's really no one who is
00:03:31.360 immune from this. Whether you're a doctor, a lawyer, a DA, a mental health worker, a scientist,
00:03:36.820 we all suffer from the pain of cognitive dissonance. And really, the question is, what do we do with
00:03:43.160 that discomfort? Are we able to sit in it or do we succumb to it? I found this to be a fascinating
00:03:48.300 discussion and I hope you do as well. So without further delay, please enjoy my conversation with
00:03:53.440 Carol Tavris and Elliot Aronson. Carol and Elliot, thank you so much for making the time to do this.
00:04:05.740 And the backstory to this is pretty funny. So if you'll both bear with me along with the listeners,
00:04:10.980 I want to explain it. I read Mistakes Were Made But Not By Me for the first of many times circa 2012,
00:04:20.100 maybe 2013. It was love at first sight. And the first thing I did was Google you guys.
00:04:27.740 And somehow I came across Carol's phone number. Or maybe it was an email, but I think it was actually
00:04:33.540 a phone number. And I called and I actually got a hold of Carol. And I don't know if you thought I
00:04:41.400 was a psycho, Carol. You probably did, but you were too kind. But I somehow cajoled you into coming down
00:04:47.380 to San Diego for dinner, which you did. And we had this amazing evening of talking about cognitive
00:04:54.740 dissonance. And it was the beginning of a beautiful friendship. And then two years ago, when I started
00:05:00.260 this podcast, you were one of the first people I reached out to. And I said, Carol, can I please
00:05:05.480 interview you? And you said, Peter, I'm sorry. I just can't talk about it anymore. I've said all I can
00:05:13.480 say about this subject matter. I did? You did. You just said, I just don't have it in me to talk
00:05:20.020 about this anymore. Oh, no. Yeah. And you said, but please don't hold it against me. And I said,
00:05:28.260 I would never hold it against you, Carol. If you ever change your mind, let me know. And so here we
00:05:32.840 are. And then the second point that's noteworthy is, Elliot, this is the first time you and I have got
00:05:37.140 to meet. So Carol has always said to me that you're the brains behind the science, the
00:05:45.020 foundation of this field that we're going to dig into very deeply, which has informed the
00:05:51.580 work you two have collaborated on. And so I think I want to start a little bit with your
00:05:56.420 relationship, your collaborative relationship, which has been, how many years have you guys been
00:06:00.420 collaborating? Oh, boy. Oh, boy. Do we go back to Psychology Today when I was a baby?
00:06:06.460 I was a baby and you were a toddler. It's been close to 50 years, right? Is that right? 1972 or
00:06:17.000 something like that? I think we met at an American Psychological Association convention.
00:06:24.140 Carol was working as an editor of a magazine called Psychology Today, which used to be a good magazine
00:06:33.040 when she was on it. And I had just won some prize for writing a book. The senior editor, they asked
00:06:41.460 Carol to interview me in the hopes that I could do an article for Psychology Today based on the book I
00:06:49.800 wrote, which was called The Social Animal. One of the great social psychology textbooks ever written
00:06:57.520 and deservedly famous. And you had gotten the APA's, I think, Distinguished Writing Award. So I was dispatched
00:07:06.160 to try to get you to write an article for the magazine. And we both shared a love of, not just a love of
00:07:15.140 social psychology, but a love of communicating social psychology to the public in what can only be called
00:07:21.160 English as opposed to jargon. And we even made a movie together, a documentary film. And we've been
00:07:29.200 friends ever since. And this particular project came about partly because I was losing my eyesight
00:07:36.520 and Carol became my eyes, my ears, and my brain. And we collaborated on this; it was really great
00:07:45.560 fun to write this book.
00:07:48.600 It was an interesting harmony, I think, for the two of us as well, because Elliot, as I'm sure we will
00:07:56.420 discuss, took Leon Festinger's theory of cognitive dissonance and made it into a focus on self-justification,
00:08:07.080 and we were sitting around talking about the Iraq war, and how it came to be and why, even though
00:08:14.800 there were no weapons of mass destruction, George Bush was holding on to his determination to believe
00:08:21.680 that it was absolutely the right thing to do. And Elliot said, you know, I think that George Bush was not
00:08:27.760 lying to the American people. I think he was doing what all of us do, which is make a decision and then
00:08:35.240 justify it by cherry picking the evidence to show that we were right in making that decision.
00:08:40.620 And from that conversation, we thought, you know, this is really an important message for people to
00:08:47.180 hear how in so many domains of our lives, the way we think can really get us stuck and make it hard to get
00:08:55.900 out of the mistakes we've made.
00:08:57.780 One little correction, Carol, you said cherry picking, which is true, but that implies consciously
00:09:06.040 cherry picking. And the point that I was trying to make is that the cognitive dissonance reduction
00:09:16.100 is an unconscious process. People don't say, hey, I think I'm going to reduce a little dissonance
00:09:23.980 right now. They just do it. And it flies just below the level of awareness. So that when George Bush
00:09:34.660 was convinced there were weapons of mass destruction, he convinced himself of that,
00:09:40.620 even though the evidence was ambiguous. There was some evidence that indicated that Saddam Hussein
00:09:49.200 did have weapons of mass destruction, and some evidence indicated that he didn't. And he simply
00:09:55.680 was hell-bent on invading Iraq so that he downplayed the importance of the evidence that would have
00:10:05.320 cautioned him not to invade. And I think that is something we all do if we're not careful.
00:10:13.680 And there are certainly some fields in which, I think, the realization that I have done something
00:10:20.880 incorrectly, when new information emerges suggesting as much, becomes very hard
00:10:27.620 to swallow. Probably nowhere near as difficult to swallow as if you're the commander in chief and you've
00:10:33.320 been single-handedly responsible for what followed what would have been the March 2003
00:10:40.160 invasion. But I think what gripped me and what got to me so
00:10:46.600 much, even from the first reading of your book, was: as a doctor, you think about how many times
00:10:53.040 in medicine we do things and then new evidence emerges that maybe that wasn't the right thing to do.
00:10:59.680 And sometimes it's not immediately obvious, right? But it's subtle. It's like, well,
00:11:04.600 we used to nutritionally tell people that this thing was the right thing to eat, but more and more
00:11:10.300 evidence seems to suggest it's not, but you've been telling everybody that thing. So what is the
00:11:16.200 implication? But before we get into these examples, maybe let's give people some real explanations of
00:11:23.080 dissonance. So I think either it was in your book or I've seen this elsewhere,
00:11:27.900 but there's a great example of dissonance: it's a person who knows that
00:11:33.380 smoking is bad for them, but still smokes. How does that person get through the day?
00:11:40.120 How do you describe that tension that must exist in that person?
00:11:43.920 That's our most famous example, Elliot, take it away.
00:11:46.940 I have to say that it's Leon Festinger's example. I was very lucky: I was a student
00:11:56.300 of Leon Festinger just as he was inventing the theory of cognitive dissonance. I was his graduate
00:12:04.840 student and his majordomo, and became his protege and friend, so that I kind of inherited cognitive
00:12:16.040 dissonance theory. And the example he uses is a person who smokes two packs of cigarettes a day.
00:12:25.400 And then the evidence starts becoming clearer and clearer that smoking can cause cancer and other
00:12:32.960 diseases. And what does he do with that? Well, those two cognitions, I am a smart, sensible person
00:12:42.600 and I'm smoking cigarettes, even though I know it causes cancer. Well, the simplest sounding thing
00:12:51.280 to do is to give up smoking. But it's a lot harder to do than people might think, because it is addictive.
00:13:01.500 If a person tries to give up smoking and can't, then how does he reduce that dissonance? And dissonance
00:13:08.900 is a negative drive state. It feels terribly unpleasant, like being extremely hungry or extremely thirsty,
00:13:18.820 but it takes place in the mind. So it makes you very uncomfortable. And if you can't reduce the
00:13:27.340 dissonance by giving up smoking, then you work on the other cognition, that smoking causes cancer.
00:13:33.580 And you might try to convince yourself that it's really mostly correlational evidence and therefore
00:13:39.700 not really definitive. No one has done a controlled experiment with hundreds of thousands of people
00:13:47.720 forcing some to smoke and forcing others not to smoke, which would be, of course, an unethical
00:13:55.240 experiment. But in the mind of the person, that experiment would need to be done before I'd be
00:14:01.920 convinced. Or you could convince yourself that obesity is a health risk. And by smoking two
00:14:09.640 packs a day, I'm keeping myself from eating all of those rich desserts, which would have made me obese,
00:14:17.380 and I probably would die of a heart attack. Or it's debonair to fly in the face of danger and smoke a
00:14:27.740 cigarette like Humphrey Bogart in the movies. As I'm a really exciting person, I would rather live a
00:14:36.260 shorter but more interesting life than one where I was forever being cautious. All of these things,
00:14:43.920 each one of them and all of them together can be used as a way of allowing me to smoke and
00:14:50.960 still feel good about myself. It lets us sleep at night, to use your point, Peter, as well. The
00:14:58.640 ability to reduce dissonance is what allows us to say, I'm doing something stupid, but look, here are
00:15:05.460 all the reasons that I justify it. There was a study not long ago of pregnant women smoking, and pregnant
00:15:12.840 women have a double knowledge of smoking being bad not only for their own health, but for the baby's
00:15:17.940 health. And what did these women say in explaining why they were going to keep smoking? Well, I've cut
00:15:23.600 down. Now the amount of cigarettes I smoke every day isn't really as hazardous as it would have been
00:15:29.520 before I cut down. So this is indeed how we sleep at night. And as Elliot has often said, that's the
00:15:37.760 benefit of our ability to reduce dissonance. But you know what? Sometimes some sleepless nights are
00:15:44.400 called for. Especially if you're the President of the United States making life and death decisions for
00:15:53.660 millions of people. Well, that's what's interesting, right? You've discussed this as an incredibly
00:16:00.220 negatively valenced emotion, right? I mean, I love the way you compared it to the incredible discomfort
00:16:06.780 of starvation or thirst, but it's psychological discomfort of that variety. Because I know so
00:16:14.100 little about psychology, it's amazing to me to understand the timeline of these things. Now,
00:16:19.380 you show up at Stanford in 1955 as a grad student. 55 is about when Leon got there. It's only two years
00:16:25.940 later that this theory is put forth. In the grand scheme of things, that seems relatively recent in my
00:16:33.220 mind. I'm not saying that to discredit that or the field, but what you're describing sounds so
00:16:38.860 fundamental to the way we as humans live that it strikes me as such a major breakthrough.
00:16:46.740 Like, why didn't this happen a hundred years sooner? Was there some other critical piece of insight
00:16:52.100 that was necessary that preceded this amazing observation that took place barely 70 years ago?
00:16:59.480 I think psychologists for a long time had the notion of rationalization, that people often
00:17:06.480 rationalize their own behavior, which is a kind of a pale version of cognitive dissonance theory.
00:17:15.160 And we all knew about that. And that was that, okay, people rationalize. But the genius of Leon
00:17:22.000 Festinger, first of all, the way he really invented the theory was because he was studying
00:17:29.420 rumor transmission. And in India, there was a major earthquake. And a lot of people got killed.
00:17:38.600 And what he learned was rumors spread at the epicenter of the earthquake. And Leon was studying rumors. So
00:17:49.360 he saw that these rumors were very reasonable rumors. They spread, don't worry, help is on the way.
00:17:55.680 People are coming. They're going to rescue us. They're going to bring food. We're starving, but
00:18:01.500 things are going to be okay. Those were the kinds of rumors that made sense. They were comforting rumors.
00:18:08.300 Meantime, there was a city about 15 or 20 miles away, where there wasn't a great deal of damage, but there was
00:18:16.640 enough shaking and enough damage, and enough people got mildly injured, that they were really anxious and really scared.
00:18:24.980 And the rumors that spread in that area were that there was a typhoon coming, that there was going
00:18:33.760 to be a hurricane, that there was going to be huge flooding. And people were really worried about all of
00:18:41.180 that. And Festinger scratched his head and said, why in the world would people spread rumors that would
00:18:48.580 increase their anxiety? And what he arrived at as a strong possibility was that the earthquake made them
00:18:58.240 feel extremely anxious. But they had very little to be anxious about because hardly anyone got hurt and
00:19:08.680 hardly any destruction occurred. So they invented future things that were going to happen and spread rumors
00:19:17.940 about them in order to justify their anxiety. And that was the beginning of cognitive dissonance theory.
00:19:26.720 That's one part of it. The second part of it was that Leon Festinger was a genius as an experimental
00:19:36.760 psychologist, so that he immediately thought up three or four really interesting experiments that went way
00:19:46.220 beyond what anyone ever conceived of in terms of mere rationalization, showing that cognitive dissonance
00:19:55.500 reduction works in ways that are often counterintuitive. It isn't the obvious thing that your grandmother
00:20:03.760 thought about and would tell you about. It happens in ways that are exciting and interesting when you
00:20:12.380 understand the theory and seem completely off the wall if you don't know the theory.
00:20:18.980 I'd love to hear an example of one of those experiments or such. Yeah.
00:20:22.940 I'll give you the best example that I can think of, which is one of the early experiments
00:20:27.520 by Leon Festinger and an undergraduate at Stanford who eventually became my graduate student
00:20:35.840 when I became a professor, a guy named Merrill Carlsmith. And what they showed was that if you pay
00:20:45.060 someone $20 for telling a lie to another person, he knows it's a lie. What he had to tell
00:20:59.060 the other person, who was about to go into the experiment, concerned a tedious task, the kind of task you would
00:21:09.240 be doing if you worked on an assembly line: packing spools, turning a screw half a turn to the right for a couple
00:21:17.380 of hours, which was really tedious. He pretended he had just come out of the experiment doing that.
00:21:25.200 And his job was to tell the participant who was waiting to come in next, that it was really an
00:21:33.520 interesting experiment. And Carlsmith gave him $20 to do that. In another condition, he gave him $1
00:21:43.560 for doing that. And what happened was that the students who were given $1 for doing it actually came to
00:21:53.740 believe that the task was more interesting than the students who were paid $20
00:21:59.060 for doing it. Completely upside down from what would be predicted by the dominant theory of the time,
00:22:08.380 the behaviorist theory of reinforcement, that the more you're paid for something, the more you like it.
00:22:15.840 What cognitive dissonance theory predicts is the less you're paid for it, the more you have
00:22:23.740 to add justifications of your own. So if you're paid $20 for telling a simple lie,
00:22:30.620 you can say to yourself, well, I sold my soul, but $20 is a pretty good price for my soul.
00:22:38.800 But if you're paid only $1, in effect, you're asking yourself, now, why did I do that for a lousy dollar?
00:22:47.960 Well, you know, it wasn't such a big lie, because, you know, the task, on the surface, it looks like a boring task,
00:22:55.060 but it's really a lot more interesting and more intricate than it looks.
00:23:00.920 And they convinced themselves, not that the task was exciting, but it wasn't so bad.
00:23:06.840 So here's the relevant thing about this, Peter, when you were saying about the origins of cognitive dissonance.
00:23:14.260 Psychological science, when I was a graduate student, was almost an oxymoron.
00:23:19.760 Everybody joked about it. Psychology, what are you talking about? It's not a science.
00:23:24.120 It's an oxymoron. It's not research-based. It's not empirically based.
00:23:28.320 When Elliot was first doing his work, the dominant paradigms in psychology were psychoanalytic or behaviorist.
00:23:37.440 Those were the two big schools that were devoted to explaining how human beings operate and how they think and what motivates them.
00:23:47.200 And both of them were past their prime by the middle of the last century.
00:23:52.200 What Elliot was doing in terms of cognitive dissonance was, first of all, looking into the black box of the mind,
00:24:01.500 which behaviorists were ignoring completely.
00:24:04.160 We just have to observe behavior. It's all a matter of rewards and punishments.
00:24:08.980 And saying, no, no, there's something happening in there that affects our behavior most profoundly.
00:24:13.380 And it was also a time of questioning the qualitative observation, the non-scientific observations of the psychoanalytic approach to understanding behavior,
00:24:27.840 which were lively and popular and wrong.
00:24:32.580 So comes cognitive dissonance and the cognitive revolution and the world of psychology changed.
00:24:42.200 The world of psychological science changed.
00:24:45.320 Oh, that's so good. I have to come in because I got an example of the psychoanalytic thing.
00:24:51.360 I came to Stanford in 1956 and I got my PhD in 59.
00:24:58.900 And then I went to teach at Harvard.
00:25:00.780 When I arrived at Harvard, there was a guy named Michael Kahn doing an experiment for his PhD dissertation.
00:25:09.980 He was a really good Freudian psychologist.
00:25:13.760 And he wanted to do a test on one of Freud's notions called catharsis.
00:25:20.580 Catharsis says that when you feel angry at someone, you need to get it out of your system by punching a punching bag
00:25:28.260 or even punching the person who made you angry in the nose, assuming it's someone who's a little smaller than you are, I guess.
00:25:37.780 So he did an experiment like that, trying to demonstrate Freud's theory of catharsis,
00:25:45.160 that acting out on your aggressive feeling is going to make you feel better and less aggressive.
00:25:51.280 And what he found was just the reverse.
00:25:56.740 When a person expressed his anger by getting his tormentor into trouble so that his tormentor actually lost his job as a result of it,
00:26:08.060 this was all in an experiment.
00:26:10.000 That was the scenario: the participant actually believed he was costing the person his job.
00:26:19.280 It actually increased his negative feelings about that guy.
00:26:25.920 And Michael Kahn was really confused and said, how can this be?
00:26:31.540 Not only didn't my experiment prove that catharsis worked, just the opposite happened.
00:26:37.820 How could that possibly be?
00:26:39.720 And somebody around said, oh, we got this new guy just came as an assistant professor, Aronson.
00:26:45.020 I think he might have an answer.
00:26:47.420 And it's exactly the answer is cognitive dissonance.
00:26:51.420 If you make me angry and I retaliate in a way that causes you an extreme thing,
00:27:00.780 like more than the simple act that made me angry, I have to justify it somehow.
00:27:07.880 So the fact that I cost you your job makes me feel dissonant.
00:27:14.160 My God, I really hurt that guy.
00:27:16.300 Well, he must have really deserved it.
00:27:19.020 He's a terrible person anyway.
00:27:21.040 Look at the awful thing he did to me and that needed to be punished.
00:27:26.360 And he'll probably find another job anyway, but he's a jerk and he would have done the
00:27:31.620 same thing to me that I did to him, et cetera.
00:27:35.000 And that really explains the phenomenon of blaming the victim, that if a person gets hurt
00:27:43.420 and we can't account for it, we try to figure out maybe he did something that brought that
00:27:49.540 on.
00:27:50.040 What it allows us to do is say, I am a good, kind, compassionate, smart person.
00:27:57.940 And if you're telling me I did something that wasn't good, kind, compassionate, or smart,
00:28:03.200 I could accept your evidence or I could say that to hell with your evidence in a way that
00:28:09.940 allows me to continue thinking of myself as a good, kind, smart person.
00:28:14.380 That's what Elliot brought to this as the fundamental heart of the reason that we so often reduce
00:28:21.320 dissonance in a way to preserve our self-concept, how we see ourselves.
00:28:27.080 Obviously, the seeds of this are sown, like I said, 60 years ago, but I've seen lectures
00:28:32.600 where people talk about the impact on the actual brain itself structurally and functionally.
00:28:38.900 So if you look at fMRI in a person who's placed in a dissonant situation, how you actually see
00:28:46.100 a change functionally.
00:28:47.140 Do either of you care to comment on how that looks?
00:28:49.800 I have to preface this by saying one of my favorite studies in all the world shows that
00:28:54.820 if you give a lecture and you wave around some fMRIs and you give the exact same lecture
00:29:00.020 without the fMRIs, everybody thinks this first version was really scientific.
00:29:05.360 We just love those brain studies.
00:29:07.480 We do.
00:29:08.380 Well, yes, you are describing studies by Drew Westen, which basically brought people to
00:29:12.980 the laboratory and showed what was going on in their brains when they were confronted
00:29:16.640 with dissonant information about someone from their own political party.
00:29:22.420 If someone from your opposing political party behaves like a corrupt idiot or jerk, that's
00:29:28.040 perfectly consonant for you.
00:29:29.420 People from that party always behave that way.
00:29:31.920 If someone from your own party behaves exactly the same way, well, you know, it's no big deal.
00:29:37.200 All politicians do this and so forth.
00:29:40.180 So he brings people into the laboratory, wires up their brains, and basically finds that in a
00:29:45.880 state of dissonance, these brains were not happy.
00:29:49.820 They were just not happy.
00:29:51.400 But give them a chance to restore consonance and it settles down.
00:29:56.820 Now, I want to say about this that I think this kind of research is
00:30:03.660 certainly very helpful in understanding what efforts our brains go through so that we can live
00:30:12.240 in a state of consonance between what we believe and how we behave every day.
00:30:17.680 It's to our evolutionary benefit to hold beliefs that make us feel part of our tribe, our community,
00:30:25.660 our religion, our group, and so forth.
00:30:28.780 And so there really is a benefit, an evolutionary benefit, to being able to get rid of dissonance
00:30:35.420 when it occurs.
00:30:36.700 But that said, there are enormous psychological and cultural differences in what causes people
00:30:45.260 to feel dissonance.
00:30:47.140 So it's not like everybody always will feel dissonance if, for example, you're a scientist
00:30:53.000 and you get information that your study didn't turn out the way you might have liked.
00:30:57.520 Well, you might feel a pang, but the scientific approach would be to say, well, I've learned
00:31:02.060 something.
00:31:02.480 What can we do next?
00:31:04.500 Likewise, there have been very interesting studies comparing what causes dissonance in
00:31:09.760 Japan or the United States.
00:31:12.020 In the United States, we feel a state of dissonance if we ourselves personally are ashamed or embarrassed
00:31:19.080 or humiliated by something that we've done.
00:31:22.120 In Japan, people are more likely to feel the need to reduce dissonance if they've behaved
00:31:27.080 in a way to harm, hurt, or embarrass other people in their group because they are typically a
00:31:33.200 more other-oriented culture.
00:31:35.620 So while we can appreciate what is universal about dissonance, let's not make the corresponding
00:31:42.940 error that we all automatically and forever behave exactly the same way.
00:31:47.560 Let's unpack that a bit.
00:31:48.320 From a natural selection standpoint, when you go back even just several hundred years, let's
00:31:54.500 call it a thousand years, or even go back further to the point where we were mostly functioning
00:32:00.860 in tribes of relatively small numbers, say even going back to hunter-gatherers, assuming
00:32:07.060 for a moment that we could find sufficient food and shelter and take care of the most fundamental
00:32:11.960 of our needs, what would have been instances in which we would have experienced dissonance?
00:32:18.060 And therefore, why would this have been a conserved feature of our evolution to not stay
00:32:24.940 up at night worrying about things that were tormenting us, that were creating this sort
00:32:31.200 of dissonant or dialectical difference, and instead allowing us to basically placate our little
00:32:37.240 brains, which weren't little by that point, of course.
00:32:39.640 They were basically the same size as they are today.
00:32:41.960 And move forward.
00:32:42.760 I find this fascinating, because it strikes me as sort of a very modern problem, like something
00:32:48.760 that only in the last few thousands of years could this have even been relevant.
00:32:53.180 But of course, I'm just not thinking of the right examples, perhaps.
00:32:57.900 It's our position that cognitive dissonance is hardwired.
00:33:02.520 And the way that happens in terms of evolution is that it has more survival value in the sense
00:33:13.320 that if you are a hunter back 10,000 years ago or 20, 30, 40,000 years ago, and you experience
00:33:24.360 dissonance because you've done something that hurt one of your tribesmen, in a way that makes you
00:33:32.840 concerned that he might retaliate.
00:33:36.200 And you lay awake all night worrying about it, and then the next morning, you get up bright and early to go hunting, and you get pounced on by a tiger because you didn't sleep at night because you were so busy worrying about this thing.
00:33:53.000 If you're not as vigilant as you normally would have been, then chances are your genes are less likely
00:34:00.920 to get into the gene pool.
00:34:04.040 So those who are good at reducing dissonance, those who are good at saying, ah, it wasn't
00:34:10.300 such a bad thing I did, he'll forgive me, I'm sure.
00:34:14.100 And then you sleep soundly, you get up and you do your hunting, and everything is fine.
00:34:18.780 I think that that cognitive dissonance reduction, the ability to reduce dissonance, is hardwired precisely because it has survival value in that sense.
00:34:30.700 Exactly.
00:34:31.360 And as human beings evolved and created new technologies, as agriculture emerged, as economies grew and flourished, people had more beliefs to defend as being the right ones.
00:34:45.680 If you're living in a little band where you have your creation myth of how your people came to be, and you never meet anybody from another band with a different point of view,
00:34:56.740 the day that you do, the day that the next tribe arrives and says, no, no, your God isn't the right God, our God is the right God.
00:35:04.800 Well, now, what are you going to do with that information?
00:35:07.680 What are you going to do with information that your way of planting crops is the wrong way and our way is a better way?
00:35:13.840 This, to me, is one of the most interesting things about dissonance.
00:35:23.100 It's that to the extent that a belief is really deeply important to us, that's when we become most tenacious in holding on to it.
00:35:34.580 It's why, for example, it's not just dumb people who feel the need to reduce dissonance.
00:35:41.600 The greatest danger comes from smart people who refuse to accept the evidence that they have done something foolish or stupid
00:35:49.380 or that they were holding on to a belief or a medical practice long past its shelf life.
00:35:54.400 And now you're saying, I, a smart, competent, professional person who knows more about this subject than anyone in the world,
00:36:01.420 and you're telling me I'm wrong?
00:36:04.280 The hell with you, see?
00:36:06.260 Well, that's the scary part, right?
00:36:07.800 That to me is the part that is, look, just on a personal level, that's what gripped me with reading about this was,
00:36:14.920 wait a minute, how many times am I doing this?
00:36:16.760 Because by definition, the person doing it is generally blind to it.
00:36:20.280 I mean, this is effectively a form of confirmation bias.
00:36:24.180 That's the cherry-picking that you were alluding to, Elliot, the one you pointed out:
00:36:28.520 it's a subconscious type of cherry-picking and confirmation bias.
00:36:33.580 We can talk about it all day long, and any science student worth their salt can define it up and down.
00:36:40.100 But it's how often are we doing it?
00:36:42.160 And the more and more entrenched we get in our fields, the more and more, quote, knowledgeable we become,
00:36:49.720 the more difficult it becomes to walk back from something that you once held dear.
00:36:56.260 And, Peter, my favorite example of this is the prosecuting attorney who worked hard on a case.
00:37:06.520 Let's say it's a murder case.
00:37:08.320 And he sends the person to prison.
00:37:12.020 He gets a conviction.
00:37:13.080 The person goes to prison and is in prison for 25 years.
00:37:18.420 And then some DNA evidence shows up that proves beyond a shadow of a doubt in anyone else's mind
00:37:27.820 that the convicted person was actually innocent of that crime.
00:37:33.360 What happens to the typical prosecuting attorney in that situation is he would feel so bad,
00:37:43.180 so incompetent, so awful, learning that he sent a person away for 25 years when he's really innocent.
00:37:54.200 And then he says to himself, that can't be true.
00:37:58.140 The DNA evidence has to be wrong.
00:38:02.960 And so he keeps him in prison for another 25 years.
00:38:07.700 And he does that not because he's an evil guy, but because he thinks of himself as a good guy and a competent guy,
00:38:19.240 as someone who would never make a terrible mistake like that.
00:38:23.980 Yeah, it's funny you mentioned that, Elliot.
00:38:26.120 One of the things I was thinking about when I read the book, and I made a note to bring it up because I went back and re-skimmed it for the 57th time,
00:38:34.200 was the Amanda Knox case.
00:38:36.800 Do you remember this American girl?
00:38:38.860 I think she was from the Northwest.
00:38:40.540 And she was studying abroad in Italy.
00:38:43.380 She had a roommate, and it was this tragic thing where the roommate, and I think the roommate's boyfriend, were murdered.
00:38:47.800 And it was pretty clear on first pass that Amanda had nothing to do with this.
00:38:53.380 And yet this prosecutor in Italy had a real bee in his bonnet that she was hands down the perpetrator.
00:39:00.200 And then, of course, the DNA evidence emerges because the killer actually went to the bathroom while he was in the house, was my recollection.
00:39:07.020 They basically get his DNA out of the toilet.
00:39:10.280 Nope, it's this other guy who they find.
00:39:12.780 And the prosecutor keeps moving the goalpost.
00:39:15.460 Well, maybe she didn't actually kill her, but she was in cahoots with this guy, though there's no evidence she'd ever met this guy before, or had any motive.
00:39:23.040 I mean, the whole thing got more and more and more ridiculous.
00:39:25.940 Now, in her case, she's lucky.
00:39:27.400 Despite the multiple times she was actually convicted in an Italian court, ultimately it was overturned.
00:39:33.140 But you watch that story unfold.
00:39:35.620 Another example, which I've heard you lecture on, Carol, is the Duke lacrosse one from, God, it's probably 15 years ago now, right?
00:39:44.380 Well, in that case, so many issues played into this.
00:39:47.720 Walk us through that story and how that's another great example of a case study in cognitive dissonance.
00:39:52.120 The examples in all of these cases are what happens when a district attorney, for political reasons, personal advancement reasons, decides that he or she knows who the guilty person is and then just narrows the focus on getting that person convicted
00:40:10.100 and ignoring any disconfirming, dissonant evidence that would throw that assumption into question.
00:40:18.600 The lacrosse case was the guys from the lacrosse team who had hired a stripper for a party at, I guess, their fraternity house, right?
00:40:28.560 And she later claimed that she had been raped.
00:40:30.760 Well, this story just touches so many buttons of race and women and fraternities and how terrible fraternity guys are, and they're all racist rapists anyway, and so forth.
00:40:47.120 Something like 80 faculty members at Duke took out an ad in the school paper about the toxic masculinity of the lacrosse team in particular and men in fraternities in general and so forth.
00:41:00.020 And so they were all backing themselves into a corner, until it turned out that the district attorney was withholding exculpatory evidence from the defense.
00:41:10.700 The district attorney was eventually disbarred.
00:41:13.440 It was a scandalous case, but he saw in this story of race and gender and rape and fraternity brothers a way to really make a name for himself.
00:41:23.500 That's a particularly egregious example, but it's not uncommon.
00:41:27.280 No, these cases are not rare at all.
00:41:30.400 And my favorite example, of course, is the Central Park jogger case, where these black kids actually confessed to the crime and were sent to prison.
00:41:44.360 A fellow named Donald Trump took out full page ads in all of the New York newspapers saying they should be executed, even though some of them were underage.
00:41:56.660 And then it turned out that the DNA evidence never matched between the kids and the semen that was found on the woman.
00:42:05.320 And then some other person who was in prison for a similar crime confessed to it.
00:42:11.040 And sure enough, his DNA evidence did match, and the prosecuting attorney, Linda Fairstein, insists to this day that she was right.
00:42:24.060 And she refused to accept that the convictions had to be overturned; they were eventually vacated by the district attorney who oversaw the case.
00:42:36.860 And the city of New York paid a huge amount of compensation to those kids, who spent several years in prison.
00:42:45.240 And Fairstein said just the typical thing that prosecutors say: we always knew there was a sixth man.
00:42:54.500 And the fact that they never met.
00:42:57.000 Right, exactly.
00:42:58.180 Yeah, we always knew there was a sixth man.
00:43:00.080 That's what they say.
00:43:01.040 Okay, so at the time of the trial, we're only prosecuting this one guy because we know that he was the rapist and the murderer.
00:43:07.880 Oh, well, it wasn't his semen.
00:43:09.680 Oh, well, then there was another guy there.
00:43:11.620 And our guy was just holding the woman down while the other guy actually raped her.
00:43:16.620 The Innocence Project guys call this the unindicted co-ejaculator theory.
00:43:21.920 You know, it's just, you know, after the fact, we can come up with any explanation of why we are still right, even though we're wrong.
00:43:30.600 Now, there's a delicate balance here.
00:43:32.420 And I think the great story that I know you've talked about, because it's hard for you, Carol, you talk about it being one of the times that really challenged you, was the McMartin preschool scandal back in the early 80s.
00:43:43.820 And there's a great quote, I think you said it, Elliot, something to the effect of, we sacrificed our skepticism at the altar of outrage.
00:43:52.160 I love that.
00:43:53.580 And I want to come back to that.
00:43:54.920 But again, let's tell that story, because that's another great example.
00:43:59.240 But what I want to do is tell this story through the lens of how do we think about this in the current era, where accusations are coming in greater and greater numbers.
00:44:10.320 And some of these accusations are going to be true, and some of them are not going to be true.
00:44:16.100 And there are two ends of it.
00:44:18.300 You can be at polar extremes of either of these, which are probably incorrect, but there has to be a rational way to handle it.
00:44:24.020 So maybe thinking about that McMartin preschool story, which in retrospect, you know, I don't remember it.
00:44:31.440 I wasn't really old enough.
00:44:32.460 I went back and read about it.
00:44:34.260 And I've got to be honest with you.
00:44:36.460 On the one hand, it sounds completely idiotic now, but if you were a parent whose kid was going there at the time, you could easily see yourself getting sort of spiraled out of control too.
00:44:48.780 So there's a part of me that actually has quite an amount of empathy for everybody involved.
00:44:53.480 And it just overall seems like a really tragic story.
00:44:56.660 Tell us about that story, specifically your own struggle within it.
00:45:01.960 First of all, this is a very, very important question, because here's what happens.
00:45:08.080 The minute there is a sensational story in the news, somebody is accused of something, for example, what does the public generally do?
00:45:18.120 We jump to a conclusion.
00:45:19.700 Now, what dissonance theory would predict is the minute we make a decision, believe this person or believe the other person,
00:45:47.980 we will now make our belief conform to the evidence we're prepared to hear as things go forward.
00:45:57.380 So this is the danger of the early jumping to a decision, because then as time goes on, that choice, that belief that we have will harden rather than become more open to disconfirming evidence.
00:46:12.600 We will start looking for all the reasons we were right, to believe the accuser or to not believe the accuser,
00:46:18.340 and we will ignore and minimize or trivialize any information suggesting that we were wrong.
00:46:24.620 That is why that first decision is such a crucial one, because again, the more we put into supporting that initial decision,
00:46:34.880 the harder it will be to change our mind.
00:46:36.580 People say this is the slippery slope, but the thing that dissonance theory teaches us is that there's nothing slippery about it.
00:46:43.980 It's our active self-justifications for the beliefs we have that take us down sometimes what turns out to be a wrong path.
00:46:53.760 We mentioned in the update to our book, there's a wonderful YouTube video that Sarah Silverman did
00:46:58.980 that shows the pain of dissonance right there on the YouTube screen, where she talks about her feelings about Louis C.K.
00:47:08.540 when he first admitted to having behaved in these inappropriate ways with women sexually.
00:47:16.000 Fine.
00:47:16.440 What she says in this video is a perfect demonstration of dissonance.
00:47:20.800 She says,
00:47:22.180 He's my dear, good, wonderful friend.
00:47:25.420 I love him.
00:47:27.060 I think he's a wonderful father.
00:47:28.980 I adore this man.
00:47:30.400 And what he did is reprehensible.
00:47:33.720 And what he did was a terrible thing to women.
00:47:37.040 And I want to side with the women whom he offended so deeply.
00:47:41.480 She doesn't resolve it, but she lives with this dissonance, this uncertainty.
00:47:50.140 My case with McMartin was something else.
00:47:55.200 I was in Los Angeles at the time when the mother and her two children who were working at this daycare center were suddenly accused of what turned out to be utterly preposterous assaults on the children in their care.
00:48:12.560 In spite of the fact that for many, many years they had run this daycare center, nobody noticed anything amiss.
00:48:17.580 Parents were walking in and out of the place all the time.
00:48:19.900 But sanity and common sense generally go out the door when you have the words children and sex in the same sentence.
00:48:29.800 Just as you said, somebody comes and tells you, a police officer turns up at your door, as the police did in this case, and says,
00:48:38.220 Peter, there have been some allegations of child molestation going on at your child's daycare.
00:48:45.420 Do you have any information about this?
00:48:47.260 What are you going to say?
00:48:49.420 What are you going to say?
00:48:50.680 Oh, this is likely to be a sex panic?
00:48:52.540 I don't think you will.
00:48:54.420 And I remember it was such big news here in Los Angeles.
00:48:57.820 It was hysterical news.
00:48:59.360 I knew the prosecutor at the time.
00:49:01.240 She was convinced that this was really happening.
00:49:05.460 And there were no researchers at the time doing psychological research on how to interview children or on how children respond to repeated questions.
00:49:15.280 Nobody knew very much about anything.
00:49:18.920 And at the time, there was a developmental psychologist who argued that if you don't ask children leading questions,
00:49:26.700 did the doctor touch you here on your private parts?
00:49:30.420 Did this person do this to you?
00:49:32.280 If you don't ask leading questions, the child won't tell you, for example, that she's been in a medical exam with a doctor who touched her.
00:49:39.700 Well, that seemed to justify the tactics that the social workers and police were doing with these little kids.
00:49:48.180 And I wrote an op-ed for the Los Angeles Times called,
00:49:55.540 Do Children Lie?
00:49:57.900 Not About This.
00:49:59.920 Your title?
00:50:01.120 No, but I've had to live with that.
00:50:06.060 Because in fact, that was the message of the op-ed.
00:50:08.980 The op-ed was about how you really have to ask children leading questions.
00:50:14.440 So, of course, I can say that op-ed was the psychological science as we knew it at that point.
00:50:23.220 But am I embarrassed by it?
00:50:25.460 Oh, you bet I am, especially finding myself at a conference a couple of years later when Stephen Ceci,
00:50:31.900 who became one of the great heroes of the research on this question,
00:50:36.900 reminded everybody, do children lie?
00:50:39.780 Are you kidding?
00:50:41.280 The idea that children never lie could only be said by someone who never was a child or knew a child, for crying out loud.
00:50:47.020 Anyway, Steve used a screenshot of my op-ed as an example of how credulous, foolish, and stupid even our good social psychologists are.
00:51:01.380 Oh, my God.
00:51:01.960 I don't even know if he used the word good social psychologist.
00:51:05.160 No.
00:51:05.760 Well, it was very embarrassing.
00:51:07.180 Okay.
00:51:07.500 So, Peter, to your question, how did I feel? Embarrassed.
00:51:10.860 That's how I felt.
00:51:11.760 That is how I felt.
00:51:12.940 And I was much smarter than you, Carol, on that one.
00:51:15.540 Yes, you were, as usual.
00:51:17.180 As usual.
00:51:19.020 No, no.
00:51:21.040 I'm only kidding, because I felt the same way that you did.
00:51:25.440 I was far away at the time, so I wasn't getting bombarded with the news.
00:51:30.980 But I remember picking up a copy of Newsweek magazine and seeing a picture of Mrs. McMartin sticking her tongue out.
00:51:40.880 And I thought, oh, my God.
00:51:43.480 This molester is mocking the seriousness of this situation.
00:51:50.260 But since I didn't know much about it, I kept out of it.
00:51:53.060 But, of course, she was sticking her tongue out.
00:51:56.380 Some photographer was harassing her, and she stuck her tongue out at the photographer, and that got published in Newsweek.
00:52:03.220 And that's an example of what happens.
00:52:06.540 A person does a normal thing.
00:52:10.180 Gets angry at a photographer for harassing her when she's being falsely accused in the press and in court of having done a heinous crime, which she is innocent of.
00:52:24.020 But if we believe that she's guilty, everything she does begins to look like bizarre, dreadful behavior.
00:52:35.680 Because once we have the rubric of she is guilty of child molestation, then we can't see her as an innocent person being angry at a harassing photographer.
00:52:49.640 Exactly.
00:52:50.120 I want to just highlight that with a thought experiment, Elliot, which is you pick a person, whether it be someone who's accused of a crime or a political figure or something, who you just hold in absolute contempt, and imagine them in 10 different positions, smiling this way, frowning that way, sticking their tongue out, giving you the finger.
00:53:13.540 There is not one of those 10 in which you can't come up with a negative narrative.
00:53:19.500 You look at someone that you deem grotesque, no matter how they present themselves.
00:53:24.760 Even if they're standing there with the slightest smile, you would view it as disgusting.
00:53:30.360 How can they not show more remorse?
00:53:32.520 The slightest smile becomes a smirk.
00:53:35.040 Yes.
00:53:35.820 A frown becomes an admission of guilt.
00:53:38.760 We are so able to color this lens.
00:53:42.100 It's amazing, isn't it?
00:53:43.540 I'd like to add what I think is the important take-home on the McMartin case.
00:53:48.980 McMartin was the first of a wave of hysterical cases across the country in which daycare workers from here to Boston, hundreds of them, were accused of this kind of ritual sexual molestation of children in their care.
00:54:06.180 Hundreds of daycare workers went to prison.
00:54:08.400 Some of them are still in prison.
00:54:10.180 And the lesson of McMartin is not simply that I was wrong or that others were wrong.
00:54:18.080 It's this, and this is what's harder.
00:54:21.520 The first reaction that anybody had to McMartin, we didn't know anything.
00:54:27.380 The public had no way of knowing that the first allegation was made by a woman who was so psychotic, so crazy, known to the police, that they stopped even listening to her after a while.
00:54:39.780 Nobody knew that the police had gone to every parent's home, leading them into a panicky reaction as your child has been sexually molested.
00:54:48.440 And nobody understood how the children were actually being interviewed by social workers who were bringing in those anatomically detailed dolls with prominent genitals and testifying that they knew if a child had been molested based on how the kid was playing with the doll.
00:55:06.280 Well, as you would have said before, you know, those genitals are pretty interesting.
00:55:12.140 No little kid is going to ignore the genitals.
00:55:15.200 But if they did ignore the genitals, it's because they've been sexually abused and traumatized.
00:55:19.480 And if they play with the genitals, it's because they've been sexually abused and traumatized.
00:55:23.320 It took psychological scientists to do a controlled experiment and ask children known to have been sexually molested and children known not to have been sexually molested, give them these dolls, see how the kids play with them.
00:55:40.620 And as you can imagine, there was no difference between these two groups.
00:55:44.900 So the kinds of therapists who were marching into court with no psychological scientific training at all, just their own observations, their hunches and their biases, could testify with assurance that they knew that this kid had been molested.
00:55:59.200 So this is the kind of research that was done in the aftermath of these daycare cases that came to transform how we understand children's testimony so that we can help the children who have been molested, but not destroy the lives of innocent adults either.
00:56:19.660 So the task for us, embarrassed as I was, dissonant as I felt for my own participation in what I wrote about McMartin, I really began atoning for this by writing about what was happening in the other daycare cases across the country, such as the Amirault case in Boston, which was almost a mirror image of McMartin.
00:56:43.560 There's a very subtle difference between always believe the victim versus always be skeptical and take the accusation seriously, right?
00:56:53.800 Those are not the same thing, but they're similar.
00:56:58.360 They can look similar at the outset, but this is a layer of nuance that seems to be missing from a lot of the discussion today, isn't it?
00:57:06.160 Absolutely.
00:57:07.160 And in science, whenever ideology interferes, it can distort the science and it certainly can distort public opinion.
00:57:18.740 So the ideology now, because, and it's understandable, because women, for example, have been ignored for a very long time.
00:57:28.700 So when they talk about sexual harassment now in this climate, the idea is always believe them, always believe them.
00:57:39.320 Because why would a woman come forward if she wasn't telling the truth?
00:57:44.320 But that's an ideological conclusion.
00:57:47.980 What I would substitute for that is: always pay attention, always listen, but keep an open mind and realize that there are probably at least two sides to every story.
00:58:02.880 So be respectful, listen, but keep some skepticism in mind.
00:58:10.020 Now, some of the radical feminists are saying, so it doesn't matter.
00:58:15.340 So what if some guy is falsely accused and falsely loses his job?
00:58:22.480 It's almost as if it serves the people of that gender right because of all the abuses our gender took in the past.
00:58:32.680 And that's never a reasonable position.
00:58:36.820 It's a very, very old one, of course.
00:58:38.840 It's as in the era of recovered memory therapy, which you've written about at length.
00:58:45.800 This has won you a lot of friends, Carol.
00:58:48.240 A lot of friends.
00:58:49.260 Really a lot, a lot of friends.
00:58:51.100 One of my favorite descriptions you ever gave was after you wrote your first lengthy tome on that, you said you got a lot of, how did you describe it?
00:59:00.240 A lot of lovely requests to go and passionately make love to yourself or something like that.
00:59:07.700 May you go forth into the world and multiply by yourself, yes.
00:59:12.960 Oh, yeah.
00:59:14.800 Oh, yeah.
00:59:15.620 Well, you see, any exhortation to believe, just to believe.
00:59:21.300 I mean, I remember a cover of Ms. Magazine many years ago about the alleged satanic ritual abuse cults, which were supposedly afoot all over the country.
00:59:30.400 And people were believing that there were satanic ritual abuse cults that were trapping children and women and so forth.
00:59:37.020 And the cover of Ms. Magazine had one of these satanic images with the cover line, believe it.
00:59:42.180 Believe it? Really? I'm supposed to believe it?
00:59:45.980 So the idea that as women were going into psychotherapy and coming out believing that they had multiple personalities, not just three or five, but 10, 100, 500, there was an escalation of the belief in multiple personality disorder,
01:00:02.420 which was another hysterical epidemic in our country, fomented by many psychoanalysts and psychiatrists and social workers, until malpractice suits brought that bubble right down.
01:00:14.400 But to say, I understand that a person in therapy may be having trouble and may be suffering, but I get to question the explanation that the therapist is giving them; that's not a matter of disrespect.
01:00:28.380 It's a matter of understanding and bringing our best science to bear on understanding.
01:00:33.040 So I have always been skeptical of the believe X group, whatever X group is, uncritically.
01:00:40.860 What's the evidence? What's the best explanation for it?
01:00:44.260 So here is, to me, one of the more interesting ways of understanding many of these very difficult he-said-she-said disagreements and debates, which are so painful and so difficult.
01:00:56.960 And we do understand how many, many thousands of women have been sexually abused in so many ways. But here's the but.
01:01:10.200 If you are bringing up an allegation that occurred six months ago, six years ago, 35 years ago, we're entitled to talk about what we understand about memory,
01:01:24.860 what we understand about the way in which information since the original event changes our memory or changes our interpretation of the event.
01:01:33.560 Events that happened to us 25 years ago, that at the time we thought were benign, we can come to see as being malevolent and traumatic and oppressive.
01:01:44.040 That's all part of how the psychological process works.
01:01:48.260 And what social psychologists like Deborah Davis have shown is that a person doesn't have to be lying to be wrong in making an allegation.
01:02:00.140 And a man doesn't have to be lying, he can be self-justifying in responding to an allegation.
01:02:08.560 This is a different level of understanding that I think is important, especially when charges involve the possibility of ruining someone's life.
01:02:20.340 It doesn't fall into the category of there's always a simple answer and there's always a right or wrong way to see a particular allegation.
01:02:29.500 This point also about memory: probably five or six years ago, after I had read your book for the first time, I read the book by Kathryn Schulz, Being Wrong.
01:02:42.680 And I also was very, I mean, moved is probably the wrong word, but I was struck by how feeble my memory was.
01:02:51.340 Because I've always, I think up until reading that book, I had never questioned that the way I remembered something was correct.
01:02:58.180 Because I have a very good memory for random stupid things, you know, like I can tell you that on, you know, Monday, June 27th, 1988, I did this and it happened this way and that way and this way and that way.
01:03:12.200 And I can be right on some of those things, but I can be surprisingly wrong on other things.
01:03:17.140 And I think reading that book helped me appreciate how much I could be wrong about.
01:03:22.700 And that became really scary to me because it really made me realize that I'm quite fallible to my own BS.
01:03:32.600 I thought I was somehow, I levitated above that.
01:03:36.720 Like, I just thought whatever was in Peter's memory vault had happened.
01:03:40.640 And now to know that that's not true is a little scary, but to your point, it's probably not true for anyone.
01:03:47.240 The common idea is that we have a little tape recorder inside our brain and all we have to do is press the button and it'll all come out.
01:03:57.100 It's wrong because it's not all in there.
01:04:01.140 A lot of it gets confabulated and things get mixed together that don't belong together.
01:04:07.360 So it doesn't have to be any matter of self-justification or any kind of dissonance reduction.
01:04:13.400 It can be just wrong, just randomly wrong and misremembered in some sort of harmless way.
01:04:21.720 As in Carol's famous memory of that book that she's sure her father read to her.
01:04:31.020 I'm handing you the ball.
01:04:33.560 Oh, is that the ball?
01:04:35.800 Oh, how did I miss the ball?
01:04:37.640 Landing right here in my lap.
01:04:39.140 Oh, well, yes.
01:04:40.320 When we were writing our book, I had a vivid, vivid memory of my father reading me James Thurber's The Wonderful O.
01:04:49.220 Oh, it's a, by the way, a wonderful book about pirates who remove the letter O from the alphabet, from speech and from every object.
01:04:58.440 You may keep geese, but not a goose and so forth.
01:05:00.940 Anyway, it's a wonderful, wonderful book.
01:05:02.800 And I remember him reading it to me and our laughing about what names would be like without their O's, Ophelia, Oliver, and so forth.
01:05:09.600 And then, because I like to read that book every so often just to cheer myself up, I go and I see its publication date, which was one year after my father died.
01:05:20.320 It just hit me in the, how could, what?
01:05:24.360 How could that be?
01:05:26.940 And, of course, then that starts you on another trail.
01:05:29.720 Well, who did read it to me?
01:05:30.960 And wait a minute.
01:05:31.760 I was a teenager.
01:05:33.040 Who's reading me a book when I'm a teenager?
01:05:35.200 Or, you know, all of the things that I had made up around that and they were wrong.
01:05:39.680 Which is why I like to think of dissonance theory as an incredibly helpful mechanism of understanding that is a form of arrogance control and certainty control.
01:05:54.520 And if we can learn to have passionate beliefs that give our lives color and meaning, that we live by, but to hold them lightly enough so that if the evidence comes along that our favorite diet is wrong, we can finally say, you know what?
01:06:17.620 I was wrong on that one.
01:06:18.780 Yeah, this idea of doubt versus arrogance is amazing, but society doesn't really reward doubt as much as it rewards arrogance, right?
01:06:28.580 I mean, isn't that part of the challenge where we're, I think, culturally a little bit more likely to find somebody believable?
01:06:37.840 Wouldn't, doesn't a patient want a doctor?
01:06:40.320 Maybe arrogant is too strong a word, but doesn't a patient want a doctor that is, yeah, exactly, much more certain.
01:06:45.800 Confident.
01:06:46.420 Yes, much more confident.
01:06:47.660 It rewards certainty, yeah.
01:06:50.880 And certainty can easily morph into arrogance.
01:06:56.100 I remember once when I was serving as an expert witness in a murder case, I was presenting some psychological data that favored the accused person.
01:07:08.600 I said, all of the things being equal, and the prosecuting attorney leaped on that, saying, oh, sure, all of the things, but in the real world, professor, all other things are hardly ever equal.
01:07:23.860 And of course, as you know, court trials are somewhat theatrical, because they're playing to the jury.
01:07:32.300 And I finally had to turn to the judge and said, that requires an explanation.
01:07:38.660 And the judge was very sympathetic to me, and I finally had to say that all other things being equal is simply the credo of experimental science.
01:07:49.840 It means that you control extraneous variables that could distort the data.
01:07:58.180 Not that it proves your case; it simply removes the random error that can distort the data, so that it's a good thing to hold all other things equal, not a bad thing about science.
01:08:12.340 That's the kind of thing that we're talking about, that I'm not sure why I started that story.
01:08:21.140 Well, no, because, see, this is another interesting thing.
01:08:24.980 It's why many people prefer the pseudosciences, where somebody gives you a certain answer.
01:08:32.100 I know that if you eat this thing, this following health benefit will occur, whereas scientists have the irritating habit of talking in probabilities.
01:08:41.520 It is likely that, it is probable that, in fact, any scientist knows that the people who speak in certainties are not speaking scientifically.
01:08:50.200 That's just not how scientists think, and that's not how they would present their findings.
01:08:54.780 But certainty is what most people want to hear.
01:09:00.380 But, of course, if someone really is certain about something, they have almost certainly frozen their ability to change their minds when they need to.
01:09:09.300 I want to talk about something that you guys have written about as you have your pyramid diagram.
01:09:15.000 I talk about it as sort of a path dependency is the way I kind of describe it to people.
01:09:18.820 And I find this to be, I think, one of the most remarkable explanations that you provide for how two people can start out with a very similar set of beliefs and yet come to these very pivotal moments where a decision is made that sets them on a path.
01:09:40.380 And that path becomes reinforcing and reinforcing and reinforcing and after a long enough period of time, those people are so divergent in their beliefs and in their behaviors and including their view of each other that they seem unrecognizable as though they came from the same species, let alone that they once stood next to each other.
01:10:01.340 Let's go with two examples.
01:10:31.340 All right.
01:10:32.100 This example was actually based on an experiment that had been done many years ago with children.
01:10:37.180 And it's really quite simple.
01:10:39.140 The two students have pretty much the same attitude about cheating.
01:10:42.620 It's not a desirable thing to do.
01:10:45.220 We know it's not a good thing to do.
01:10:46.580 But, you know, it's not the worst sin in the world.
01:10:49.660 And now here they are in their exam and they draw blank on this crucial question.
01:10:54.920 And the crucial question will determine their grade on the exam and in the manner of students, not only on the exam, but in this course and in life.
01:11:03.160 And everything will go south if I don't get an A on this exam and so forth in the way that students often panic.
01:11:09.020 And suddenly they are given the opportunity to cheat as the student next to them, the one with the beautiful handwriting, makes her answers available to them.
01:11:19.760 Now they have a second to decide, cheat or not cheat.
01:11:23.360 One cheats, one doesn't cheat.
01:11:27.560 One gives up integrity for the grade.
01:11:30.600 The other says, no, integrity is too important.
01:11:32.600 I'm not going to look.
01:11:34.480 The minute they make that decision, as I was using this in a way earlier, the minute they make that decision,
they now have to put their behavior and their attitude toward cheating into consonance.
01:11:48.520 So their attitude about cheating will now change to be consonant with the behavior of what they did.
01:11:57.080 So the one who cheated will now modify that view to say cheating isn't such a bad thing.
01:12:02.600 Please, everybody cheats.
01:12:04.320 It's just a victimless crime.
01:12:06.700 Who cares?
01:12:07.220 But the one who resisted cheating will now feel even more strongly that cheating is wrong and unfair and it's not a victimless crime.
01:12:16.660 What about all the people who don't cheat and work hard and learn the material?
01:12:21.500 Over time, as they move along, their attitudes about cheating and their self-justifications for their own behavior will keep reinforcing.
01:12:34.360 And we use the metaphor of the pyramid because they start out at the top side by side.
01:12:39.540 But by the time they have finished justifying one step at a time their own behavior, they stand at the bottom of the pyramid, very far apart from each other.
01:12:50.480 And for me, what I find most illuminating in this metaphor is that it demonstrates how hard it is to go back up.
01:13:00.800 Because now here you are at the bottom and you've spent all this time and energy justifying your decision to cheat or not cheat.
01:13:07.160 How are you going to now go back and say, you know, that first step I took off the pyramid was really the wrong one.
01:13:12.400 So that is the metaphor we use in explaining why people get themselves locked into a belief or a practice.
01:13:22.200 And what that metaphor really illustrates beautifully is that every time a person has to make an important decision, and it's a difficult decision, like the cheating example, he or she is doomed to experience cognitive dissonance.
01:13:41.840 For the student who cheats, the dissonance is, I see myself as a basically honest person, and yet I'm committing a dishonest act.
01:13:55.540 Therefore, to reduce dissonance, that act isn't so very important because everybody would do it.
01:14:01.920 I'd be a fool not to do it.
01:14:03.440 For the person who resists the temptation to cheat, the cognitive dissonance is, I could have gotten a really good grade in this course, and that would have allowed me to go to medical school, and I chose not to do it.
01:14:18.800 Therefore, it would have been a horrible crime if I had cheated.
01:14:23.380 So that you cannot escape cognitive dissonance, no matter which way you choose, and the cognitive dissonance is followed by self-justification, which changes the attitude enormously.
01:14:38.420 Elliot, to borrow your phrase from a moment ago, all things equal, if I took 200 students and looked at 100 of them who elected one path or the other, so just pick whichever path you want to discuss.
01:14:49.300 How much variability is there in the dissonant response amongst that group?
01:14:56.800 So if you take, for example, say the group that decided to cheat and now has to spend the rest of their life justifying, this was not victimless, this has allowed me to get to graduate school, I'm going to have a greater impact on the world, blah, blah, blah, blah, blah, blah.
01:15:09.820 How variable are those students in their responses, and what is the extent to which that is subconscious versus conscious and how it plays out?
01:15:20.320 There's no real answer to that question without doing the experiment.
01:15:23.820 But in the experiment that was done by one of my fellow graduate students at Stanford, a guy named Judson Mills, there was almost no overlap between the final feelings between the kids who cheated and the kids who didn't cheat.
01:15:42.780 Each one deviated from their feelings about cheating a day before they were put in that situation.
01:15:50.640 So there was a little overlap, but it had a major impact on their attitudes about cheating.
01:15:57.640 If I may, it happens that we have an example here in our very own book from the high-achieving, high-pressure Stuyvesant High School in New York City.
01:16:07.340 Seventy-one students were caught exchanging exam answers, and they gave the New York Times reporter a litany of perfect self-justifications, which allowed them to keep seeing themselves as smart students of integrity.
01:16:25.220 One said, it's like, I'll keep my integrity and fail this test?
01:16:28.960 No, no one wants to fail a test.
01:16:31.220 You could study for two hours and get an 80, or you could take a risk and get a 90.
01:16:34.880 He redefined cheating as taking a risk.
01:16:39.260 For others, cheating was a necessary evil.
01:16:42.400 For many, it was helping classmates in need.
01:16:46.380 When one girl finally realized her classmates had been relying on her to write their papers for them, she said,
01:16:51.660 I respect them and think they have integrity, but sometimes the only way you could have gotten there is to kind of botch your ethics.
01:17:03.260 Kind of botch your ethics.
01:17:05.220 There you go.
01:17:06.040 How do they define integrity in this situation?
01:17:09.520 Exactly.
01:17:09.940 So there's another example that's even more distressing, though, and I think this was in your book, but if not, Carol, I know we've discussed it, which is-
01:17:18.600 We'll put it in the next edition.
01:17:20.420 Well, but I think it is actually in there, which is you get to the point where the cop is just planting evidence on people who are clearly innocent.
01:17:29.480 Do you remember?
01:17:29.940 Was this in the book?
01:17:30.580 Oh, yeah.
01:17:31.160 Testilying.
01:17:31.900 There's a term for it.
01:17:32.720 I think the way you walk down this is, look, maybe the first time it happens is they break into the crack house.
01:17:40.360 You know this house is full of crack.
01:17:42.140 There's no dispute.
01:17:43.200 As soon as the battering ram goes through the door, you can see the plume of smoke.
01:17:47.060 You see this one guy running into the bathroom, slamming the door, locking it, and you hear him literally shoveling drugs into the toilet as he flushes it.
01:17:57.400 And just as you get the door open, you see the last swirl of water taking that last gram of cocaine down the toilet, and you are out of luck.
01:18:09.140 And in that situation, when you know, as sure as God made little green apples, that these guys are filthy, you plant that cocaine right on that guy, and you make your arrest, and you finally have your bad guy.
01:18:21.940 And there's another cop there who would refuse to do it, even in that situation.
01:18:27.380 Now, again, those guys are at the top of the pyramid.
01:18:30.080 What can those guys look like 10 years later?
01:18:32.880 That guy who planted that drug in that situation, which seems quite justifiable, right?
01:18:38.600 What is that guy doing 10 years later?
01:18:41.080 Is that Mark Furman?
01:18:42.120 You remember this bozo in the O.J. Simpson trial?
01:18:44.920 Well, I mean, I guess to me, one of the points of these stories is, you probably aren't starting out with the most egregious examples of behaviors that people think about, right?
01:18:55.040 You probably got there stepwise.
01:18:57.260 Well, that's just it.
01:18:58.120 We look at people very often who are at the bottom of the pyramid, and we don't realize that they started at the top.
01:19:03.220 They'd made decisions, very, very small, tiny decisions.
01:19:06.840 Jeb Stuart Magruder, in our book, and how he got enmeshed in the Watergate scandal, it was one step at a time, starting with the smallest, smallest compromises, until he could not get unenmeshed, right?
01:19:18.860 And so it's true.
01:19:20.000 Very often we look at the behavior of people without realizing all the time, effort, and money that they put into justifying their behavior as they got further and further along.
01:19:32.440 So behavior that seems really puzzling to us at a distance makes far more sense when we see it in terms of this one step at a time, which is, you know, one of Elliot's great experiments.
01:19:45.940 Elliot, the initiation experiment.
01:19:47.820 I remember that.
01:19:49.080 Remember that one?
01:19:50.840 Terrific demonstration of this.
01:19:52.620 You want to tell it?
01:19:54.260 No, why don't you tell it, Carol?
01:19:55.820 No, I'm not.
01:19:56.720 We don't even know if Peter wants you to tell it.
01:19:58.540 I would love for one of you to tell it.
01:20:02.440 No, this is the critical point, right?
01:20:07.280 The initiation, this is what I describe as sort of the path dependency, right?
01:20:11.060 It's the switch on the train.
01:20:13.480 And as you said, if you're faced with a hard decision, I think most of us don't appreciate what the consequences of that can be if we take it with the wrong mindset.
01:20:24.680 And this is something I wanted to come to later, which is identity versus behavior, right?
01:20:30.940 So it's sort of like you can say in this moment with these facts, I'm going to make this decision.
01:20:38.940 But I think when you can do that without pinning your identity to that decision and instead just pin your behavior to it, it becomes a lot easier to move off that path when you're encountering new information.
01:20:54.020 But sometimes it's too easy.
01:20:56.520 Let's talk through some of these.
01:20:57.920 Let's talk through this experiment.
01:20:58.840 Let's take the example you just gave of the cop who knows they're guilty.
01:21:04.880 He can see the smoke.
01:21:06.740 He hears the toilet flush and sees all of the evidence going down the toilet.
01:21:11.760 Now, you said you can see how that is justified that he plants the cocaine because he saw he knows they're guilty.
01:21:20.500 And I would suggest that it's never justified, that it's understandable that he might plant the cocaine.
01:21:32.780 But because of what I know about how the human mind works, I also know that once he takes that step of planting cocaine that he didn't actually find there, it makes the next step and the step after that a lot easier.
01:21:52.620 And the next thing you know, there's a totally innocent person who has been framed for a crime he didn't commit.
01:22:00.060 And notice, by the way, the dissonance a police officer will feel for another cop in the room to say, no, no, no, no, this is wrong and don't do this.
01:22:12.680 I mean, we see this is one of the problems in our discussions of police departments.
01:22:17.920 The presence of somebody who is behaving ethically is dissonant to someone who is not behaving ethically.
01:22:25.700 I don't like you.
01:22:26.820 You're reminding me, you're reminding me that I'm doing something that I shouldn't be doing if I were an ethical person.
01:22:34.260 And these guys get ostracized by the culture of most police departments, which is you have to go along, you have to play along.
01:22:43.680 How much do you think this figures into police racism, police brutality?
01:22:48.240 Do you feel that this is a similar starting point as well?
01:22:53.080 Or do you think that racism has far deeper roots that go far outside of conscious choices that are made?
01:22:59.920 I think racism in the police departments is a very complex issue.
01:23:05.280 But it begins with, I think, a false and illusory correlation.
01:23:10.940 Since most street crimes happen in poor neighborhoods, and since a lot of poor neighborhoods consist of ethnic minorities, black and brown people,
01:23:23.140 I think cops have learned that those dangerous neighborhoods with a lot of street crimes are often caused by black people,
01:23:33.920 then they become suspicious of all black people and become quite brutal in their behavior toward them.
01:23:41.780 And again, it's understandable, but not justified, that they behave that way.
01:23:49.680 And in the George Floyd case, it's, again, one step at a time with two of the other cops acting almost as bystanders who are allowing it to happen
01:24:04.320 because the culture is such that they don't want to interfere with the most brutal cop because he seems to know what he's doing
01:24:14.140 and they're afraid of being ostracized by their fellow cops if they interfere with it.
01:24:21.400 And that's that one step at a time, which is exactly how we define how cognitive dissonance can get us into trouble.
01:24:32.520 Because each step you take down that pyramid, you justify it until by the time you reach the bottom, you don't recognize yourself anymore.
01:24:45.740 That's why Jeb Stuart Magruder is a perfect example, because he was a guy with good, high morals,
01:24:55.360 and he saw himself in retrospect being corrupted in the Nixon administration until he ended up doing things that a year earlier he never would have dreamed of doing.
01:25:09.220 But one step at a time, justifying each one along the way, when he finally woke up as he was being sentenced to prison,
01:25:18.380 it was like waking up from a bad dream.
01:25:21.800 This is actually one of the most powerful experiences for me in working with Eliot on this book was finding the stories of people
01:25:30.480 who were able to break out of the cocoon of self-justification that they had spun to protect themselves.
01:25:39.220 and were suddenly able to see themselves and the consequences of the behavior in a clearer light.
01:25:45.780 You see, sometimes that light is obscured for us.
01:25:48.240 I mean, for example, when we talk about systemic racism,
01:25:51.840 which is very hard for most people to understand and experience,
01:25:55.280 because by definition, we experience things as individuals.
01:25:58.420 But there was an important study in Seattle some years ago
01:26:01.460 of the way that the police and the city government were defining what a drug crime was.
01:26:11.540 What kind of drugs do we want to go after?
01:26:14.060 Whom do we want to arrest for what kind of drugs?
01:26:16.860 So the kind of, you know, white kids using cocaine, well, that's okay.
01:26:20.940 That's not a problem.
01:26:21.900 We're not going to have drug busts on white kids using cocaine.
01:26:24.400 But black kids using crack?
01:26:25.980 Oh, let's go for that.
01:26:27.120 So the very definition of what the problem is and whom we want to arrest
01:26:31.520 set up a pattern of the over-arrests of African Americans compared to whites.
01:26:38.400 That's an example of a decision that occurs at the top
01:26:41.100 and that can have very powerful racist consequences,
01:26:44.900 whereas each individual officer is saying,
01:26:47.260 by arresting this guy for crack, I'm not a racist.
01:26:50.580 I'm not personally a racist.
01:26:52.260 And I would say that one other element in our police departments,
01:26:56.620 as we are now learning, is that the culture of brutality is larger than the specifics.
01:27:05.180 Something like 20 to 25 percent of the police officers in the United States
01:27:09.240 have come from the military.
01:27:10.960 That's their training.
01:27:12.120 Many of them are excused from having to go to a police academy
01:27:14.700 if they've had military training.
01:27:16.300 So look how that would shape your point of view
01:27:20.400 about how to control crowds and who the enemy is and so forth.
01:27:26.220 So we have every individual is embedded in a social network,
01:27:30.320 taking their cues from that network
01:27:31.820 and justifying their behavior in order to remain a part of that network.
01:27:36.500 Perhaps the most optimistic part of, I think, your work
01:27:41.380 is the description of people who are able to halt the slide down the pyramid, right?
01:27:46.300 So I want to talk both about some of those stories,
01:27:49.860 but also what are the traits or things that we can do,
01:27:54.880 insertions of cognitive tools that we can use to kind of guard against it?
01:27:59.120 Because I hope nobody comes away from this thinking that this is a condemnation
01:28:04.200 of an individual who's subject to cognitive biases and confirmation biases
01:28:10.480 as they struggle to ameliorate the suffering of cognitive dissonance.
01:28:14.820 I think the point is just the opposite.
01:28:16.480 We're all doing it all the time.
01:28:18.640 And the more we talk about it and the more we think about it,
01:28:23.200 the better shot we have at getting to better answers
01:28:27.960 as opposed to just reiterating bad behaviors
01:28:31.200 because they fit with preconceived notions.
01:28:33.820 So what are some of your, both of your favorite success stories
01:28:37.900 on people who were heading down the path, down the pyramid,
01:28:41.180 and then managed to sort of realize that they were on the wrong side of the pyramid?
01:28:46.800 Me!
01:28:49.940 McMartin, look, I got out of that mess.
01:28:52.080 Okay, Elliot, you're on.
01:28:53.220 Well, my favorite person in our book, I think, is Wayne Hale,
01:28:59.340 the guy who made the go decision on the Columbia disaster at NASA.
01:29:09.900 The Columbia ship exploded and Wayne Hale was the operations officer at the time.
01:29:18.560 He knew these guys.
01:29:20.560 He knew their families.
01:29:21.780 And they all got killed because of his decision.
01:29:27.000 And he wrestled with that one for a while.
01:29:31.500 His first response was, look, no launch is perfect.
01:29:36.240 There are always little problems and you have to learn which ones to ignore.
01:29:41.160 And he literally stopped himself in mid-sentence and said,
01:29:47.360 but, you know, when I look at it, I have to say,
01:29:52.700 at this point, I wish I had been more cautious.
01:29:57.500 The weight of the evidence was to abort the launch just before it started.
01:30:04.240 But I made the wrong decision and I'm dreadfully, dreadfully sorry.
01:30:12.960 And I, when I weigh all of the evidence and all of the reasons why he could have doubled down,
01:30:24.180 I say, my God, what a courageous guy that is.
01:30:27.620 To take the blame, which he deserved, for the death of all those wonderful people.
01:30:35.780 That was a tough one.
01:30:37.620 Yeah, it is.
01:30:39.180 It's immensely powerful.
01:30:40.800 It's, in finding these stories, you see truly the courage it takes,
01:30:45.420 the honesty to face up to it and fess up.
01:30:51.220 And generally speaking, the reaction of those around you is not going to be critical.
01:30:57.900 It's going to be grateful.
01:31:00.500 Wayne Hale did not suffer for his ability to send this email to everybody at NASA saying,
01:31:06.520 look no further for the person responsible for this disaster.
01:31:09.720 I'm the one.
01:31:11.480 And I should say, thank you, Peter, for this question,
01:31:13.880 because I too feel that the more people understand about how cognitive dissonance operates
01:31:20.280 and how we are all susceptible to it, the more helpful it is in understanding our own behavior.
01:31:29.360 So to know that the minute we make a decision, whether it's about a car or a partner or a house
01:31:34.840 or how to live with COVID, any time we make a decision,
01:31:39.560 we're going to be motivated to throw support for that decision and ignore evidence that we're wrong.
01:31:45.160 The moment we understand that, that gives us a whole new toolbox of ways of dealing with our own beliefs and attitudes.
01:31:53.200 And one of the primary tools in that toolbox is the ability to say,
01:31:59.380 I did a stupid thing.
01:32:02.600 I made a stupid mistake.
01:32:05.360 I did something that caused harm.
01:32:09.560 But just because I did something stupid or immoral does not necessarily make me a stupid person or an immoral person.
01:32:18.900 And to be able to say that thoughtfully and meaningfully is a very important thing to do.
01:32:26.360 We have to be able to say, making mistakes is difficult to deal with,
01:32:31.940 but one can do it if we don't ipso facto embrace that as meaning, because I did something stupid,
01:32:40.920 I'm a stupid person. We have to be able to say, I made a stupid mistake, but I'm not a stupid person.
01:32:50.460 What can I learn from having made that mistake?
01:32:54.340 How can I make sure that I don't make a similar mistake like that again?
01:32:58.840 And if that mistake caused harm, how can I make amends?
01:33:03.920 And that's how a person lives a meaningful life.
01:33:08.620 Exactly.
01:33:09.580 We love the story of Shimon Peres, the former Israeli prime minister who was thrown into cognitive dissonance
01:33:18.680 when his good friend Ronald Reagan accepted an invitation to lay a wreath at the Bitburg Cemetery in Germany
01:33:26.100 and some national event.
01:33:28.360 And it turned out that 47 Nazi officers had been buried at this cemetery.
01:33:33.660 And of course, the world was furious at Reagan for accepting this invitation to lay a wreath there.
01:33:39.620 Holocaust survivors and so many others were just outraged, as was Peres.
01:33:44.700 So a reporter said to Peres, so what do you think about your friend Ronald Reagan accepting this invitation to speak at the cemetery?
01:33:52.300 And Peres said, when a friend makes a mistake, the friend remains a friend and the mistake remains a mistake.
01:34:03.560 In this way, he separated the two dissonant cognitions, just as Sarah Silverman was trying to do about her friend Louis C.K.
01:34:14.120 My friend made a mistake, he did something wrong.
01:34:16.820 He remains my friend and what he did remains wrong.
01:34:19.700 When I do something wrong, what I did remains wrong and I still remain a good, kind person.
01:34:28.160 You separate the dissonant cognitions and treat them separately because the usual impulse would be to jump to a decision.
01:34:35.860 I'm done with that friendship or I have to minimize the thing that my friend did.
01:34:41.520 And by being able to separate the two dissonant cognitions more closely and evaluate them,
01:34:46.700 sometimes it requires us to live with the discomfort that we love this person and this person did this awful thing.
01:34:55.440 And it requires self-reflection.
01:34:58.400 What kind of a person am I who has done this particular thing?
01:35:04.640 And serious self-reflection is a lot more difficult than self-justification.
01:35:11.280 The easy route is to leap directly into self-justification.
01:35:17.140 But as we've seen, one step at a time, that can lead us down the primrose path.
01:35:24.320 But self-reflection, I did a stupid thing.
01:35:28.880 Why did I do that?
01:35:30.360 How could I learn something from that?
01:35:32.920 But that's our recommendation for the way to go.
01:35:37.580 In Eliot's absolutely wonderful memoir, Not by Chance Alone, he uses this observation about what drew him to the field of social psychology.
01:35:46.960 And I agree with it, too.
01:35:48.660 He said, clinical psychology, therapy, is about repair.
01:35:54.920 Social psychology is about change.
01:35:57.320 And I think that is really a guiding principle of why we wrote this book, because it's an examination, an indictment of so many institutions here, the therapy world, the criminal justice world, family relationships, so many domains in which, by understanding, we do have the power to change.
01:36:21.420 So what year did the first edition come out?
01:36:22.960 Was it about 06?
01:36:24.600 Seven.
01:36:25.460 Seven.
01:36:25.720 So the world's changed a lot in 13 years.
01:36:29.860 Obviously, you've shared that a big part of your motivation for this was sort of the frustration you had with the Bush-Cheney administration's continued justification for the war in Iraq long after it became quite clear that the reasons that were given to go to war didn't exist.
01:36:50.540 And I also think it's safe to say that even within that administration, there were many different flavors of dissonance.
01:36:56.980 But I think I've read maybe two or three biographies, including the autobiography of George Bush.
01:37:03.860 I find presidential biographies very interesting.
01:37:06.280 And I would agree with your take.
01:37:07.880 I don't think for a moment Bush felt he was pulling the wool over anybody's eyes.
01:37:13.160 I really think he believed this to the essence of his core.
01:37:17.380 But here we are 13 years later and Iraq and Afghanistan and most of the Middle East, quite frankly, is largely forgotten at this point.
01:37:25.780 And not just with what's going on in terms of coronavirus, but I think more broadly speaking, what's going on in terms of populism, what's going on in terms of racism, a greater polarization within our political system.
01:37:39.080 I mean, I think almost anybody would give their left arm to go back to 2007, politically, frankly, when you had two somewhat reasonable parties that kind of behaved like it. It seems a heck of a lot better than when you first wrote the book.
01:37:51.180 Are you more or less optimistic today?
01:37:53.600 And how much do you think cognitive dissonance is factoring into what really looks upsetting and ugly and just very unpleasant with respect to the way we are governed and the way we govern?
01:38:05.920 How much time you got?
01:38:07.260 The problem, I think, is the hard polarization of the parties. In our introduction to the revision,
01:38:17.180 we quote Bob Dole saying, you know, Bill Clinton is my opponent, not my enemy.
01:38:22.580 I mean, how far we have come from the idea of seeing the other party as an opponent and not an enemy, which is quite an explicit tactic.
01:38:32.520 So it's back to your question about identity in a way.
01:38:35.880 One of the curious things that have happened in our country is that political identity has come to have a primary importance for people in ways that it did not at one time.
01:38:48.820 That is, you know, it used to be you'd say, well, would you want your child to marry a fill in the blank, a person of another religion, a person from another city, a person from another ethnicity, a person from a different religion and so forth.
01:39:01.720 Now, the thing that you most don't want your kid to marry is somebody from the other political party.
01:39:08.680 The hostility of that attitude is a sign of the problem of polarization.
01:39:16.400 And what that means is if political parties have become the central part of people's identities, then by definition, it makes it very hard to accept any evidence that somebody from the other party might have a good idea, might be doing the right thing, might be somebody I can listen to.
01:39:35.580 Although the current political scene is worrisome and we can all be pessimistic about it, one of the things that has been most heartening for both of us in writing this book has come from the hundreds upon hundreds of letters from people telling us what they have learned from the book and how it has affected them.
01:39:54.400 Now, you know, authors get these, it changed my life kind of things, but we get stories.
01:39:59.140 We get stories from people who have explained how they have taken an understanding of cognitive dissonance into their own lives with often very interesting results.
01:40:10.480 So, for example, once I heard from a man who told the following story, he said, I've got five siblings and we've been at war, at war, over the legacy of our family inheritance.
01:40:22.320 We've formed two factions. We've been fighting with each other and mediation hasn't helped and nothing has helped and we're estranged from each other and so forth.
01:40:31.460 And then he said, I read your book and I gave it to our mediator and I said, here, I said, give this book to my brothers and they will immediately understand what they're doing wrong.
01:40:43.980 I mean, he didn't quite say it that way. He gave it to the mediator. Here, have my brothers read this book.
01:40:47.660 He said, I got no reply. And then a couple of years later, he said, I picked up your book again and I read it, and I said, oh, incredibly, the words on the page had transformed themselves.
01:41:02.880 And I wrote to the mediator and I said, tell my brothers that I now understand what I have been doing wrong in our discussions.
01:41:12.620 I have been greedy. I have been selfish. I haven't thought enough about how you guys have seen the situation.
01:41:20.180 I'd like to apologize. Can we talk?
01:41:23.660 I love that story and it's a perfect example of what a lot of people discover on reading the book.
01:41:34.340 We all have blind spots and perhaps the ultimate blind spot is the belief that I don't have a blind spot.
01:41:45.200 And if only people would see it my way, then they could arrive at the reasonable solution to any problem.
01:41:54.200 But the fact is what people often discover is that the blind spot is in me.
01:42:02.200 I think you're absolutely right, Carol. I think this all comes back to this idea of I did versus I am.
01:42:08.000 I don't know if either of you had ever met Marsha Linehan.
01:42:11.680 I do. I haven't seen her in a long time, but yes, her work on...
01:42:15.460 DBT. Well, her work on dialectical behavior therapy, which I've become such a fan and such a student of.
01:42:22.480 Because I think that what Marsha and that school of DBT have taught is effectively the exact way you're describing this,
01:42:32.460 which is the more we can distance our identities from these actions,
01:42:37.260 the easier it is to hold and sit in the discomfort of these two things.
01:42:43.220 Again, I think the Silverman-Louis C.K. example is a great one.
01:42:47.140 I do it as a practice every day, by the way.
01:42:49.860 So I do this thing because one of the things I've struggled with historically
01:42:52.720 is I tend to peg my achievement to my identity.
01:42:57.420 So even with something that's as seemingly stupid as shooting my bow and arrow,
01:43:02.500 which I do very often, so I'll go in the back and I'll shoot my bow and arrow.
01:43:07.240 And if I have a good day shooting, I'm going to have a good day, period.
01:43:11.560 And if I shoot poorly, I'm going to have a bad day.
01:43:14.460 And of course, the only way that can be true is if you're so silly as to assign worth to yourself
01:43:20.720 as a result of how you perform.
01:43:22.560 So now, every single day, when I do whatever it is that's my recreation activity,
01:43:29.520 I dictate into my phone for no more than two minutes, three at the most,
01:43:35.180 a lovely discussion to myself separating my performance from my worth.
01:43:41.440 It can go something like this.
01:43:43.380 Hey, Peter, great job today shooting.
01:43:45.860 I mean, it was really amazing you got up there and you shot a 296-32X.
01:43:51.500 That was really good.
01:43:52.360 And you went and did da-da-da-da-da-da, meaning all of these are very good.
01:43:55.240 But I just want to remind you, that doesn't actually make you any better today.
01:43:59.440 You know, you're no better a father.
01:44:01.800 You're no better a husband.
01:44:03.420 You're no better a friend.
01:44:04.380 You're no better a doctor than you were yesterday when you shot very poorly.
01:44:09.460 And this exercise, by doing it out loud and actually sending it to my therapist every single day
01:44:15.260 and knowing she listens to it, has been such a powerful tool for me to uncouple what I do from who I am.
01:44:24.220 Now, that's a silly and small example.
01:44:26.820 But listening to our discussion today, I want to think of bigger and better ways to do that
01:44:31.760 because I do think that in as much as our political identities become our personal identities,
01:44:37.220 we're really hosed.
01:44:38.780 I do fear that as sort of a population, it's going to be very difficult when, as you said,
01:44:45.540 like when whatever the other person says is wrong, no matter what,
01:44:49.360 when no matter the countenance on the other person's face,
01:44:52.440 anything from crying, smiling, laughing will always be colored in the wrong thing.
01:44:59.040 I don't know how you can make progress in thought.
01:45:02.080 If a society can't make progress, right, in some form or manner,
01:45:06.240 whether it be knowledge or insight or thought, I don't know.
01:45:09.300 Does this represent the end of progress?
01:45:12.540 Oh, as soon as you start the end of days, right?
01:45:17.160 Well, I don't mean that in a doomsday scenario, but I mean, like at some point, right,
01:45:20.580 there, I mean, when you think about what it was that enabled radical transformation of society
in the past, I don't know, four or five hundred years, a lot of it had to do with a remarkable progress
01:45:35.180 in thinking, the standardization of formal logic, scientific methodologies.
01:45:42.880 I mean, everything that you've spoken about today, Elliot, on some level is grounded in
01:45:46.520 the ability to do an experiment.
01:45:48.980 Carol, you and I have spoken for hours about our hero, Richard Feynman,
and his very famous Caltech lecture, actually, it might've even been at Cornell.
01:45:57.240 I can't remember if it was at Cornell or Caltech when he gave this lecture that is very easily
01:46:00.580 searched on YouTube.
01:46:02.080 It's beautiful, where he basically explains what an experiment means and, you know, how
01:46:06.820 you know of an idea.
01:46:08.260 And the beauty of rooting for the null hypothesis, which is really hard to do, but it's what we
01:46:14.800 have to do.
01:46:16.180 The Age of Enlightenment was around the 16th century, when people began to do experiments
01:46:24.180 and think scientifically.
01:46:25.680 And every society has to bring its population into the Age of Enlightenment.
01:46:33.600 And I think our educational system, our public educational system has failed us.
01:46:39.880 When you see some of the sort of people on the street being interviewed, I am appalled by
01:46:48.040 how illogical they are in their thinking, many of them, and how irrational they are and
01:46:55.120 how one thought does not proceed from the previous thought.
01:46:59.840 It's appalling.
01:47:01.440 And it seems to me that critical thinking has to be taught in junior high school and high
01:47:07.740 school in this country.
01:47:08.740 People have to learn how to separate bullshit from fact.
01:47:14.320 People have to learn how to trust science rather than off-the-wall exclamations of fact
01:47:22.540 that aren't really factual.
01:47:24.860 And that's all part of our educational system.
01:47:27.980 The democracy, a democracy is not going to work with an uneducated population.
01:47:34.160 And a distrust of the institutions that are the bedrock of that democracy.
01:47:41.140 Science, government, the law.
01:47:43.680 So that's basically my point, which is my fear is we are further in the wrong direction today
01:47:49.900 than where we were at the time you wrote this book.
01:47:53.160 And so my call to you is if you could transmit to that generation that is so critical, right?
01:48:01.260 Those kids that are 10, 11, 12, 13, 15 years old, where they're still in these formative years
01:48:07.600 and we want to impress upon them, you know, for me personally, because I have one of my kids is in
01:48:13.320 that age group, I think nonstop about how to excite her about science and how to, you know, how to look
01:48:21.500 at the world and question everything.
01:48:23.620 Like, why does that thing float in the pool, but that thing sinks?
01:48:26.600 And why does, you know, why is it that the vinegar and the oil separate?
01:48:31.040 Like every single thing you see, you should be starting to think, why is that happening?
01:48:35.720 What is it from your work that I could impart to my daughter who is, you know, 11 years old
01:48:43.060 to, to give her the best shot at having a tool of being the kind of person that is comfortable
01:48:51.700 sitting in discomfort, making these hard decisions and being able to change her mind when the facts
01:49:00.660 call for it?
01:49:01.340 Well, I think the first thing that I would suggest is something that I learned the hard way as a parent
01:49:10.420 and a grandparent, and now I'm learning as a great-grandparent, is the importance of modeling.
01:49:19.260 In the home, you behave in a way that you want your child to behave when your child is your age.
01:49:27.060 And there are no exceptions to that.
01:49:30.600 You just do it.
01:49:31.940 You do it the way you want your child to learn to do it.
01:49:36.460 I think that modeling is a very, very powerful tool.
01:49:41.400 You want to tell your story about when you were first married to Vera and the anger.
My father would get angry.
01:49:48.400 Yeah.
01:49:49.140 You know, that was a big change in you.
01:49:50.720 It really was.
01:49:52.400 I grew up with a father who, my mother and father frequently quarreled.
01:49:57.600 And I remember sitting at the dinner table and we could see it happening because the only time
01:50:04.980 my mother had a chance to get at my father and, and give him a litany of her complaints
01:50:11.220 was at dinnertime.
01:50:13.180 So we'd be sitting at the dinner table.
01:50:16.400 I remember I was maybe 10 or 11 years old and my mother would be complaining and my father
01:50:23.280 would regard that as nagging.
01:50:24.900 And we could see my father beginning to seethe inside.
01:50:29.560 And my older brother looked at me and winked.
01:50:34.000 And what that meant was, we'll give him about 20 seconds before he explodes.
01:50:40.780 And he exploded.
01:50:42.080 And he would slam his knife down, knife and fork down on the table, say, God damn it.
01:50:48.520 Can a guy have a peaceful meal around here?
01:50:50.600 And he'd grab his coat and leave the house and not come back until after my brother and
01:50:55.800 I were asleep.
01:50:56.560 And we hardly ever got to see my father because it would happen often.
01:51:01.260 And that was my model growing up.
01:51:04.000 And soon after I was married, I got married when I was 22 years old to a remarkable woman.
01:51:10.480 And we're still married now, 65 years and counting.
01:51:14.840 But early in the marriage, we had an argument about something.
01:51:18.920 And I remember getting really angry, raising my voice and then walking out the door, slamming
01:51:26.040 the door, going down the steps.
I got halfway down the steps and I suddenly said to myself, where the hell are you going?
01:51:36.000 What are you doing?
01:51:38.000 And I realized at that moment that I was modeling my father's behavior, which I detested.
01:51:47.160 And I slowly walked back up the stairs and apologized.
01:51:52.720 And we talked rationally about the issue that had brought on that explosion.
01:52:01.080 And I gradually taught myself that getting angry, raising my voice was not the best way to deal
01:52:11.960 with it.
01:52:12.140 I was married to a very gentle, I am married to a very gentle, wonderful person who was
01:52:20.140 not accustomed to that kind of thing, and therefore would not tolerate it.
01:52:25.760 And it helped me let go of it.
01:52:28.220 And to use reasonable, rational discussion, stating of feelings, the positive ones, the negative
01:52:37.540 ones, whenever they become apparent.
01:52:41.160 And that became a model for our kids, because the way we relate to each other is now pretty
01:52:49.360 standard in our family.
01:52:51.580 I don't take great credit for that.
01:52:53.820 I really learned it the hard way, but I learned it.
And I taught it in the encounter group world and so forth, the ability to identify a feeling
01:53:04.620 instead of just roaring, not catharsis in that respect.
01:53:09.500 Those are skills to be learned.
01:53:11.520 And Peter, when you said, what can you do to make science itself interesting to your daughter?
01:53:17.880 Remember that as human beings, we think in stories.
01:53:21.380 Storytelling is our way of understanding the world, explaining the world, and making the world
01:53:26.660 interesting.
01:53:27.180 What science does is tell us which stories are better than other stories.
01:53:32.140 And that's its charm, and that's its magic, if you will, and that's its appeal.
01:53:37.260 When science is presented as a series of finding, finding, finding, finding, there's this thing,
and this thing, and this thing, and another thing, it loses its interest and its zip.
01:53:47.940 But when it's told as a story in which the discovery is something that changed us, the story we tell
01:53:54.100 at the beginning of our book of Semmelweis, and his observations about why the women in
01:54:00.560 his hospital were dying of childbed fever, and that maybe it was because his students were
coming from the morgue to the bedside of these women in delivery, having just done
01:54:10.960 autopsies on the women who had died the day before, and thinking, oh, he's carrying something
01:54:15.740 on his hands.
01:54:16.360 That was a story.
01:54:17.820 The Semmelweis story was something that my junior high school teacher told our little
01:54:22.640 science seminar group.
01:54:24.720 And I remember, I was fascinated by the story, but I guess I was a budding social psychologist
01:54:32.780 because what was fascinating to me was not just that Semmelweis had found the reason
01:54:37.340 that the women were dying.
01:54:38.600 They didn't know about germs yet, but he found the reason.
01:54:41.740 Just wash your damn hands, and these women will stop dying.
01:54:45.180 He had the solution before he knew the problem, and that's very interesting.
01:54:49.940 Absolutely.
01:54:50.920 But see, in the story that Mr. Crane told us, what interested me was, why didn't Semmelweis's
fellow doctors say, hey, Ignaz, great explanation.
01:55:03.940 Thank you for explaining why my patients are dying.
01:55:06.380 I can change my practice immediately.
01:55:08.680 You know, this is terrific information.
01:55:11.080 What did they say?
01:55:12.200 They said, oh, piss off, you Hungarian nitwit.
01:55:14.940 I mean, the equivalent of it in 1847.
01:55:17.120 They hadn't read the theory of cognitive dissonance yet.
01:55:20.200 No, they certainly hadn't.
01:55:21.800 I don't think he was ever vindicated in his life.
01:55:23.800 I mean, he died basically in an insane asylum, still believing that no one had effectively
01:55:30.380 come to acknowledge this theory.
01:55:33.060 Absolutely.
01:55:33.920 But you see, it's an amazing story, and all of the elements are in it.
01:55:38.540 The psychological story of his fellow scientists who didn't want to believe him, I mean, which
01:55:42.600 is the story of history, throughout history.
01:55:44.920 Oh, thanks, Galileo.
01:55:46.360 Well, you know, I'm really grateful for your new theory here, right?
01:55:51.420 It presents both the excitement of scientific discovery and the challenges, and the challenges.
01:55:58.100 Well, guys, I want to really thank you for this discussion today.
01:56:01.660 I know it's probably longer than most discussions you guys have on this topic, but as I said at
01:56:07.180 the outset, it's a book that I love.
01:56:09.780 But more importantly, I think it's just a topic that I think everybody needs to spend some
01:56:14.540 time paying attention to.
01:56:16.600 It is a part of everyone's life, whether they are aware of it or not.
01:56:21.700 That's the importance of it, right?
01:56:23.420 In the spirit of what water is to a fish, as David Foster Wallace spoke about in his commencement
01:56:29.680 speech, whether you are conscious or unconscious of this thing, it is with you.
01:56:34.040 So, we are probably better off being conscious of it and having some measure of control and
01:56:40.500 thought around it than we are ignoring it.
01:56:43.200 Peter, it was a pleasure to talk to you and the discussion, if anything, was not long enough.
01:56:49.300 Thank you very much, Peter.
01:56:52.000 Thanks so much, guys.
01:56:52.320 Thanks a million for inviting us.
01:56:54.620 Thank you for listening to this week's episode of The Drive.
01:56:57.260 If you're interested in diving deeper into any topics we discuss, we've created a membership
01:57:01.520 program that allows us to bring you more in-depth, exclusive content without relying
01:57:05.900 on paid ads.
01:57:07.060 It's our goal to ensure members get back much more than the price of the subscription.
01:57:11.560 Now, to that end, membership benefits include a bunch of things.
01:57:14.560 One, totally kick-ass, comprehensive podcast show notes that detail every topic, paper,
01:57:20.020 person, thing we discuss on each episode.
01:57:22.520 The word on the street is, nobody's show notes rival these.
Monthly AMA episodes, or Ask Me Anything episodes.
01:57:31.060 Access to our private podcast feed that allows you to hear everything without having to listen
01:57:36.180 to spiels like this.
01:57:37.780 The Qualies, which are a super short podcast that we release every Tuesday through Friday,
01:57:42.360 highlighting the best questions, topics, and tactics discussed on previous episodes of
01:57:46.580 The Drive.
01:57:47.320 This is a great way to catch up on previous episodes without having to go back and necessarily
listen to every one.
01:57:53.600 Steep discounts on products that I believe in, but for which I'm not getting paid to endorse,
01:57:58.660 and a whole bunch of other benefits that we continue to trickle in as time goes on.
01:58:02.720 If you want to learn more and access these member-only benefits, you can head over to
01:58:06.320 peteratiamd.com forward slash subscribe.
01:58:09.280 You can find me on Twitter, Instagram, and Facebook, all with the ID, peteratiamd.
01:58:16.080 You can also leave us a review on Apple Podcasts or whatever podcast player you listen on.
01:58:21.220 This podcast is for general informational purposes only and does not constitute the
01:58:25.620 practice of medicine, nursing, or other professional healthcare services, including the giving of
01:58:30.920 medical advice.
01:58:32.400 No doctor-patient relationship is formed.
01:58:34.520 The use of this information and the materials linked to this podcast is at the user's own
01:58:39.660 risk.
01:58:40.560 The content on this podcast is not intended to be a substitute for professional medical
01:58:44.940 advice, diagnosis, or treatment.
01:58:47.880 Users should not disregard or delay in obtaining medical advice for any medical condition they
01:58:53.900 have, and they should seek the assistance of their healthcare professionals for any such
01:58:58.580 conditions.
01:58:59.260 Finally, I take conflicts of interest very seriously.
01:59:02.680 For all of my disclosures and the companies I invest in or advise, please visit peteratiamd.com
01:59:09.640 forward slash about where I keep an up-to-date and active list of such companies.