In this episode, Dr. Carol Tavris and Dr. Elliot Aronson discuss their book, Mistakes Were Made (But Not by Me), and how they came to write it together. They discuss what led them to the project and why they believe that no one is truly immune from the pain of cognitive dissonance.
00:08:57.780One little correction, Carol, you said cherry picking, which is true, but that implies consciously
00:09:06.040cherry picking. And the point that I was trying to make is that the cognitive dissonance reduction
00:09:16.100is an unconscious process. People don't say, hey, I think I'm going to reduce a little dissonance
00:09:23.980right now. They just do it. And it flies just below the level of awareness. So that when George Bush
00:09:34.660was convinced there were weapons of mass destruction, he convinced himself of that,
00:09:40.620even though the evidence was ambiguous. There was some evidence that indicated that Saddam Hussein
00:09:49.200did have weapons of mass destruction, and some evidence indicated that he didn't. And he simply
00:09:55.680was hell bent on invading Iraq so that he downplayed the importance of the evidence that would have
00:10:05.320cautioned him not to invade. And I think that is something we all do if we're not careful.
00:10:13.680And there are certainly some fields in which I think the implications of "I have done something
00:10:20.880incorrectly," when new information emerges suggesting exactly that, become very hard
00:10:27.620to swallow. Probably nowhere near as difficult to swallow as if you're the commander in chief and you've
00:10:33.320been single-handedly responsible for what followed that invasion, which I guess would have been March 2003.
00:10:40.160But I think what gripped me and what got to me so
00:10:46.600much, even from the first reading of your book was, as a doctor, you think about how many times
00:10:53.040in medicine we do things and then new evidence emerges that maybe that wasn't the right thing to do.
00:10:59.680And sometimes it's not immediately obvious, right? But it's subtle. It's like, well,
00:11:04.600we used to nutritionally tell people that this thing was the right thing to eat, but more and more
00:11:10.300evidence seems to suggest it's not, but you've been telling everybody that thing. So what is the
00:11:16.200implication? But before we get into these examples, maybe let's give people some real explanations of
00:11:23.080dissonance. So I think you guys, either it was in your book or I've seen this elsewhere,
00:11:27.900but there's a great example of the dissonance. It's like a person who knows that
00:11:33.380smoking is bad for them, but still smokes. How does that person get through the day? What is that?
00:11:40.120What, how do you describe that tension that must exist in that person?
00:11:43.920That's our most famous example, Elliot, take it away.
00:11:46.940I have to say that it's Leon Festinger's example. I was very lucky, I was a student
00:11:56.300of Leon Festinger, just as he was inventing the theory of cognitive dissonance. I was his graduate
00:12:04.840student and his majordomo, and became his protégé and friend, so that I kind of inherited cognitive
00:12:16.040dissonance theory. And the example he uses is a person who smokes two packs of cigarettes a day.
00:12:25.400And then the evidence starts becoming clearer and clearer that smoking can cause cancer and other
00:12:32.960diseases. And what does he do with that? Well, those two cognitions, I am a smart, sensible person
00:12:42.600and I'm smoking cigarettes, even though I know it causes cancer. Well, the simplest sounding thing
00:12:51.280to do is to give up smoking. But it's a lot harder to do than people might think, because it is addictive.
00:13:01.500If a person tries to give up smoking and can't, then how does he reduce that dissonance? And dissonance
00:13:08.900is a negative drive state. It feels terribly unpleasant, like being extremely hungry or extremely thirsty,
00:13:18.820but it takes place in the mind. So it makes you very uncomfortable. And if you can't reduce the
00:13:27.340dissonance by giving up smoking, then you work on the other cognition, that smoking causes cancer.
00:13:33.580And you might try to convince yourself that it's really mostly correlational evidence and therefore
00:13:39.700not really definitive. No one has done a controlled experiment with hundreds of thousands of people
00:13:47.720forcing some to smoke and forcing others not to smoke, which would be, of course, an unethical
00:13:55.240experiment. But in the mind of the person, that experiment would need to be done before I'd be
00:14:01.920convinced. Or you could convince yourself that obesity is a health risk. And by smoking two
00:14:09.640packs a day, I'm keeping myself from eating all of those rich desserts, which would have made me obese,
00:14:17.380and I probably would die of a heart attack. Or it's debonair to fly in the face of danger and smoke a
00:14:27.740cigarette like Humphrey Bogart in the movies. Or: I'm a really exciting person, I would rather live a
00:14:36.260shorter but more interesting life than one where I was forever being cautious. All of these things,
00:14:43.920each one of them, and all of them together, can be used as a way of allowing me to smoke and
00:14:50.960still feel good about myself. It lets us sleep at night, to use your point, Peter, as well. The
00:14:58.640ability to reduce dissonance is what allows us to say, I'm doing something stupid, but look, here are
00:15:05.460all the reasons that I justify it. There was a study not long ago of pregnant women smoking, and pregnant
00:15:12.840women have a double knowledge of smoking being bad not only for their own health, but for the baby's
00:15:17.940health. And what did these women say in explaining why they were going to keep smoking? Well, I've cut
00:15:23.600down. Now the number of cigarettes I smoke every day isn't really as hazardous as it would have been
00:15:29.520before I cut down. So this is indeed how we sleep at night. And as Elliot has often said, that's the
00:15:37.760benefit of our ability to reduce dissonance. But you know what? Sometimes some sleepless nights are
00:15:44.400called for. Especially if you're the President of the United States making life and death decisions for
00:15:53.660millions of people. Well, that's what's interesting, right? You've discussed this as an incredibly
00:16:00.220negatively valenced emotion, right? I mean, I love the way you compared it to the incredible discomfort
00:16:06.780of starvation or thirst, but it's psychological discomfort of that variety. Because I know so
00:16:14.100little about psychology, it's amazing to me to understand the timeline of these things. Now,
00:16:19.380you show up at Stanford in 1955 as a grad student. 55 is about when Leon got there. It's only two years
00:16:25.940later that this theory is put forth. In the grand scheme of things, that seems relatively recent in my
00:16:33.220mind. I'm not saying that to discredit that or the field, but what you're describing sounds so
00:16:38.860fundamental to the way we as humans live that it strikes me as such a major breakthrough.
00:16:46.740Like, why didn't this happen a hundred years sooner? Was there some other critical piece of insight
00:16:52.100that was necessary that preceded this amazing observation that took place barely 70 years ago?
00:16:59.480I think psychologists for a long time had the notion of rationalization, that people often
00:17:06.480rationalize their own behavior, which is a kind of a pale version of cognitive dissonance theory.
00:17:15.160And we all knew about that. And that was that, okay, people rationalize. But the genius of Leon
00:17:22.000Festinger, first of all, the way he really invented the theory was because he was studying
00:17:29.420rumor transmission. And in India, there was a major earthquake. And a lot of people got killed.
00:17:38.600And what he learned was rumors spread at the epicenter of the earthquake. And Leon was studying rumors. So
00:17:49.360he saw that these rumors were very reasonable rumors. They spread, don't worry, help is on the way.
00:17:55.680People are coming. They're going to rescue us. They're going to bring food. We're starving, but
00:18:01.500things are going to be okay. Those were the kinds of rumors that made sense. They were comforting rumors.
00:18:08.300Meantime, there was a city about 15 or 20 miles away, where there wasn't a great deal of damage, but there was
00:18:16.640enough shaking and enough damage, and enough people got mildly injured, that they were really anxious and really scared.
00:18:24.980And the rumors that spread in that area were that there was a typhoon coming, that there was going
00:18:33.760to be a hurricane, that there was going to be huge flooding. And people were really worried about all of
00:18:41.180that. And Festinger scratched his head and said, why in the world would people spread rumors that would
00:18:48.580increase their anxiety? And what he arrived at as a strong possibility was that the earthquake made them
00:18:58.240feel extremely anxious. But they had very little to be anxious about because hardly anyone got hurt and
00:19:08.680hardly any destruction occurred. So they invented future things that were going to happen and spread rumors
00:19:17.940about them in order to justify their anxiety. And that was the beginning of cognitive dissonance theory.
00:19:26.720That's one part of it. The second part of it was that Leon Festinger was a genius as an experimental
00:19:36.760psychologist, so that he immediately thought up three or four really interesting experiments that went way
00:19:46.220beyond what anyone ever conceived of in terms of mere rationalization, showing that cognitive dissonance
00:19:55.500reduction works in ways that are often counterintuitive. It isn't the obvious thing that your grandmother
00:20:03.760thought about and would tell you about. It happens in ways that are exciting and interesting when you
00:20:12.380understand the theory and seem completely off the wall if you don't know the theory.
00:20:18.980I'd love to hear an example of one of those experiments or such. Yeah.
00:20:22.940I'll give you the best example that I can think of, which is one of the early experiments
00:20:27.520by Leon Festinger and an undergraduate at Stanford who eventually became my graduate student
00:20:35.840when I became a professor, a guy named Merrill Carlsmith. And what they showed was if you pay
00:20:45.060someone $20 for telling a lie to another person, he knows it's a lie. What he had to tell
00:20:59.060the other person, who was about to go into the experiment, concerned a tedious task, like the kind of task that you would
00:21:09.240be doing if you worked on an assembly line, packing spools, turning a screw half a turn to the right for a couple
00:21:17.380of hours, which was really tedious. But he pretended he had just come out of the experiment doing that.
00:21:25.200And his job was to tell the participant who was waiting to come in next, that it was really an
00:21:33.520interesting experiment. And Carlsmith gave him $20 to do that. In another condition, he gave him $1
00:21:43.560for doing that. And what happened was that the students who were given $1 for doing it actually came to
00:21:53.740believe that the task was more interesting than the students who were paid $20
00:21:59.060for doing it. Completely upside down from what would be predicted by the dominant theory of the time,
00:22:08.380the behaviorist theory of reinforcement, that the more you're paid for something, the more you like it.
00:22:15.840What cognitive dissonance theory predicts is the less you're paid for it, the more you have
00:22:23.740to add justifications of your own. So if you're paid $20 for telling a simple lie,
00:22:30.620you can say to yourself, well, I sold my soul, but $20 is a pretty good price for my soul.
00:22:38.800But if you're paid only $1, in effect, you're asking yourself, now, why did I do that for a lousy dollar?
00:22:47.960Well, you know, it wasn't such a big lie, because, you know, the task, on the surface, it looks like a boring task,
00:22:55.060but it's really a lot more interesting and more intricate than it looks.
00:23:00.920And they convinced themselves, not that the task was exciting, but it wasn't so bad.
00:23:06.840So here's the relevant thing about this, Peter, when you were saying about the origins of cognitive dissonance.
00:23:14.260Psychological science, when I was a graduate student, was almost an oxymoron.
00:23:19.760Everybody joked about it. Psychology, what are you talking about? It's not a science.
00:23:24.120It's an oxymoron. It's not research-based. It's not empirically based.
00:23:28.320When Elliot was first doing his work, the dominant paradigms in psychology were psychoanalytic or behaviorist.
00:23:37.440Those were the two big schools that were devoted to explaining how human beings operate and how they think and what motivates them.
00:23:47.200And both of them were past their prime by the middle of the last century.
00:23:52.200What Elliot was doing in terms of cognitive dissonance was, first of all, looking into the black box of the mind,
00:24:01.500which behaviorists were ignoring completely.
00:24:04.160We just have to observe behavior. It's all a matter of rewards and punishments.
00:24:08.980And saying, no, no, there's something happening in there that affects our behavior most profoundly.
00:24:13.380And it was also a time of questioning the qualitative, non-scientific observations of the psychoanalytic approach to understanding behavior,
00:24:27.840which were lively and popular and wrong.
00:24:32.580So comes cognitive dissonance and the cognitive revolution and the world of psychology changed.
00:24:42.200The world of psychological science changed.
00:24:45.320Oh, that's so good. I have to come in because I got an example of the psychoanalytic thing.
00:24:51.360I came to Stanford in 1956 and I got my PhD in '59.
00:32:42.760I find this fascinating, because it strikes me as sort of a very modern problem, like something
00:32:48.760that only in the last few thousand years could this have even been relevant.
00:32:53.180But of course, I'm just not thinking of the right examples, perhaps.
00:32:57.900It's our position that cognitive dissonance is hardwired.
00:33:02.520And the way that happens in terms of evolution is that it has more survival value in the sense
00:33:13.320that if you are a hunter back 10,000 years ago or 20, 30, 40,000 years ago, and you experience
00:33:24.360dissonance because you've done something that hurt one of your tribesmen, in a way that makes you
00:33:32.840concerned that he might retaliate.
00:33:36.200And you lie awake all night worrying about it, and then the next morning, you get up bright and early to go hunting, and you get pounced on by a tiger because you didn't sleep because you were so busy worrying about this thing.
00:33:53.000You're not as vigilant as you normally would have been, then chances are your genes are less likely to be passed on.
00:34:04.040So those who are good at reducing dissonance, those who are good at saying, ah, it wasn't
00:34:10.300such a bad thing I did, he'll forgive me, I'm sure.
00:34:14.100And then you sleep soundly, you get up and you do your hunting, and everything is fine.
00:34:18.780I think that that cognitive dissonance reduction, the ability to reduce dissonance, is hardwired precisely because it has survival value in that sense.
00:34:31.360And as human beings evolved and created new technologies, as agriculture emerged, as economies grew and flourished, people had more beliefs to defend as being the right ones.
00:34:45.680If you're living in a little band where you have your creation myth of how your people came to be, and you never meet anybody from another band with a different point of view,
00:34:56.740the day that you do, the day that the next tribe arrives and says, no, no, your God isn't the right God, our God is the right God.
00:35:04.800Well, now, what are you going to do with that information?
00:35:07.680What are you going to do with information that your way of planting crops is the wrong way and our way is a better way?
00:35:13.840This, to me, is one of the most interesting things about dissonance:
00:35:23.100it's to the extent that a belief is really deeply important to us that we become most tenacious in holding on to it.
00:35:34.580It's why, for example, it's not just dumb people who feel the need to reduce dissonance.
00:35:41.600The greatest danger comes from smart people who refuse to accept the evidence that they have done something foolish or stupid
00:35:49.380or that they were holding on to a belief or a medical practice long past its shelf life.
00:35:54.400And now you're saying, I, a smart, competent, professional person who knows more about this subject than anyone in the world,
00:38:02.960And so he keeps him in prison for another 25 years.
00:38:07.700And he does that not because he's an evil guy, but because he thinks of himself as a good guy and a competent guy,
00:38:19.240as someone who would never make a terrible mistake like that.
00:38:23.980Yeah, it's funny you mentioned that, Elliot.
00:38:26.120One of the things I was thinking about when I read the book, and I made a note to bring it up because I went back and re-skimmed it for the 57th time,
00:38:43.380had a roommate, and it was this tragic thing where the roommate and, I think, the roommate's boyfriend were murdered.
00:38:47.800And it was pretty clear on first pass that Amanda had nothing to do with this.
00:38:53.380And yet this prosecutor in Italy had a real bee in his bonnet that she was hands down the perpetrator.
00:39:00.200And then, of course, the DNA evidence emerges because the killer actually went to the bathroom while he was in the house, was my recollection.
00:39:07.020They basically get his DNA out of the toilet.
00:39:10.280Nope, it's this other guy who they find.
00:39:12.780And the prosecutor keeps moving the goalpost.
00:39:15.460Well, maybe she didn't actually kill him, but she was in cahoots with this guy, though there's no evidence she'd ever met this guy before, or any motive.
00:39:23.040I mean, the whole thing got more and more and more ridiculous.
00:39:35.620Another example, which I've heard you lecture on, Carol, is the Duke lacrosse one from, God, it's probably 15 years ago now, right?
00:39:44.380Well, in that case, so many issues played into this.
00:39:47.720Walk us through that story and how that's another great example of a case study in cognitive dissonance.
00:39:52.120The examples in all of these cases are what happens when a district attorney, for political reasons, personal advancement reasons, decides that he or she knows who the guilty person is and then just narrows the focus on getting that person convicted
00:40:10.100and ignores any disconfirming, dissonant evidence that would throw that assumption into question.
00:40:18.600The lacrosse case was the guys from the lacrosse team who hired a stripper for a party at, I guess, their fraternity house, right?
00:40:28.560And she later claimed that she had been raped.
00:40:30.760Well, this story touched so many buttons: race and women and fraternities and how terrible fraternity guys are, they're all racist rapists anyway, and so forth.
00:40:47.120I don't know, something like 80 faculty members at Duke took out an ad in the school paper about the toxic masculinity of the lacrosse team in particular and men in fraternities in general and so forth.
00:41:00.020And so they were all backing themselves into a corner until it turned out that the district attorney was not giving the exculpatory evidence to the defense.
00:41:10.700The district attorney was eventually disbarred.
00:41:13.440It was a scandalous case, but he saw in this story of race and gender and rape and fraternity brothers a way to really make a name for himself.
00:41:23.500That's a particularly egregious example, but it's not uncommon.
00:41:30.400And my favorite example, of course, is the Central Park Jogger case, where these black kids actually confessed to the crime and were sent to prison.
00:41:44.360A fellow named Donald Trump took out full-page ads in all of the New York newspapers saying they should be executed, even though some of them were underage.
00:41:56.660And then it turned out that the DNA evidence from the kids never matched the semen that was found on the woman.
00:42:05.320And then some other person who was in prison for a similar crime confessed to it.
00:42:11.040And sure enough, his DNA evidence did match, and the prosecuting attorney, Linda Fairstein, insists to this day that she was right.
00:42:24.060And she refused to accept that the convictions had to be overturned; they were eventually vacated by the district attorney who oversaw the case.
00:42:36.860And the city of New York paid a huge amount in compensation to those kids, who spent several years in prison.
00:42:45.240And Fairstein said just the typical thing that prosecutors say: we always knew there was a sixth man.
00:43:32.420And I think the great story, one that I know was hard for you, Carol, because you've talked about it being one of the times that really challenged you, was the McMartin preschool scandal back in the early 80s.
00:43:43.820And there's a great quote, I think you said about Elliot, something to the effect of, we sacrificed our skepticism at the altar of outrage.
00:43:54.920But again, let's tell that story, because that's another great example.
00:43:59.240But what I want to do is tell this story through the lens of how we think about this in the current era, where accusations are coming in greater and greater numbers.
00:44:10.320And some of these accusations are going to be true, and some of them are not going to be true.
00:44:36.460So on the one hand, it sounds completely idiotic now, but then at the time I can say, but if you were a parent whose kid was going there, you could easily see yourself getting sort of spiraled out of control too.
00:44:48.780So there's a part of me that actually has quite an amount of empathy for everybody involved.
00:44:53.480And it just overall seems like a really tragic story.
00:44:56.660Tell us about that story, specifically your own struggle within it.
00:45:01.960First of all, this is a very, very important question, because here's what happens.
00:45:08.080The minute there is a sensational story in the news, somebody is accused of something, for example, what does the public generally do?
00:45:19.700Now, what dissonance theory would predict is the minute we make a decision, believe this person or believe the other person,
00:45:47.980we will now make our belief conform to the evidence we're prepared to hear as things go forward.
00:45:57.380So this is the danger of the early jumping to a decision, because then as time goes on, that choice, that belief that we have will harden rather than become more open to disconfirming evidence.
00:46:12.600We will start looking for all the reasons we were right, to believe the accuser or to not believe the accuser,
00:46:18.340and we will ignore and minimize or trivialize any information suggesting that we were wrong.
00:46:24.620That is why that first decision is such a crucial one, because again, the more we put into supporting that initial decision,
00:46:34.880the harder it will be to change our mind.
00:46:36.580People say this is the slippery slope, but the thing that dissonance theory teaches us is that there's nothing slippery about it.
00:46:43.980It's our active self-justifications for the beliefs we have that take us down sometimes what turns out to be a wrong path.
00:46:53.760We mentioned in the update to our book, there's a wonderful YouTube video that Sarah Silverman did
00:46:58.980that shows the pain of dissonance right there on the screen, where she talks about her feelings about Louis C.K.
00:47:08.540when he first admitted to having behaved in these sexually inappropriate ways with women.
00:47:33.720And what he did was a terrible thing to women.
00:47:37.040And I want to side with the women whom he offended so deeply.
00:47:41.480She doesn't answer it, but she lives with this dissonance of this uncertainty.
00:47:50.140My case with McMartin was something else.
00:47:55.200I was in Los Angeles at the time when the mother and her two children who were working at this daycare center were suddenly accused of what turned out to be utterly preposterous assaults on the children in their care.
00:48:12.560In spite of the fact that for many, many years they had run this daycare center, nobody noticed anything amiss.
00:48:17.580Parents were walking in and out of the place all the time.
00:48:19.900But sanity and common sense generally go out the door when the words children and sex appear in the same sentence.
00:48:29.800Just as you said, somebody comes and tells you, a police officer turns up at your door, as the police did in this case, and says,
00:48:38.220Peter, there have been some allegations of child molestation going on at your child's daycare.
00:48:45.420Do you have any information about this?
00:49:01.240She was convinced that this was really happening.
00:49:05.460And there were no researchers at the time doing psychological research on how to interview children or on how children respond to repeated questions.
00:49:32.280If you don't ask leading questions, the child won't tell you, for example, that she's been in a medical exam with a doctor who touched her.
00:49:39.700Well, that seemed to justify the tactics that the social workers and police were using with these little kids.
00:49:48.180And I wrote an op-ed for the Los Angeles Times called,
00:52:10.180Gets angry at a photographer for harassing her when she's being falsely accused in the press and in court of having done a heinous crime, which she is innocent of.
00:52:24.020But if we believe that she's guilty, everything she does begins to look like bizarre, dreadful behavior.
00:52:35.680Because once we have the rubric of she is guilty of child molestation, then we can't see her as an innocent person being angry at a harassing photographer.
00:52:50.120I want to just highlight that with a thought experiment, Elliot, which is you pick a person, whether it be someone who's accused of a crime or a political figure or something, who you just hold in absolute contempt, and imagine them in 10 different positions, smiling this way, frowning that way, sticking their tongue out, giving you the finger.
00:53:13.540There is not one of those 10 in which you can't come up with a negative narrative.
00:53:19.500You look at someone that you deem grotesque, no matter how they present themselves.
00:53:24.760Even if they're standing there with the slightest smile, you would view it as disgusting.
00:53:43.540I'd like to add what I think is the important take-home on the McMartin case.
00:53:48.980McMartin was the first of a wave of hysterical cases across the country, in which daycare workers from here to Boston, hundreds of them, were accused of this kind of ritual sexual molestation of children in their care.
00:54:06.180Hundreds of daycare workers went to prison.
00:54:21.520At the time of the first reactions to McMartin, we didn't know anything.
00:54:27.380The public had no way of knowing that the first allegation was made by a woman who was so psychotic, so crazy, known to the police, that they stopped even listening to her after a while.
00:54:39.780Nobody knew that the police had gone to every parent's home, leading them into a panicky reaction that your child has been sexually molested.
00:54:48.440And nobody understood how the children were actually being interviewed by social workers who were bringing in those anatomically detailed dolls with prominent genitals and testifying that they knew if a child had been molested based on how the kid was playing with the doll.
00:55:06.280Well, as you would have said before, you know, those genitals are pretty interesting.
00:55:12.140No little kid is going to ignore the genitals.
00:55:15.200But if they did ignore the genitals, it's because they've been sexually abused and traumatized.
00:55:19.480And if they play with the genitals, it's because they've been sexually abused and traumatized.
00:55:23.320It took psychological scientists to do a controlled experiment and ask children known to have been sexually molested and children known not to have been sexually molested, give them these dolls, see how the kids play with them.
00:55:40.620And as you can imagine, there was no difference between these two groups.
00:55:44.900So the kinds of therapists who were marching into court with no psychological scientific training at all, just their own observations, their hunches and their biases, could testify with assurance that they knew that this kid had been molested.
00:55:59.200So this is the kind of research that was done in the aftermath of these daycare cases that came to transform how we understand children's testimony so that we can help the children who have been molested, but not destroy the lives of innocent adults either.
00:56:19.660So the task for us, embarrassed as I was, dissonant as I felt for my own participation in what I wrote about McMartin, I really began atoning for this by writing about what was happening in the other daycare cases across the country, such as the Amirault case in Boston, which was almost a mirror image of McMartin.
00:56:43.560There's a very subtle difference between always believe the victim versus always be skeptical and take the accusation seriously, right?
00:56:53.800Those are not the same thing, but they're similar.
00:56:58.360They can look similar at the outset, but this is a layer of nuance that seems to be missing from a lot of the discussion today, isn't it?
00:57:47.980What I would substitute for that is: always pay attention, always listen, but keep an open mind and realize that there are probably at least two sides to every story.
00:58:02.880So be respectful, listen, but keep some skepticism in mind.
00:58:10.020Now, some of the radical feminists are saying, so it doesn't matter.
00:58:15.340So what if some guy is falsely accused and falsely loses his job?
00:58:22.480It's almost as if it serves the people of that gender right because of all the abuses our gender took in the past.
00:58:32.680And that's never a reasonable position.
00:58:51.100One of my favorite descriptions you ever gave was after you wrote your first lengthy tome on that, someone, you said you got a lot of, how did you describe it?
00:59:00.240A lot of lovely requests to go and passionately make love to yourself or something like that.
00:59:07.700May you go forth into the world and multiply by yourself, yes.
00:59:15.620Well, you see, any exhortation to believe, just to believe.
00:59:21.300I mean, I remember a cover of Ms. Magazine many years ago about the alleged satanic ritual abuse cults, which were supposedly afoot all over the country.
00:59:30.400And people were believing that there were satanic ritual abuse cults that were trapping children and women and so forth.
00:59:37.020And the cover of Ms. Magazine had one of these satanic images with the cover line, believe it.
00:59:42.180Believe it? Really? I'm supposed to believe it?
00:59:45.980So the idea that as women were going into psychotherapy and coming out believing that they had multiple personalities, not just three or five, but 10, 100, 500, there was an escalation of the belief in multiple personality disorder,
01:00:02.420which was another hysterical epidemic in our country, fomented by many psychoanalysts and psychiatrists and social workers, until malpractice suits brought that bubble right down.
01:00:14.400But to say, I understand that a person in therapy may be having trouble and may be suffering, but I get to question the explanation that the therapist is giving them, it's not a matter of disrespect.
01:00:28.380It's a matter of understanding and bringing our best science to bear on understanding.
01:00:33.040So I have always been skeptical of the believe X group, whatever X group is, uncritically.
01:00:40.860What's the evidence? What's the best explanation for it?
01:00:44.260So one of the more interesting, to me, ways of understanding many of these very difficult, he said, she said, disagreements and debates, which are so painful and so difficult.
01:00:56.960And we do understand how many, many thousands of women have been sexually abused in so many ways, but, but here's the but.
01:01:10.200If you are bringing up an allegation that occurred six months ago, six years ago, 35 years ago, we're entitled to talk about what we understand about memory,
01:01:24.860what we understand about the way in which information since the original event changes our memory or changes our interpretation of the event.
01:01:33.560Events that happened to us 25 years ago, that at the time we thought were benign, we can come to see as being malevolent and traumatic and oppressive.
01:01:44.040That's all part of how the psychological process works.
01:01:48.260And what social psychologists like Deborah Davis have shown is that a person doesn't have to be lying to be wrong in making an allegation.
01:02:00.140And a man doesn't have to be lying, he can be self-justifying in responding to an allegation.
01:02:08.560This is a different level of understanding that I think is important, especially when charges involve the possibility of ruining someone's life.
01:02:20.340It doesn't fall into the category of there's always a simple answer and there's always a right or wrong way to see a particular allegation.
01:02:29.500This point also about memory: probably five or six years ago, after I had read your book for the first time, I read the book by Kathryn Schulz, Being Wrong.
01:02:42.680And I also was very, I mean, moved is probably the wrong word, but I was struck by how feeble my memory was.
01:02:51.340Because I've always, I think up until reading that book, I had never questioned that the way I remembered something was correct.
01:02:58.180Because I have a very good memory for random stupid things, you know, like I can tell you that on, you know, Monday, June 27th, 1988, I did this and it happened this way and that way and this way and that way.
01:03:12.200And I can be right on some of those things, but I can be surprisingly wrong on other things.
01:03:17.140And I think reading that book helped me appreciate how much I could be wrong about.
01:03:22.700And that became really scary to me because it really made me realize that I'm quite susceptible to my own BS.
01:03:32.600I thought I somehow levitated above that.
01:03:36.720Like, I just thought whatever was in Peter's memory vault had happened.
01:03:40.640And now to know that that's not true is a little scary, but to your point, it's probably not true for anyone.
01:03:47.240The common idea is that we have a little tape recorder inside our brain and all we have to do is press the button and it'll all come out.
01:03:57.100It's wrong because it's not all in there.
01:04:01.140A lot of it gets confabulated and things get mixed together that don't belong together.
01:04:07.360So it doesn't have to be any matter of self-justification or any kind of dissonance reduction.
01:04:13.400It can be just wrong, just randomly wrong and misremembered in some sort of harmless way.
01:04:21.720As in Carol's famous memory of that book that she's sure her father read to her.
01:04:40.320When we were writing our book, I had a vivid, vivid memory of my father reading me James Thurber's The Wonderful O.
01:04:49.220Oh, it's a, by the way, a wonderful book about pirates who remove the letter O from the alphabet, from speech and from every object.
01:04:58.440You may keep geese, but not a goose and so forth.
01:05:00.940Anyway, it's a wonderful, wonderful book.
01:05:02.800And I remember him reading it to me and our laughing about what names would be like without their O's, Ophelia, Oliver, and so forth.
01:05:09.600And then, because I like to read that book every so often just to cheer myself up, I go and I see its publication date, which was one year after my father died.
01:05:20.320It just hit me in the, how could, what?
01:05:33.040Who's reading me a book when I'm a teenager?
01:05:35.200Or, you know, all of the things that I had made up around that and they were wrong.
01:05:39.680Which is why I like to think of dissonance theory as an incredibly helpful mechanism of understanding that is a form of arrogance control and certainty control.
01:05:54.520And if we can learn to have passionate beliefs that give our lives color and meaning that we live by, but to hold them lightly enough so that if the evidence comes along that our favorite diet is wrong, we are able to say, you know what?
01:06:50.880And certainty can easily morph into arrogance.
01:06:56.100I remember once when I was serving as an expert witness in a murder case, I was presenting some psychological data that favored the accused person.
01:07:08.600I said, all other things being equal, and the prosecuting attorney leaped on that, saying, oh, sure, all other things, but in the real world, professor, all other things are hardly ever equal.
01:07:23.860And of course, as you know, court trials are somewhat theatrical, because they're playing to the jury.
01:07:32.300And I finally had to turn to the judge and say, that requires an explanation.
01:07:38.660And the judge was very sympathetic to me, and I finally had to say that all other things being equal is simply the credo of experimental science.
01:07:49.840It means that you control extraneous variables that could distort the data.
01:07:58.180Not that it proves your case; it simply controls the random error that could distort the data. So holding all other things equal is a good thing, not a bad thing, about science.
01:08:12.340That's the kind of thing that we're talking about, that I'm not sure why I started that story.
01:08:21.140Well, no, because, see, this is another interesting thing.
01:08:24.980It's why many people prefer the pseudosciences, where somebody gives you a certain answer.
01:08:32.100I know that if you eat this thing, this following health benefit will occur, whereas scientists have the irritating habit of talking in probabilities.
01:08:41.520It is likely that, it is probable that, in fact, any scientist knows that the people who speak in certainties are not speaking scientifically.
01:08:50.200That's just not how scientists think, and that's not how they would present their findings.
01:08:54.780But certainty is what most people want to hear.
01:09:00.380But, of course, if someone really is certain about something, they have almost certainly frozen their ability to change their minds when they need to.
01:09:09.300I want to talk about something that you guys have written about as you have your pyramid diagram.
01:09:15.000I talk about it as sort of a path dependency is the way I kind of describe it to people.
01:09:18.820And I find this to be, I think, one of the most remarkable explanations that you provide for how two people can start out with a very similar set of beliefs and yet come to these very pivotal moments where a decision is made that sets them on a path.
01:09:40.380And that path becomes reinforcing and reinforcing and reinforcing and after a long enough period of time, those people are so divergent in their beliefs and in their behaviors and including their view of each other that they seem unrecognizable as though they came from the same species, let alone that they once stood next to each other.
01:10:46.580But, you know, it's not the worst sin in the world.
01:10:49.660And now here they are in their exam and they draw a blank on this crucial question.
01:10:54.920And the crucial question will determine their grade on the exam and in the manner of students, not only on the exam, but in this course and in life.
01:11:03.160And everything will go south if I don't get an A on this exam and so forth in the way that students often panic.
01:11:09.020And suddenly they are given the opportunity to cheat as the student next to them, the one with the beautiful handwriting, makes her answers available to them.
01:11:19.760Now they have a second to decide, cheat or not cheat.
01:12:07.220But the one who resisted cheating will now feel even more strongly that cheating is wrong and unfair and it's not a victimless crime.
01:12:16.660What about all the people who don't cheat and work hard and learn the material?
01:12:21.500Over time, as they move along, their attitudes about cheating and their self-justifications for their own behavior will keep reinforcing.
01:12:34.360And we use the metaphor of the pyramid because they start out at the top side by side.
01:12:39.540But by the time they have finished justifying one step at a time their own behavior, they stand at the bottom of the pyramid, very far apart from each other.
01:12:50.480And for me, what I find most illuminating in this metaphor is that it demonstrates how hard it is to go back up.
01:13:00.800Because now here you are at the bottom and you've spent all this time and energy justifying your decision to cheat or not cheat.
01:13:07.160How are you going to now go back and say, you know, that first step I took off the pyramid was really the wrong one.
01:13:12.400So that is the metaphor we use in explaining why people get themselves locked into a belief or a practice.
01:13:22.200And what that metaphor really illustrates beautifully is that every time a person has to make an important decision, and it's a difficult decision, like the cheating example, he or she is doomed to experience cognitive dissonance.
01:13:41.840For the student who cheats, the dissonance is, I see myself as a basically honest person, and yet I'm committing a dishonest act.
01:13:55.540Therefore, to reduce dissonance, he decides that the act isn't so very important, because everybody would do it.
01:14:03.440For the person who resists the temptation to cheat, the cognitive dissonance is, I could have gotten a really good grade in this course, and that would have allowed me to go to medical school, and I chose not to do it.
01:14:18.800Therefore, it would have been a horrible crime if I had cheated.
01:14:23.380So that you cannot escape cognitive dissonance, no matter which way you choose, and the cognitive dissonance is followed by self-justification, which changes the attitude enormously.
01:14:38.420Elliot, to borrow your phrase from a moment ago, all things equal, if I took 200 students and looked at 100 of them who elected one path or the other, so just pick whichever path you want to discuss.
01:14:49.300How much variability is there in the dissonant response amongst that group?
01:14:56.800So if you take, for example, say the group that decided to cheat and now has to spend the rest of their life justifying, this was not victimless, this has allowed me to get to graduate school, I'm going to have a greater impact on the world, blah, blah, blah, blah, blah, blah.
01:15:09.820How variable are those students in their responses, and what is the extent to which that is subconscious versus conscious and how it plays out?
01:15:20.320There's no real answer to that question without doing the experiment.
01:15:23.820But in the experiment that was done by one of my fellow graduate students at Stanford, a guy named Judson Mills, there was almost no overlap between the final feelings of the kids who cheated and the kids who didn't cheat.
01:15:42.780Each one deviated from their feelings about cheating a day before they were put in that situation.
01:15:50.640So there was a little overlap, but it had a major impact on their attitudes about cheating.
01:15:57.640If I may, it happens that we have an example here in our very own book from the high-achieving, high-pressure Stuyvesant High School in New York City.
01:16:07.340Seventy-one students were caught exchanging exam answers, and they gave the New York Times reporter a litany of perfect self-justifications, which allowed them to keep seeing themselves as smart students of integrity.
01:16:25.220One said, it's like, I'll keep my integrity and fail this test?
01:17:09.940So there's another example that's even more distressing, though, and I think this was in your book, but if not, Carol, I know we've discussed it, which is-
01:17:20.420Well, but I think it is actually in there, which is you get to the point where the cop is just planting evidence on people who are clearly innocent.
01:17:43.200As soon as the battering ram goes through the door, you can see the plume of smoke.
01:17:47.060You see this one guy running into the bathroom, slamming the door, locking it, and you hear him literally shoveling drugs into the toilet as he flushes it.
01:17:57.400And just as you get the door open, you see the last swirl of water taking that last gram of cocaine down the toilet, and you are out of luck.
01:18:09.140And in that situation, when you know, as sure as God made little green apples, that these guys are filthy, you plant that cocaine right on that guy, and you make your arrest, and you finally have your bad guy.
01:18:21.940And there's another cop there who would refuse to do it, even in that situation.
01:18:27.380Now, again, those guys are at the top of the pyramid.
01:18:30.080What can those guys look like 10 years later?
01:18:32.880That guy who planted that drug in that situation, which seems quite justifiable, right?
01:18:38.600What is that guy doing 10 years later?
01:18:42.120You remember this bozo in the O.J. Simpson trial?
01:18:44.920Well, I mean, I guess to me, one of the points of these stories is, you probably aren't starting out with the most egregious examples of behaviors that people think about, right?
01:18:58.120We look at people very often who are at the bottom of the pyramid, and we don't realize that they started at the top.
01:19:03.220They'd made decisions, very, very small, tiny decisions.
01:19:06.840Jeb Stuart Magruder, in our book, and how he got enmeshed in the Watergate scandal: it was one step at a time, starting with the smallest, smallest compromises, until he could not get unenmeshed, right?
01:19:20.000Very often we look at the behavior of people without realizing all the time, effort, and money that they put into justifying their behavior as they got further and further along.
01:19:32.440So behavior that seems really puzzling to us at a distance makes far more sense when we see it in terms of this one step at a time, which is, you know, one of Elliot's great experiments.
01:20:13.480And as you said, if you're faced with a hard decision, I think most of us don't appreciate what the consequences of that can be if we take it with the wrong mindset.
01:20:24.680And this is something I wanted to come to later, which is identity versus behavior, right?
01:20:30.940So it's sort of like you can say in this moment with these facts, I'm going to make this decision.
01:20:38.940But I think when you can do that without pinning your identity to that decision and instead just pin your behavior to it, it becomes a lot easier to move off that path when you're encountering new information.
01:21:06.740He hears the toilet flush and sees all of the evidence going down the toilet.
01:21:11.760Now, you said you can see how it is justified that he plants the cocaine, because he knows they're guilty.
01:21:20.500And I would suggest that it's never justified, that it's understandable that he might plant the cocaine.
01:21:32.780But because of what I know about how the human mind works, I also know that once he takes that step of planting cocaine that he didn't actually find there, it makes the next step and the step after that a lot easier.
01:21:52.620And the next thing you know, there's a totally innocent person who has been framed for a crime he didn't commit.
01:22:00.060And notice, by the way, the dissonance a police officer will feel when another cop in the room says, no, no, no, this is wrong, don't do this.
01:22:12.680I mean, we see this is one of the problems in our discussions of police departments.
01:22:17.920The presence of somebody who is behaving ethically is dissonant to someone who is not behaving ethically.
01:22:26.820You're reminding me, you're reminding me that I'm doing something that I shouldn't be doing if I were an ethical person.
01:22:34.260And these guys get ostracized by the culture of most police departments, which is you have to go along, you have to play along.
01:22:43.680How much do you think this figures into police racism, police brutality?
01:22:48.240Do you feel that this is a similar starting point as well?
01:22:53.080Or do you think that racism has far deeper roots that go far outside of conscious choices that are made?
01:22:59.920I think racism in the police departments is a very complex issue.
01:23:05.280But it begins with, I think, a false and illusory correlation.
01:23:10.940Since most street crimes happen in poor neighborhoods, and since a lot of poor neighborhoods consist of ethnic minorities, black and brown people,
01:23:23.140I think cops have learned that the street crime in those dangerous neighborhoods is often caused by black people,
01:23:33.920and then they become suspicious of all black people and become quite brutal in their behavior toward them.
01:23:41.780And again, it's understandable, but not justified, that they behave that way.
01:23:49.680And in the George Floyd case, it's, again, one step at a time with two of the other cops acting almost as bystanders who are allowing it to happen
01:24:04.320because the culture is such that they don't want to interfere with the most brutal cop because he seems to know what he's doing
01:24:14.140and they're afraid of being ostracized by their fellow cops if they interfere with it.
01:24:21.400And that's that one step at a time, which is exactly how we define how cognitive dissonance can get us into trouble.
01:24:32.520Because each step you take down that pyramid, you justify it until by the time you reach the bottom, you don't recognize yourself anymore.
01:24:45.740That's why Jeb Stuart Magruder is a perfect example, because he was a guy with good, high morals,
01:24:55.360and he saw himself in retrospect being corrupted in the Nixon administration until he ended up doing things that a year earlier he never would have dreamed of doing.
01:25:09.220But one step at a time, justifying each one along the way, when he finally woke up as he was being sentenced to prison,
01:25:18.380it was like waking up from a bad dream.
01:25:21.800Actually, one of the most powerful experiences for me in working with Elliot on this book was finding the stories of people
01:25:30.480who were able to break out of the cocoon of self-justification that they had spun to protect themselves.
01:25:39.220And were suddenly able to see themselves and the consequences of their behavior in a clearer light.
01:25:45.780You see, sometimes that light is obscured for us.
01:25:48.240I mean, for example, when we talk about systemic racism,
01:25:51.840which is very hard for most people to understand and experience,
01:25:55.280because by definition, we experience things as individuals.
01:25:58.420But there was an important study in Seattle some years ago
01:26:01.460of the way that the police and the city government were defining what a drug crime was.
01:26:11.540What kind of drugs do we want to go after?
01:26:14.060Whom do we want to arrest for what kind of drugs?
01:26:16.860So the kind of, you know, white kids using cocaine, well, that's okay.
01:35:30.360How could I learn something from that?
01:35:32.920But that's our recommendation for the way to go.
01:35:37.580In Elliot's absolutely wonderful memoir, Not by Chance Alone, he makes this observation about what drew him to the field of social psychology.
01:35:57.320And I think that is really a guiding principle of why we wrote this book, because it's an examination, an indictment of so many institutions here, the therapy world, the criminal justice world, family relationships, so many domains in which, by understanding, we do have the power to change.
01:36:21.420So what year did the first edition come out?
01:36:25.720So the world's changed a lot in 13 years.
01:36:29.860Obviously, you've shared that a big part of your motivation for this was sort of the frustration you had with the Bush-Cheney administration's continued justification for the war in Iraq long after it became quite clear that the reasons that were given to go to war didn't exist.
01:36:50.540And I also think it's safe to say that even within that administration, there were many different flavors of dissonance.
01:36:56.980But I think I've read maybe two or three biographies, including the autobiography of George Bush.
01:37:03.860I find presidential biographies very interesting.
01:37:07.880I don't think for a moment Bush felt he was pulling the wool over anybody's eyes.
01:37:13.160I really think he believed this to the essence of his core.
01:37:17.380But here we are 13 years later and Iraq and Afghanistan and most of the Middle East, quite frankly, is largely forgotten at this point.
01:37:25.780And not just with what's going on in terms of coronavirus, but I think more broadly speaking, what's going on in terms of populism, what's going on in terms of racism, a greater polarization within our political system.
01:37:39.080I mean, I think almost anybody would give their left arm to go back to 2007, politically, frankly, when you had two somewhat reasonable parties that kind of behaved like it. It seemed a heck of a lot better when you first wrote the book.
01:37:51.180Are you more or less optimistic today?
01:37:53.600And how much do you think cognitive dissonance is factoring into what really looks upsetting and ugly and just very unpleasant with respect to the way we are governed and the way we govern?
01:38:07.260The problem, I think, is the hard polarization of the parties.
01:38:17.180In our introduction to the revision, we quote Bob Dole saying, you know, Bill Clinton is my opponent, not my enemy.
01:38:22.580I mean, how far we have come from the idea of seeing the other party as an opponent and not an enemy, which is quite an explicit tactic.
01:38:32.520So it's back to your question about identity in a way.
01:38:35.880One of the curious things that have happened in our country is that political identity has come to have a primary importance for people in ways that it did not at one time.
01:38:48.820That is, you know, it used to be you'd say, well, would you want your child to marry a fill in the blank: a person of another religion, a person from another city, a person from another ethnicity, and so forth.
01:39:01.720Now, the person you most don't want your kid to marry is somebody from the other political party.
01:39:08.680The hostility of that attitude is a sign of the problem of polarization.
01:39:16.400And what that means is if political parties have become the central part of people's identities, then by definition, it makes it very hard to accept any evidence that somebody from the other party might have a good idea, might be doing the right thing, might be somebody I can listen to.
01:39:35.580Although the current political scene is worrisome and we can all be pessimistic about it, one of the things that has been most heartening for both of us in writing this book has come from the hundreds upon hundreds of letters from people telling us what they have learned from the book and how it has affected them.
01:39:54.400Now, you know, authors get these, it changed my life kind of things, but we get stories.
01:39:59.140We get stories from people who have explained how they have taken an understanding of cognitive dissonance into their own lives with often very interesting results.
01:40:10.480So, for example, once I heard from a man who told the following story, he said, I've got five siblings and we've been at war, at war, over the legacy of our family inheritance.
01:40:22.320We've formed two factions. We've been fighting with each other, and mediation hasn't helped and nothing has helped, and we're estranged from each other and so forth.
01:40:31.460And then he said, I read your book and I gave it to our mediator and I said, here, I said, give this book to my brothers and they will immediately understand what they're doing wrong.
01:40:43.980I mean, he didn't quite say it that way. He gave it to the mediator. Here, have my brothers read this book.
01:40:47.660He said, I got no reply. And then a couple of years later, he said, I picked up your book again and I read it, and, oh, incredibly, the words on the page had transformed themselves.
01:41:02.880And I wrote to the mediator and I said, tell my brothers that I now understand what I have been doing wrong in our discussions.
01:41:12.620I have been greedy. I have been selfish. I haven't thought enough about how you guys have seen the situation.