The Art of Manliness - July 31, 2025


The 3 Types of Failure (And How to Learn From Each)


Summary

Amy Edmondson is a professor of leadership at Harvard Business School and the author of The Right Kind of Wrong: The Science of Failing Well. In this episode, she talks about the three types of failure, which type is most productive, which types are less fruitful, and how to best use the former and prevent the latter.


Transcript

00:00:00.000 Brett McKay here and welcome to another edition of the Art of Manliness podcast.
00:00:11.440 People often think of failure in one of two ways, as something that hinders the pursuit
00:00:15.380 of success or as something that's a necessity in obtaining it, as in the Silicon Valley mantra
00:00:20.060 that recommends failing fast and often. There's truth to both ideas, but neither offers a
00:00:24.740 complete picture of failure. That's because there isn't just one kind of failure, but
00:00:28.680 three. Here to unpack what those three types are is Amy Edmondson, a professor of leadership
00:00:33.580 at the Harvard Business School and the author of The Right Kind of Wrong, The Science of
00:00:37.740 Failing Well. Today on the show, Amy shares which type of failure is most productive, which
00:00:42.360 types are less fruitful, and how to best use the former, prevent the latter, and learn from
00:00:46.740 failure of every kind. We also talk about how to organize potential failures into a matrix
00:00:51.440 that will help you best approach them. Along the way, we dig into examples both big and
00:00:56.020 small of how individuals, organizations, and families can put failure to work for them.
After the show's over, check out our show notes at aom.is slash failwell.
00:01:17.400 Amy Edmondson, welcome to the show.
00:01:19.540 Thank you so much for having me.
00:01:21.040 So you got a new book out called The Right Kind of Wrong. It's all about how we learn from
00:01:25.720 our mistakes. So you spent your academic career researching failure. How did that happen?
00:01:32.480 Not a lot of people end up researching failure. How did you end up in that field?
Well, I suppose you could say it was a little bit by accident. And let me explain. That is to say, I was interested in learning. And even more specifically, I was interested in the problem
00:01:47.120 of organizational learning, that organizations need to keep learning in a changing world.
And it turns out that is easier said than done for a whole host of factors. And one of the factors, one of the major factors that just kept coming up again and again in my research, was our allergy to failure. The reality is things will go wrong, especially in an uncertain and complex world.
00:02:13.180 And if we don't have a healthy response to that, we don't learn. We don't learn and grow as
00:02:19.420 individuals. We don't alter our systems in ways we need to so our organizations are more effective
00:02:24.980 and so on.
00:02:26.600 Well, it seems like in recent years, I'd say in the past 15, 10 years, there's been this mantra
00:02:31.880 of fail early, fail often that's come out of Silicon Valley, where it seems like they're trying to
00:02:37.260 rehabilitate failure, saying it's okay to fail. But you argue that this mantra, this idea misses
00:02:43.960 the mark when it comes to the benefits of failure. How so?
That's right. It's not so much that it misses the mark as that it's woefully incomplete, right? It's not specific enough; it's too broad brush. And it doesn't say, under these conditions, this is not good advice. And just to illustrate with an obvious one, you wouldn't say to your surgeons and operating room personnel, fail often, you know, have a great time. Of course not, right? You would say, gosh, can you try to get it exactly right? And they would know that that was their job. So the problem with the fail fast mantra isn't that it's wrong, it's that it only applies
00:03:24.780 to certain kinds of contexts.
00:03:27.000 Right. You wouldn't want like a nuclear reactor to fail early.
00:03:29.860 No, no, not even close. Or even an automotive assembly line, you don't say fail fast. You say,
00:03:36.160 hey, how about Six Sigma, right? Let's get it right. You know, we know there might be things that go
00:03:41.480 wrong. But if we can have our quality be at the one error per million events level, we're really good.
00:03:48.440 And there's no reason we can't get there.
00:03:50.180 So what you do in this book is you walk readers through your research that you've spent your career
00:03:56.140 looking at on how we can get the benefits of failure and when it's okay to fail early or fail
00:04:03.140 fast and when it's better to, okay, well, we might make mistakes, but let's try to reduce that. And you
00:04:09.700 categorize failures into three failure archetypes. You have intelligent failures, basic failures,
00:04:16.040 and complex failures. Let's talk about intelligent failures. What makes a failure intelligent?
Intelligent failures are the right kind of wrong in my view. So a failure is intelligent, and should be applauded and used as often as possible, when, in pursuit of a goal in new territory, with a hypothesis, and as small as possible, you turn out to be wrong. So to say that more clearly, a failure is always an
00:04:44.640 undesired outcome. We don't want to fail. We want to succeed. But if that undesired outcome happens in
00:04:51.120 new territory, in pursuit of a goal, you know, where you've done your homework, you've thought it
00:04:56.840 through, you have good reason to believe it might work, and you don't expend more resources or take a
00:05:04.340 greater risk than you have to to get the knowledge you need to go forward, then it's an intelligent
failure. And in the book, you highlight the mid-twentieth century, when doctors were trying to figure out how to do open heart surgery, as an example of intelligent failure. They were in new territory, the risk was really high, but the payoff was significant, and they did all they could to reduce failure. Yeah. Right. And one of the ways they
00:05:31.120 mitigated risk, or one of the ways they kept the failures, which must have been, you know, devastating
00:05:36.300 emotionally and intellectually to experience, one of the ways they kept those failures as small as
00:05:42.940 possible is that they, of course, only operated on people who truly had no other choice. You don't
00:05:49.280 operate on a healthy patient and say, hey, this might not work. We've never done anything like this
00:05:53.120 before. No, of course not. They were operating on patients who had very little chance of
survival over any kind of long term. And so in a sense, these patients' only hope was a surgical repair. And yet such a thing hadn't been attempted before. So the odds were, you know, not truly high
00:06:15.260 that it would work out well the first time. And I described this, it's such a visceral example of,
00:06:21.700 yes, a series of very intelligent failures that ultimately led us to the stunning success that open
00:06:28.840 heart surgery is today. Yeah. Now it's just a matter of routine. Practically. Yeah. But you also argue
00:06:35.400 that, okay, these are the right kind of wrongs, but we still, even though when we know like we're doing
00:06:40.340 something that's new, there's going to be a big payoff, we still have that aversion. Yes. So what's
00:06:46.920 holding us back from making these intelligent failures? Well, I think that aversion is part of it.
And so that leads us to be risk averse. And the word is captured right in there. I mean, you know, many people, in fact most people, are sufficiently risk averse that they will fail to make progress on desired goals or, you know, fail to experience all sorts of things that they would
00:07:13.660 dearly love to do, but they can't. Because if you want everything to go perfectly,
00:07:19.800 then you will not do anything that has the risk of failure. And if you choose not to do anything
00:07:26.880 where there's a risk of failure, then you're not growing, stretching, venturing into new territory.
00:07:33.180 And then the other aspect you talk about too, is that there's that social fear, you know,
00:07:38.000 no one wants to be a loser. No one wants to be a failure. So there's that element as well.
00:07:42.820 Absolutely. I think most people can think of a thing they'd be willing to do behind closed doors.
00:07:47.960 You know, I'll take a risk, I'll try something, and it might fail. But as long as no one else sees
00:07:53.560 me doing it or knows about it, then I'm okay with it. We don't want the embarrassment or the
humiliation of the failure. And then you talk about this idea I think people might have heard of: if you want to avoid that social fear, that social stigma, groups need to develop what's called psychological safety. What is psychological safety?
00:08:12.900 Well, psychological safety describes a climate, an interpersonal climate, where you really don't
00:08:20.780 feel afraid to take risks like speaking up with an uncertain idea or disagreeing with someone or
00:08:27.860 admitting a mistake or acknowledging a failure. So all of those kinds of behaviors, those interpersonal
behaviors can be uncomfortable, because you worry that people might think less of
00:08:40.120 you. You might be rejected. And so in an environment where you know, yeah, that's just what we do.
00:08:45.480 It might not be easy, but I believe my team members will welcome this, or I believe my family expects me
00:08:53.180 to be candid about this. So that describes psychological safety. It's quite literally an environment where you
00:08:59.540 can take interpersonal risks of the kind that are so necessary for problem solving and innovation and
00:09:07.000 ideation and all of those good things.
00:09:09.800 Well, this goes back to a study you did early in your academic career where you thought it was a
failure, but you found out it actually led you down some new research paths. I think early
00:09:20.100 on you were researching teams, like what makes a good team? Does a good team culture reduce mistakes?
00:09:26.260 And your study found, oh my gosh, this team that looks awesome. Like it seems like they're cooperative.
00:09:31.020 Everyone's fantastic. They actually made more mistakes and you thought, oh my gosh, my hypothesis
00:09:36.740 was wrong. But I think what you found out was like, actually, you know, what happened is this team,
00:09:41.800 the teams that look like they have good camaraderie, they're talking, et cetera. They actually just
00:09:46.560 talk about their mistakes more than the teams that don't have that connection.
That's exactly right. And of course it took me, you know, hours to even think that might be what was happening, to see that as a possible interpretation of the confusing data, because the confusing data
00:10:00.700 were suggesting, as you said, that the teams with higher camaraderie, higher quality relationships
00:10:05.640 were ones that also had higher error rates. Now, that just seems, you know, so wrong on multiple
00:10:12.700 levels until you start to realize, well, wait a minute, the error rates, how objective are they?
00:10:19.020 You know, how do we get those data? And you begin to realize that really the only way you're getting
00:10:24.340 data on people's errors is if they're telling you about them or if they're not hiding them,
00:10:29.540 if they're letting them be discovered, which is, you know, not easy for people to do. So I began to
00:10:36.280 suspect that these better teams weren't making more mistakes. They were just more open about them.
00:10:42.180 And later on, I called that difference in climate, psychological safety, found a way, developed a
00:10:50.720 way to measure it, and ultimately found that that measure is very predictive of team performance in a
00:10:57.680 huge variety of industry contexts. And we've had guests on the podcast that deal with family
00:11:04.260 psychology. One thing that you often see is that couples who don't argue, there's probably problems
00:11:09.900 there if they're not having arguments, the same sort of thing. If you don't feel comfortable raising
00:11:13.660 concerns with your spouse, that's a problem. Right, right. It can't be that you just have no
00:11:19.680 disagreements or that you just see eye to eye on everything. There's never a conflict. There's never a
00:11:25.540 trade-off. You know, there's never who's going to pick up the kids at daycare today kind of moments,
00:11:31.300 right? That just isn't, that's probably not possible. So, you know, any relationship, any
00:11:38.080 healthy relationship is going to have disagreements, conflicts, challenges, arguments. And so if you're
00:11:44.280 not hearing any of that, if that isn't happening, it might at first glance seem like a good sign,
00:11:49.780 but it probably isn't. It probably means people are either afraid to disrupt the apparent harmony
00:11:55.880 or maybe one partner, not the other, feels afraid to, you know, to be themselves and just to speak up
openly and honestly. Okay. So to recap, intelligent failure: it's new territory. What were the other factors? Well, it's new territory, in pursuit of a goal, with a hypothesis, as small as possible. So there are four criteria. Okay. And what we're really
00:12:18.360 talking about, of course, is an experiment. I mean, you're acting, but at some level, you know that
00:12:23.540 what I'm trying to do here to achieve a result that I care about may or may not work. And so it's an
00:12:31.120 experiment technically. And you try not to have experiments that are larger than they need to be
00:12:37.180 when there's uncertainty, meaning you don't want to, you know, invest more money than you can afford in
00:12:43.880 an uncertain investment. You don't want to use more patients in a clinical trial than you need to,
to be able to demonstrate efficacy of the treatment. That makes sense. So if you're thinking about starting a business, you might not necessarily want to quit your day job and then mortgage your house and, you know, bet it all on this business. There are other small steps you could take to eventually start your own business. Yes. So mitigate risk by doing it incrementally. And that might sound too risk averse, but it isn't, because each of those
00:13:17.220 incremental steps involves some risk, but you're managing risk, which is smart.
00:13:22.960 So another type of failure you talk about is a basic failure. So what's a basic failure and how
00:13:28.440 does it differ from an intelligent failure? Basic failures are pretty simple. They're undesired
00:13:33.560 results that were caused by human error. And human error means there was a right way.
00:13:39.200 You know, you can think of it as a recipe. There was a recipe that could and should have been used
00:13:44.520 to get the result you want, but you made a mistake and it led to a failure. So basic failures are
00:13:51.800 simple. That doesn't mean they're small. They can be big, they can be small, but technically,
00:13:56.960 or at least theoretically, they're preventable. You know, when we're really at our best, when we're
00:14:01.140 alert, when we're vigilant, when we're exercising good teamwork, we can catch and correct most basic
00:14:07.340 failures. Okay. So basic failure would be swapping out salt for sugar in a recipe.
Exactly. Exactly. You know, the cookies or, you know, the sauce will taste bad because now it's got too much sugar in it and not enough salt.
00:14:23.460 Right. And yeah, but these basic errors, they can be big. You talk about a Citibank employee who
00:14:28.160 did something wrong with the computer and like transferred, I think it was millions.
$800 million. So it was essentially, you could think of it as checking the wrong box in an online form, which means you're not being vigilant enough, right? If you have a system that can allow you to accidentally transfer essentially the principal of a loan rather than the interest, it's probably not a well-designed system. But nonetheless, it was a system that needed to be used with great care, and human error led to this massive kind of economic loss.
And you also talk about how, unlike an intelligent failure, a basic failure occurs in
00:15:09.840 known territory. So like, you know, the recipe, you know, the protocol for transferring money.
00:15:15.020 It's just, you messed up somewhere along the way.
00:15:17.160 You messed up. Yeah.
00:15:17.860 Right. And to say that it's in known territory and it's not as kind of valuable as an intelligent
00:15:22.920 failure doesn't mean you can't learn from it. We can definitely learn from our basic failures,
00:15:28.340 right? I try very hard to learn from my own basic failures, but the sort of amount of learning is
00:15:34.520 arguably less, much less than for new failures, intelligent failures, where there just wasn't a
00:15:41.020 recipe. So you had no choice, but to kind of create a recipe.
00:15:44.480 Yeah. I think the things you can learn from a basic failure is you learn about human nature.
Like, when do we have a tendency to, you know, not pay attention? When do we have a tendency to neglect things? And so you can develop systems or protocols to counter that.
00:15:59.140 Yes. Yes. Self-awareness and systems or protocols. So, you know, an example of a basic failure would
00:16:05.260 be if you text and drive and then got into an accident, that would be a basic failure. And obviously
the thing you learn is pretty darn clear here. Do not do that, which I think most of our listeners know already, but we know it still happens as well.
One of the things that people have done over the years to counter these basic failures is the simple checklist; it can get you through a lot of things. This started off with pilots during
00:16:30.000 World War II. They have a checklist they go through. And then now in the medical field,
00:16:34.320 they also have very basic rudimentary checklists. You think, why would I need to go through? Did I
00:16:39.100 wash my hands? Did I... Right.
00:16:40.680 But it works.
00:16:42.580 Busy people. You know, busy people have a lot on their mind. And sometimes just to have that
concrete checklist in front of you will remind you, you know, put you back into the mindset you need to be in to be careful. And of course, Atul Gawande, the fabulous surgeon and author, wrote this marvelous
00:17:01.300 book, The Checklist Manifesto, about the power of something so simple as a checklist. And it's really
00:17:06.540 all about the prevention of basic failure. And it's also the case that just having a checklist
00:17:13.480 won't itself be enough, right? They have to be used with intent.
00:17:18.640 Well, yeah. You highlighted a case of a pilot. There was a checklist. They were supposed to
00:17:22.260 defrost or not. Something like that.
00:17:24.440 Yeah.
00:17:24.640 They didn't...
00:17:25.320 Air Florida. It's a famous and tragic accident. It's Air Florida Flight 90 back in January of 1982.
00:17:32.420 And the pilot and the co-pilot went through the checklist essentially in their sleep. You know,
00:17:39.500 it's Air Florida. So you can imagine that most of the time when you're Air Florida, you're in warm
00:17:44.500 weather. Most of the time when you're saying anti-ice off, the correct answer is off. And that's
00:17:51.620 exactly what they did. The pilot said anti-ice off and the co-pilot said off, you know, as if that was
the right answer. On a cold, blustery, icy January day in Washington, D.C., that was in fact decidedly
00:18:05.280 the wrong answer. Not going through the anti-ice routine led to the downing of that flight and the
00:18:11.540 death of 78 people.
Okay. So there's a great application for this in everyone's day-to-day life. Look at the things you do on a regular basis where you've noticed simple mistakes happening because you're not paying attention, and develop a checklist for that and follow it. And this could be like,
00:18:29.320 you know, when you're packing for a trip, right? Instead of thinking, oh, I forgot, whatever,
00:18:33.380 have a checklist you use every single road trip.
Absolutely. You know, I finally did that myself because there were just too many times where
00:18:41.540 I'd get somewhere. I realized I forgot socks, you know, and then what are you thinking? You weren't
00:18:46.940 thinking, right? You need a little help.
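To make that concrete, here is a toy sketch of a pre-trip checklist in Python. The items and the yes/no prompt are illustrative, not from the episode; the point is simply that every item has to be answered explicitly rather than from memory.

```python
# A toy packing checklist: a hedge against basic failure. The value is
# not the code but the forced, explicit answer for every item.
PACKING_LIST = ["socks", "phone charger", "toiletries", "medications"]

def run_checklist(items):
    """Prompt for each item and return whatever was skipped."""
    missed = []
    for item in items:
        answer = input(f"Packed {item}? [y/n] ").strip().lower()
        if answer != "y":
            missed.append(item)
    return missed

if __name__ == "__main__":
    missed = run_checklist(PACKING_LIST)
    print("Still to pack:", ", ".join(missed) if missed else "nothing, all packed")
```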
00:18:49.300 And that's why you have the checklist. Okay. So that's basic failure. The next type of failure
00:18:53.360 is a complex failure. What makes a complex failure complex?
00:18:58.360 A complex failure is complex because it's multi-causal. So the characteristic complex
00:19:04.000 failure has multiple factors that contributed to the failure. It's the interaction between the
00:19:10.180 factors that led to the mess because any one of these factors on their own would not cause a failure.
00:19:16.720 So they're not like the basic failures where you just need this one mistake and boom, there's a
00:19:20.320 failure. There could be mistakes involved, but you don't need mistakes involved. You just need an
00:19:25.080 unfortunate coming together of several factors that lead to a kind of breakdown.
00:19:32.160 What's an example of that that we've seen in recent years?
00:19:35.720 Well, recent years, I mean, in one of my research studies in the hospital context,
00:19:41.760 there was a young patient, a 12-year-old boy named Matthew, actually not his real name, but
00:19:47.980 nonetheless, he was mistakenly given, mistakenly meaning by the system, I guess, a potentially fatal
00:19:55.120 dose of morphine. Fortunately, it was caught and counteracted. But in my analysis, I identified like
00:20:02.980 seven different factors, any one of which on their own wouldn't have led to this sort of accidental
overdose. The first thing was that the hospital's ICU, where post-surgical patients like Matthew would ordinarily go, was full. So then he had to go be cared for by people who don't usually care for post-operative patients. So they were kind of less specialized. He happened to be cared for by a brand
00:20:27.860 new nurse who maybe didn't have as good instincts about the processes of care. There was a pump that is
00:20:35.760 used for this pain medication that was in a sort of dark corner, making it harder to read. You know,
there was sort of a printout of the concentration that was folded in a way that made it hard to read. So I don't need to go into all the details, but the point
00:20:54.080 is that any one of those factors on their own wouldn't lead you to give this overdose, right? It was the
00:21:00.720 fact of them all coming together at the same time that created the perfect storm.
You know, this reminded me, when I got to this section, of Clausewitz, the famous military
00:21:11.200 strategist who wrote On War. And he had this idea of friction. And he talked about this. He says,
00:21:17.880 everything in war is very, he's talking about war here, but I think you can apply this to anything.
00:21:21.780 He says, everything in war is very simple, but the simplest thing is difficult. The difficulties
00:21:26.920 accumulate and end by producing a kind of friction that is inconceivable unless one has experienced
00:21:32.440 war. So yeah, friction isn't just one thing. It's a bunch of stuff coming together. And then he goes
on and he says, countless minor incidents, the kind you can never really foresee, combine to lower the
00:21:43.100 general level of performance so that one always falls far short of the intended goal. Yeah. So in war,
00:21:50.060 as in life, it's not just one thing. It's like a bunch of different stuff that, that comes up.
00:21:54.400 It's many little things. And I opened the chapter on complex failures with a classic and tragic
environmental spill back in 1967 called the Torrey Canyon, which is the name of the ship.
00:22:06.380 And the captain himself describes this as, quote, many little things, unquote. And that's,
00:22:13.040 that's exactly the same sentiment. And any one of those little things on their own wouldn't have led to
00:22:18.200 this horrible spill. But the unfortunate coming together of these factors leads to an outcome that
00:22:26.280 nobody wants. Right. It's going to happen. You probably see this in your daily life when you're
00:22:30.140 late for work. Yeah. It's like your kid gets sick and then the other kids lost their homework and then
00:22:36.180 the car needs gas. And then there's more traffic because it's raining and... Right. And it's not one of
00:22:42.260 those things. If one of the things happened, you would have been okay. But all of them together combined,
00:22:45.820 it just created, that created the problem. Right. And the thing about complex failures,
00:22:51.020 because they sound kind of like, oh, well, there's nothing we can do about it,
00:22:54.080 is that in fact, they offer many little opportunities for course correction, right? Because
00:23:00.400 there's so many different factors. In many cases, all you needed to do was kind of notice and alter one
of them to prevent the failure. We're going to take a quick break for a word from our sponsors.
00:23:15.000 And now back to the show. Well, one thing you talk about in the book and in your research is that
00:23:19.860 complex failures are increasing in our modern age. Why is that?
00:23:24.240 Well, I think we have so much more complexity in our world. Information technology maybe is first
00:23:31.940 among equals for this reality, which is we're interconnected in ways that we never have been
00:23:38.300 historically. So the phrase going viral, a photo or a tweet or something that you really didn't want
00:23:47.000 to get attention may inadvertently get attention simply because we're so interconnected. It's just so
00:23:53.480 quick. Things can spiral out of control very quickly. I also think we seem to have more
00:23:59.100 complex and unpredictable weather patterns, more complex and interconnected supply chains that are
00:24:06.280 global so that we're more likely to be impacted by random events in another part of the world
00:24:12.220 in our day-to-day life than was ever possible in history. Yeah, we experienced, I think everyone
00:24:17.260 experienced to some extent the supply chain things during COVID, right? And that's just a perfect
00:24:23.160 example. Everything's so interconnected. You make one small change here, then the distributor has
00:24:28.120 to make a change and the retailer has to make a change and then the consumer has to make a change
and then everything just gets mucked up. A few people can't show up for work because they're sick,
00:24:36.700 right? And then that leads to shortages here and, you know, something doesn't get delivered and
00:24:41.320 then that spirals out of control. But you say that, okay, complex failures, we're not completely at the
00:24:48.360 mercy of complex failures. You say there's often warning signs that there's a problem. How do you
00:24:53.640 know when there's a warning sign with a complex failure? Well, you really won't know unless people,
let's go back to psychological safety again, because so many times, in many of the complex failures that
00:25:06.520 I've studied in company and government settings, it is not the case that the bad news or the failure
00:25:15.000 came in totally out of the blue. It is nearly always the case that there was one or more people
00:25:20.700 who, you know, had a worry, thought something wasn't quite right, but really didn't feel safe to speak
00:25:27.920 up about it in a timely way that might have allowed for corrective action. So psychological safety plays a
00:25:35.680 really important role in our ability to prevent complex failures. And if it sounds like a fool's errand,
00:25:42.380 it really isn't because there's a whole body of research called high reliability organizations that
00:25:47.360 addresses the question of, you know, why do some inherently risky and dangerous undertakings like
00:25:56.820 air traffic control or, you know, nuclear power plants operate safely just essentially nearly all the
00:26:03.780 time, right? And it's through vigilance and mindfulness and incredible teamwork and, you know,
00:26:10.940 psychological safety to speak up and listening to people and thanking them when they raise a concern,
00:26:16.580 even if that concern turned out to be nothing. So there's a set of management and cultural practices
00:26:22.740 that allow us to cope with our complexity and our interconnectedness.
And you also talk about how there are things you can do: you can have contingency plans in place to
00:26:34.940 kind of mitigate complex failures. Like I've experienced this with my podcast. Over the years,
00:26:39.040 I've learned that technology is fickle. Sometimes things are working great. And I never know if
00:26:44.760 someone's on a PC or a Mac or whatever. And so I've been able to develop systems where I've got
00:26:52.260 my main recording, then I got a backup of a recording, then I got a backup of the backup
and it's saved me; it's helped me reduce some of the risk with complex failures.
00:27:01.280 Exactly. And you've only done that because you've, you know, you may have learned from a failure,
right? But you recognize that, yeah, as much as we know, this is not new territory, right? In this
00:27:11.800 point in history, recording a podcast is not new territory. And yet you understand that several
00:27:19.300 factors could come together in just the wrong way. My computer might be different than, you know,
00:27:23.260 the last guest's computer. Exactly. As you said, so you figure out in advance that because things
00:27:28.600 might go wrong, I'm going to think about the backups. And also this idea of complex failures,
00:27:34.120 you talk about the idea that some complex failures are tightly coupled and some are loosely coupled.
What do you mean by that? Well, this is work by a famous sociologist named Charles Perrow,
00:27:45.240 who passed away a few years ago. And he just pointed out that when you have highly interconnected
00:27:52.540 complex systems with tight coupling, you're really at greater risk for complex failures. And
00:27:58.400 tight coupling refers to the following idea, that if there's an action in one part of the system,
00:28:03.780 it leads inexorably to a result in another part. So just to have a simple example, you know, you put
00:28:10.600 your ATM card into the slot, it just pulls it right in, right? And it's too late. Like you can't get it
00:28:17.680 back until the system decides to give it back to you, right? So tightly coupled refers to things that
00:28:23.640 once they're underway, you really can't stop them.
00:28:27.140 Okay. So an example of that is that the Citibank employee who transferred $800 million. Like they
00:28:31.480 did that one thing and it was done.
00:28:33.940 Right. Right. So that was a tightly coupled system that, by the way, you know, nowadays we have,
00:28:39.460 and this was only a few years ago, but you know, we have the duo, what are the,
00:28:42.380 help me out here?
00:28:44.140 Double opt-in. Yeah. Whenever, yeah. Whenever I do transfers with my bank account, it has to be like
00:28:48.280 one person and then another person.
00:28:50.340 And then they send you a code, you know, to your phone and then you have to enter that code. And those
00:28:54.940 things are in there. Sometimes they feel very annoying, but they're in there to prevent the tight
00:28:59.600 coupling that would just lead to so many more failures than what would be desirable for anybody.
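To illustrate, here is a minimal sketch in Python of a double opt-in guard. It is not how any real bank implements transfers; the function names and the in-memory store are hypothetical. The point is that splitting one irreversible action into a request step and a confirm step loosens the coupling and creates a window in which a mistake can still be caught.

```python
import secrets

# Hypothetical sketch: transfer requests are parked until a second,
# independent confirmation arrives (the "double opt-in").
_pending = {}  # one-time code -> (account, amount)

def request_transfer(account, amount):
    """Step 1: record the intent and hand back a one-time code."""
    code = secrets.token_hex(4)
    _pending[code] = (account, amount)
    return code

def confirm_transfer(code):
    """Step 2: only a valid, unused code actually executes the transfer."""
    if code not in _pending:
        return "No matching request; nothing happened."
    account, amount = _pending.pop(code)
    return f"Transferred ${amount:,.2f} to {account}."

# The pause between the two steps is the slack that lets a human catch
# a wrong amount before it becomes irreversible.
code = request_transfer("acct-123", 800.00)
print(confirm_transfer(code))  # executes the transfer
print(confirm_transfer(code))  # code already used: nothing happens
```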
00:29:06.300 Yeah. One of the takeaways I got from the tight coupling and loose coupling. So loose coupling is
00:29:10.360 the opposite of tight coupling. So you basically, you have some slack. So if you do have a failure,
00:29:14.920 you're able to correct. And it made me think of, you know, just adding more margin in your life.
00:29:21.040 Right. So instead of trying to like get to, to the last minute before you leave, you'll give
00:29:26.840 yourself 10 minutes. Cause you know, that might help you if there's traffic, it'll accommodate for
00:29:31.640 that. Brett, I love that because it's build it in. Like we all know this. We all know we're moving
00:29:36.100 too fast from meeting to meeting, you know, and then you're going to jump into your car and get
00:29:40.520 to that next thing. Deliberately build in, you know, at the beginning of the week or beginning
of the month or however you do it, build those buffers in. Slack and buffers are essentially the
00:29:50.680 same concept. And very few of us have enough of them to actually operate effectively or as
00:29:57.260 effectively as we would like to operate. So build in the buffer.
00:30:00.560 All right. So let's recap what we've talked about here. We talked about the three types of failures.
00:30:03.240 We've got intelligent failures and these are just experiments. Treat your life like an
00:30:08.040 experiment. You can do this with your job. So maybe there's some type of a new position you want
00:30:13.800 to go for. Well, try and experiment with it and see what you can do with it. Or I mean, a date could
00:30:19.820 be an intelligent failure. Absolutely. Trying a new hobby could be an intelligent failure. Then there's
00:30:25.920 a basic failure and that's just basic mistakes. And the way you counter that is a simple checklist
00:30:31.640 can often do the trick. Vigilance, attention. Right. And then we have complex failures. It's
multifaceted. It's never just any one thing; the factors all come together to create the problem.
So, you know, paying attention to warning signs, not being afraid to speak up or point out a potential issue, coming up with contingency plans, and adding buffer.
00:30:53.740 Love that. Beautiful. Yeah.
00:30:55.780 So that's the first part of your book. You kind of lay this out and you go into more detail and I
00:31:00.100 hope our listeners go and read the book, but I love in the second half of the book, you talk about
00:31:04.840 some mindset shifts that people need to make when it comes to failure. And one thing you talk about
00:31:11.240 is we need to overcome our tendency to find someone to blame for a failure. So how does that help people
00:31:18.580 learn from failures? And then how do you hold people accountable if you don't find someone to blame?
00:31:23.220 Great question. So first of all, it's very natural to want to find someone to blame because it lets us
00:31:29.300 off the hook, like psychologically and emotionally, right? If something goes wrong and I can instantly
00:31:35.100 reassure myself that, Hey, it's not my fault, right? Then I've bolstered my self-esteem. I've kept
00:31:39.860 myself comfortable. I don't have to confront my own contribution to the failure. And so it's just a
00:31:45.840 very natural instinct. I even described in the book, a two-year-old who, when his father has a small
00:31:52.780 sort of collision with the passenger side mirror in driving too close to the parked cars, the little
00:32:00.180 child immediately hears the bang and says, I didn't do anything, Papa, right? It's like, of course not.
00:32:05.840 You're sitting in the toddler seat in the back seat. But this instinct to dodge blame is very, very deep
00:32:13.520 and well-learned, but it's unhelpful, right? It's unhelpful for our ability to sort of learn and grow
00:32:19.500 because even if you're, I mean, I think there are very few times where you're 100% to blame for things,
00:32:27.020 but you always have some kind of contribution that you could look at thoughtfully and, you know,
00:32:34.720 and wonder about and work on doing better next time and so forth. So it's actually consistent with
00:32:42.300 accountability because I think the way many people and many organizations think about accountability
00:32:47.880 is counterproductive. They'll say, well, we need to hold someone accountable and that's almost
00:32:54.080 synonymous with punishment. We need to punish someone or else it'll happen again. Whereas the
00:32:59.140 truth is, you know, when you really punish someone for something going wrong, what happens is it just
00:33:05.220 decreases the chances that you hear about things in a timely way. But, you know, back to psychological
00:33:10.980 safety. But the way we need to think about accountability is to be willing to fully
00:33:17.000 account for what happened, right? To take a clear-eyed scientific look at the events that
unfolded, understand what role you played, what role other factors played, so that you can really learn the valuable lessons and improve your practices next time. Gotcha. And this isn't to say, you talk about
00:33:38.000 this in the book too, that if someone is being malicious, then like you need, yeah, you need to
00:33:43.960 punish them. Yeah. I'm all for blame when people engage in what I'll call blameworthy acts. Right.
00:33:50.180 It's just that in our organizations and to a certain extent in our families, those aren't the norm,
00:33:56.920 right? You know, the norm is just human beings who make mistakes and are sometimes a little thoughtless
00:34:03.220 and sometimes a little busy and all the rest. But very few people are really waking up in the
00:34:10.060 morning and sort of intent on sabotaging each other. Right. So, yeah, if someone makes an
00:34:15.480 intelligent failure, you don't want to do any blame there for sure. No. No, you want to celebrate it.
Yeah, you want to celebrate it. A basic failure, you know, if they didn't sleep enough because they were out partying, then you might want to have that conversation. But it could be something's just going on at work that made it tough for them to pay attention. Right. I mean, they may have had to cover for other people who were out sick, or they may have, you know, had a sick child last night and not gotten enough sleep. What you want to do is first understand what happened and
00:34:46.120 then figure out what kinds of protective measures to put in place to ensure that that same thing
00:34:52.940 doesn't happen a second time. Another thing you talk about is reframing failures. What does
00:34:57.760 reframing failure look like? Reframing means acknowledging that there's always a frame,
00:35:03.280 you know, like a picture frame. We're looking at reality. We're looking at the events in our lives
00:35:07.840 through frames that are largely unconscious, that stem from our background, our expertise, or,
00:35:14.940 you know, our various biases that we all have. And reframing is learning to kind of stop and
00:35:21.040 challenge how you see a situation. Wonder what you're missing. Ask yourself, is there another way to
00:35:27.280 see this? Right. So if you have that reflexive instinct to just blame, oh, I know what happened
00:35:33.120 there. It's the self-discipline to say, well, I have a partial view on what happened there and I'd
00:35:39.560 love to understand it better before I draw any conclusions. So yeah, I like the idea of just
00:35:44.280 thinking of failures like a scientist. What can I learn from this? I think that's a good reframe to
have. And one of the things you talk about throughout the book is this idea of context, like understanding the context a failure happens in. How can that help you learn to fail better? You know, the issue of context brings us back to why the Silicon Valley talk is limited, because it only works in certain contexts. And so the two dimensions I use to figure out context are, one, how much uncertainty is there, right? You know, how much uncertainty is there about whether or not I can produce a batch of chocolate chip cookies? Very little, right? Unless all the power goes out or something, it's almost a guarantee that if I follow the recipe, I can produce those cookies. How much uncertainty is there that I can find a new drug that will cure a certain kind of cancer? Well, very high uncertainty indeed. And so that matters in terms of guiding my
00:36:44.500 actions. Number two is what are the stakes? And for me, stakes can be financial,
00:36:50.160 they can be reputational, or they can be human safety. And when you're dealing with high stakes,
00:36:56.520 you proceed far more cautiously, or you should, than when you're dealing with low stakes. And so if
00:37:02.680 you're dealing with, you know, high stakes and high uncertainty, you're conducting your experiments.
00:37:08.860 You'd have to experiment because you don't have the answers, but those experiments should be very
00:37:13.020 thoughtful and very small. Okay. I like that. And you have a nice chart to kind of walk people through
that. And so yeah, if it's low stakes and it's something new, it's like, yeah, have fun
00:37:24.280 experimenting. Yeah, exactly. And I literally have seen both errors in my research. I mean,
00:37:29.700 meaning sort of these psychological errors we make where I've seen people be a little reckless in
00:37:34.360 very dangerous situations and then bad things happen. And then I've seen people be, you know,
00:37:39.200 overly cautious because they want to get it right, or they want to look good in situations where there
00:37:44.860 really isn't a right answer. And the only way to make progress is to get out there and try stuff.
00:37:50.000 Gotcha. So a high stakes context, that's where you really need that mindfulness. Like if you're
an air traffic controller. Right. Super high stakes, but really well understood territory. So that's
00:38:00.000 where you're operating as mindfully and vigilantly as possible. Right. And then if you're in a situation
00:38:05.760 where it's new, so like you're doing experimental surgery and the stakes are high, you're going to want
to do careful experimentation. Right. Yeah. And try to mitigate the risk as much as possible. Exactly.
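As an aside, that uncertainty-by-stakes chart can be written down as a simple lookup table. The sketch below is a paraphrase of the conversation, not the chart from the book; the labels and the advice strings are illustrative.

```python
# A rough encoding of the context matrix discussed above:
# (uncertainty, stakes) -> how to proceed. Wording is a paraphrase.
APPROACH = {
    ("low", "low"):   "Known territory, little to lose: follow the recipe; basic vigilance.",
    ("low", "high"):  "Known territory, big consequences: maximum mindfulness (air traffic control).",
    ("high", "low"):  "New territory, little to lose: experiment freely; expect intelligent failures.",
    ("high", "high"): "New territory, big consequences: experiment, but keep it small and careful.",
}

def recommend(uncertainty, stakes):
    """Look up the suggested posture for a given context."""
    return APPROACH[(uncertainty, stakes)]

print(recommend("high", "high"))  # e.g., experimental surgery
print(recommend("low", "high"))   # e.g., air traffic control
```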
Okay. I love that. And you also recommend four failure practices. What are those and how can
they help us fail better? So the four failure practices that I write about, actually in the last chapter, on how to thrive as a fallible human being, are, first, persistence, right? There will always be obstacles
00:38:33.220 in your path, in anything, you know, that you are truly hoping to do in your life or in your work,
00:38:41.040 right? If you're, if you have stretch goals, there will be obstacles. And so persisting,
00:38:45.700 you know, trying again, not being crippled by the inevitable failures that do happen is one of,
00:38:51.620 one of the practices. Reflection. I think all of us can benefit from doing more explicit reflection
that could be formalized in keeping a journal, or could be weekly team reflections with your team at work, to kind of go over what went well, what didn't go well. But being systematic about our learning from our own experiences is a super important practice. Well, I know that in special forces,
00:39:18.020 they do what's called after action reports. Yes. So after a mission, they'll just get together and
it's formal, but informal, and there's no blame. They just talk about what went right and what went wrong. Yep. It's very clean, right? It's: what did we set out to do? What actually happened? What's the difference, and why? What will we do next time? Very scientific, not emotional,
but it's almost storytelling to ourselves, taking our own experience and turning it into an
00:39:49.380 explicit narrative so we can understand it better. Because, you know, we all make the mistake of
00:39:54.080 thinking, Oh yeah, yeah. Right. I had that experience. I'll learn from it. Right. But we won't,
00:39:58.360 unless we pause to do the explicit reflection. And the third one is accountability. And that's
very related to reflection, because it's being willing to confront and be honest about the whole account. You know, again, what happened, and that willingness to say, ah, here's what I did that may have contributed to that failure. Here's what I failed to
do that might've helped prevent it. So it's about being willing to own your role, which
00:40:30.240 can seem scary, but it's also quite empowering. If you think about it, you're facing the fact that you
do matter, right? That you have an impact, and it's important to be willing to, you know, sort of own it, I guess is what I mean. And then finally, I talk about apologies as a really valuable practice that we can all learn to do more of in an uncertain, fallible world, as uncertain,
fallible people. Apologies play a very important role in relationships. If something goes wrong, it's just so powerful to be willing to apologize for it. And that means, of course, being willing to take account of where you contributed. But good apologies, first of all, signal that you
00:41:19.820 care about that relationship. You care about the other person. They express remorse. I don't mean,
00:41:25.840 you know, they don't have to be like horrifyingly remorseful, but just, I feel bad that I didn't call
00:41:31.400 when I said I would, but you express that remorse. You offer to make amends, you know, how can I make
00:41:36.480 it up to you? And you own your part in it. You don't sort of duck the accountability part.
And then beyond that, celebrate good failures, the right kind of wrong, in your work and with your kids. If they do something that ended up in a mistake, but they actually learned something from it, be like, hey, that's great. You learned something.
Yes. You know, I sort of divide these into the individual practices, you know, that each of us can do, you know, take accountability, reflect and so on. And then the kind of collective
00:42:06.380 practices that work in a team or a family or a company, which include calling attention to the
00:42:11.760 context, you know, being very thoughtful about like how much uncertainty or what the stakes are
here, encouraging failure sharing, you know, that detoxifies failure. If we can sort of more
00:42:22.140 cheerfully tell each other about the things that have gone wrong in our projects or lives, it helps
00:42:28.880 a great deal. And then really reward that honesty. You know, if you're in a family and a child sort of
speaks up quickly about something, you know, "I broke a glass" if they're a little kid, or "I was at a party I shouldn't have been at" if they're an older teenager, being deeply appreciative of that
00:42:45.360 honesty will build the right habits and the right climate for learning in that family.
00:42:51.580 Well, Amy, this has been a great conversation. Where can people go to learn more about the book
00:42:54.620 and your work?
Well, the book, Right Kind of Wrong, can be found anywhere books are sold, I hope. And you can go
00:43:02.320 to my website, amycedmondson.com for information about other papers and other writings and activities.
00:43:09.260 Fantastic. Well, Amy Edmondson, thanks for your time. It's been a pleasure.
00:43:12.160 Pleasure to talk with you, Brett.
My guest today was Amy Edmondson. She's the author of the book The Right Kind of Wrong. It's available on amazon.com and in bookstores everywhere. You can find more information about her work at her website, amycedmondson.com. Also check out our show notes at aom.is slash failwell,
00:43:27.400 where you can find links to resources, where you can delve deeper into this topic.
00:43:29.780 Well, that wraps up another edition of the AOM podcast. Make sure to check out our website at
artofmanliness.com, where you can find our podcast archives, as well as thousands of articles that
00:43:45.160 we've written over the years about pretty much anything you can think of. And if you haven't
00:43:48.200 done so already, I'd appreciate it if you take one minute to give us a review on Apple Podcasts or
00:43:51.180 Spotify. It helps out a lot. If you've done that already, thank you. Please consider sharing the show
with a friend or family member who you think would get something out of it. As always, thank you for the continued support. And until next time, this is Brett McKay reminding you to not only listen to the AOM podcast, but to put what you've heard into action.