#564: Assault Your Assumptions Through Red Teaming
Episode Stats
Summary
In this episode of the Art of Manliness podcast, Brett McKay and Bryce Hoffman discuss the concept of Red Teaming, and how it was developed by the U.S. military and intelligence community in response to the 9/11 attacks and the counterinsurgencies in Iraq and Afghanistan.
Transcript
00:00:00.000
Brett McKay here and welcome to another edition of the Art of Manliness podcast. We live in
00:00:11.800
an age of disruption. Companies that were once stalwarts are overtaken by small, plucky upstarts.
00:00:16.100
Our personal lives can also be disrupted. We can lose a job or a business can fail. My
00:00:20.220
guest today says that instead of waiting to be disrupted by outside forces, you're better
00:00:23.840
off using techniques developed by intelligence agencies and the military to disrupt yourself
00:00:28.300
first. His name is Bryce Hoffman and he's the author of the book Red Teaming, how your business
00:00:32.300
can conquer the competition by challenging everything. We begin our show discussing what
00:00:35.840
red teaming is and the history of its development from war gaming by 19th century Prussians to
00:00:40.120
more sophisticated techniques developed by the U.S. military during the war on terrorism.
00:00:44.080
Bryce and I discuss the hidden biases that red teaming is designed to counter and then
00:00:47.640
get into specific red teaming techniques you can start using today to challenge your assumptions,
00:00:51.540
stress test your strategies, identify unseen threats, and make better decisions in both
00:00:56.260
your personal life and your business. After the show's over, check out our show notes
00:01:16.000
So you are the author of the book Red Teaming, how your business can conquer the competition
00:01:20.500
by challenging everything. Now, I'm sure some of our listeners have heard of this concept
00:01:24.840
of red teaming. It's come out of the military. We'll talk a bit more about that. But for those
00:01:29.180
who aren't familiar with red teaming, what is this concept?
00:01:33.540
Red teaming at base is really a system for confronting hard truths that hold us back from
00:01:42.540
moving forward in the best direction possible. It was developed by, as you said, Brett, it was
00:01:47.600
developed by the military and the intelligence community as a result of the failures in intelligence
00:01:53.540
that led to the terrorist attacks on 9-11. And in the case of the U.S. Army, as a result of the
00:02:00.340
really faulty assumptions that the Army came to believe were responsible for turning what seemed
00:02:07.540
to be easy victories into long, protracted counterinsurgencies in Iraq and Afghanistan. So it was designed
00:02:15.120
intentionally to help these organizations challenge their own thinking, deliberately try to poke holes in
00:02:23.160
their own plans to make their plans better and to make better decisions in the future.
00:02:28.340
So red teaming became systematic in the aftermath of 9-11 and during the Iraq and Afghanistan wars.
00:02:34.960
But there's a history that goes back further than that in the military. Like, when did we start seeing it?
00:02:42.540
So the origins of red teaming in the military really go back to the Prussians in the 1790s, after they
00:02:48.280
had been defeated by Napoleon, which, you know, if you think about it, for the Prussians
00:02:53.500
was really a big deal because the Prussians considered themselves to be kind of the most
00:02:58.080
badass military country on the face of the earth at that time. And so the fact that they were defeated,
00:03:03.580
not just by the French, but by a French corporal was really humiliating for them.
00:03:07.580
And so they decided they weren't going to take it, that they were going to
00:03:11.440
wait for the right opportunity and attack Napoleon again. But they knew they'd only get one chance.
00:03:17.220
They did two things that are really important that kind of set the stage for red teaming.
00:03:21.440
One is they recognized, and this is important, this goes to what I was talking about, Brett,
00:03:26.600
about confronting hard truths. The Prussians realized that none of them were equal to Napoleon,
00:03:33.440
that none of them was as good a general as Napoleon was. But what they then did is say,
00:03:41.080
well, you know what? None of us is as good as Napoleon, but you know, Hans over there
00:03:45.540
is just as good at logistics as Napoleon is. And Fritz is as good at artillery strategy as Napoleon is.
00:03:53.680
And Gunther over there is as good at cavalry tactics as Napoleon is. And so instead of having
00:03:58.440
one general lead their army, they found the best general in each of the key areas that a general
00:04:05.100
needed to be adept in and created a team of generals, what they called the general staff
00:04:11.320
to lead their military. And they told their king, you know, we're not going to have one general in
00:04:16.520
charge. We're going to have this group in charge and together we're going to be as good as Napoleon.
00:04:20.960
And they were right. And that concept was so successful that militaries all over the world
00:04:27.420
adopted it, and it's still in use today in pretty much every country. So that's a core red teaming
00:04:32.120
concept because it goes to this idea that all of us are smarter than any of us. So that the wisdom of
00:04:38.940
the group is greater than the wisdom of any individual, unless that individual happens to be
00:04:43.240
Napoleon or in a business example, Steve Jobs, perhaps. So that was the first thing that happened.
00:04:48.080
Second thing is they recognized that they were only going to get one chance at this and they better
00:04:53.180
make it count. So to make sure it did, they prepared their strategy to fight Napoleon and then they
00:05:00.260
divided themselves in two groups and they set up a tabletop exercise with little wooden pieces to
00:05:05.740
represent all the different units and the terrain and stuff like that. And they fought battle after
00:05:10.660
battle on the tabletop with one half of the Prussian officers playing the Prussians and the
00:05:15.780
other half playing the French and trying to figure out how the French could defeat the Prussians
00:05:20.400
strategy. Now, if you had Crayola crayons as a kid, like I did, you'll remember that there was a cool
00:05:26.460
color called Prussian blue in your Crayola set. And the reason it was called that is because Prussians in
00:05:33.180
the 1790s wore these kind of spiffy blue uniforms that that color is named after. So they were the blue team.
00:05:39.720
And since they were planning their fight against the revolutionary French, they painted the pieces
00:05:46.980
for the French red and called them the red team. So that's where this idea of taking a group of your
00:05:52.420
own people and deliberately trying to defeat your own strategy, to poke holes in your own plan
00:05:58.520
comes from. They called it Kriegsspiel, which in English is war gaming.
00:06:04.300
And that also spread throughout the world and is still used today. So the type of red teaming that
00:06:10.300
we're talking about, which we formally call decision support red teaming,
00:06:14.540
is taking that same approach and using it not just to figure out how the enemy is going
00:06:21.220
to react, but to consciously assault your own assumptions to make sure they can
00:06:26.580
withstand that rigorous scrutiny, because in doing so you and your organization can make better
00:06:31.840
decisions. Okay. So the U.S. military obviously continued this tradition of war gaming after
00:06:37.920
the Prussians came up with it throughout the 19th century and the 20th century. And you said it was
00:06:43.640
after the 9-11 attack that intelligence agencies and the militaries decided we need to take war gaming
00:06:50.280
and do something more, make it more systematic. So what's the story there? Who were the
00:06:55.640
organizations and individuals involved with creating this more systematic approach to red teaming?
00:07:01.840
So two things happened in parallel, though at slightly different times. The first thing that
00:07:06.220
happened, Brett, was on September 12th, 2001, literally as they were still pulling people out of
00:07:12.800
the rubble of the World Trade Center and the Pentagon, CIA Director George Tenet activated or
00:07:19.300
probably more appropriately reactivated a group within the CIA called the Red Cell. And the Red Cell
00:07:26.500
is the CIA's Red Team. And Tenet told these folks, he said, look, we should have seen this coming. We
00:07:33.880
knew that there was about to be a terrorist attack on the United States, but we failed to connect the
00:07:39.060
dots in time. And the reason we failed to connect the dots is not that we didn't have the information.
00:07:44.320
It's not that we didn't have the intelligence. Cause if you think back to like the 9-11 commission,
00:07:49.700
some of the revelations that came out of that were pretty fascinating. I mean, things like the
00:07:54.760
director of a flight school in Florida calling the FBI and saying, Hey, you know, just a heads up
00:08:00.720
here. I've got this group of guys from the Middle East who want to learn how to fly jumbo jets,
00:08:05.220
but they don't want to learn how to land them. And they want to do all their simulator training over New York
00:08:10.400
City, things like that, which the intelligence agencies had but didn't piece together.
00:08:14.840
And so Tenet said, what I want to do is have this group, the Red Cell, take everything that
00:08:22.980
we believe to be true every day, look at all of our intelligence assessments and try to argue that the
00:08:28.260
opposite is true. Try to argue that we're wrong. Try to poke holes in what we have concluded based on
00:08:34.280
the intelligence that we have, not because we are necessarily wrong, but because, A, we could be
00:08:40.160
and maybe you'll figure out the correct answer, or, B, even if we're right, by stress testing our
00:08:46.860
conclusions, you'll make them stronger. So the Red Cell got to work. And obviously the work of the
00:08:51.880
Red Cell is highly classified. But one thing the CIA has said publicly is that the work of the Red Cell
00:08:58.660
is directly responsible for having prevented a number of major terrorist attacks on the scale of
00:09:05.440
9-11, if not larger, since 2001. The other thing that the Red Cell did, which is more in the
00:09:12.260
public domain, is that a few weeks after this, they started
00:09:17.840
creating a daily document called the Alternative Intelligence Assessment. Now,
00:09:24.500
as your listeners probably know, every day the president of the United States gets a black book,
00:09:30.160
with, I believe, about six pages in it, called the Daily Intelligence Assessment.
00:09:35.460
And it is a breakdown of what's happened in the world in the past 24 hours, what the CIA believes
00:09:43.120
it means, and what options the president has for responding to this based on the conclusions the CIA
00:09:49.580
has drawn from this intelligence. So the Alternative Intelligence Assessment was a one-page document
00:09:54.920
slipped into the back of that book every day that said, you know what, Mr. President, you've
00:10:00.280
just read what we believe is going on in the world and what you can do about it, but we might be wrong.
00:10:05.740
And if we're wrong, here are the other ways you could interpret these events. And here are some other
00:10:11.200
options you might consider. Now it's worth noting, Brett, that both President Bush and President Obama
00:10:17.600
said publicly that they found the Alternative Intelligence Assessment to be one of the most important
00:10:22.920
things they looked at every day. President Trump, about two weeks after he took office, said he
00:10:27.420
found it confusing and asked for it to be removed from his daily briefing. So the CIA still prepares it;
00:10:33.800
they just don't give it to the president. So that's one thing that happened with the CIA. Now, a couple of
00:10:38.180
years later, the U.S. Army kind of had a similar epiphany. See, they thought they won the war in Iraq,
00:10:46.280
and then suddenly they found that they were losing it. And that created a real catharsis in the leadership
00:10:53.620
of the U.S. military, in the leadership of the U.S. Army in particular, that said, how did this happen?
00:10:59.480
How did we so easily win this war and now find ourselves locked in this insurgency that we're
00:11:04.380
actually losing? And so the then Chief of Staff of the Army, who was named
00:11:11.800
General Schoomaker, a former Green Beret and great American, said, we're going to
00:11:17.660
set up a lessons learned team to figure out how this happened, how we got in this mess, and how to
00:11:24.840
make sure we never get in this mess again. And what this team concluded very quickly, Brett, was that we
00:11:31.980
had become victims of our own success. By we, I mean the U.S. military had become victims of our own
00:11:37.560
success. The military had so easily won the war in Iraq in the early 1990s, the first Gulf War,
00:11:45.980
with so little cost. And they'd so easily won the war in the Balkans in the later part of the 1990s,
00:11:53.280
again, with so little cost, that they had concluded that because we had this immense
00:11:58.280
mastery of information because of spy satellites and drones and all this stuff, and because our weapons
00:12:03.620
were so superior to anyone else's, that we basically had become invincible. And they really
00:12:09.560
believed that. If you go back and look at the stuff that was written in the Pentagon in the run-up to
00:12:14.720
the invasion of Iraq, people really believed that they were invincible. And then suddenly they found
00:12:20.820
they weren't. And so the lessons learned team made several recommendations, but one of them was
00:12:26.480
to recognize that a lot of the problem was not just that we thought we were invincible, but that a lot of the
00:12:32.160
assumptions we made because we thought we were invincible were wrong. And so they recommended
00:12:37.540
creating a team at every level of the U.S. Army whose job would be to take
00:12:45.280
every strategy that was developed and try to stress test it, try to break it apart, try to figure out
00:12:52.580
what could go wrong with it and how to make it better. And they called this the Red Team. And they came up
00:12:57.240
with a formal system of tools and techniques, and they started training senior officers in these
00:13:03.600
tools and techniques so that they could go and do this to make sure this sort of thing didn't happen
00:13:08.020
again. They even created their own school. They called it Red Teaming University
00:13:11.820
informally. It had a code name, the University of Foreign Military and Cultural Studies, because they
00:13:16.840
didn't want our enemies to know what it really was. And they set it up at Fort Leavenworth in Kansas to train
00:13:22.480
officers in these techniques. And I became the first and still only civilian from outside government
00:13:27.920
to go through the Red Team leader training program there. That's how I learned about this.
00:13:32.280
And did Red Teaming influence or change decisions, like strategic decisions the military made?
00:13:40.960
It really did. And again, a lot of what came out of Red Teaming is classified, but one that was very
00:13:46.400
public, and it shows both the real opportunity and the challenge of Red Teaming, Brett, was the surge.
00:13:54.820
So General Petraeus was in charge of Fort Leavenworth when Red Teaming was set up, and he was an early
00:13:59.960
advocate for Red Teaming. And so when he was put in charge of the war in Iraq in the mid-2000s,
00:14:08.180
he used Red Teaming techniques to come up with this idea of the surge. And Red Teaming is really about
00:14:15.240
contrarian thinking and looking at things differently. So what he came up with was,
00:14:19.900
the president said, we want to pull out of Iraq. By using Red Teaming, he was able to determine the
00:14:25.600
only way we can pull out of Iraq is to send more troops into Iraq to get the situation to the point
00:14:30.960
where we can safely pull out for ourselves and for the Iraqi people. And so the surge was given the
00:14:37.320
green light, and it went forward and was very successful. It really changed everything pretty
00:14:44.720
quickly. It dramatically reduced the violence in the country, the number of bombings, the number of
00:14:49.220
terrorist attacks. So the surge was really working in the way it was intended. But the problem is that
00:14:56.040
the politicians in Washington saw that it was working. And even though General Petraeus had said,
00:15:02.380
you know, we can't pull out just because it starts to work. We've got to keep this in place for a period
00:15:08.480
of time here to ensure that the situation really stabilizes in Iraq before we pull out.
00:15:14.800
They said, no, this is close enough. Let's pull out now. And then it started to fall apart again.
00:15:18.600
And so, Brett, what this really illustrates is not a problem with Red
00:15:23.900
teaming, but one of the challenges of Red teaming.
00:15:30.020
Which is that if you don't have the support of senior leadership, if you don't have buy-in from
00:15:35.840
senior leadership, it doesn't matter how good the ideas you come up with are, they're not going to
00:15:40.840
work because they can't. And that's what happened with the surge. So even though a lot of what has been
00:15:48.100
done in the military with Red teaming is classified, the way that you know it's working
00:15:52.020
is that very quickly after they started to implement Red teaming in the late
00:15:58.780
2000s in the US military, it spread around the world. So the British adopted Red
00:16:05.440
teaming, the Canadians adopted Red teaming, the Australians adopted Red teaming, New Zealand,
00:16:10.900
even NATO ultimately adopted Red teaming, though they decided that the idea of teams was a little
00:16:15.200
bit too confrontational. And so they called it alternative analysis. But the point is, it was so
00:16:20.940
successful that most of the countries in the world that are allied with the United States have now set
00:16:25.940
up their own Red teaming programs or Red team training programs and are using this kind of approach.
00:16:34.780
So Red teaming has been used by intelligence agencies, militaries to defeat terrorists, win wars.
00:16:40.500
But then you talk about how you started seeing companies using Red teaming techniques as well. When
00:16:46.220
did you start seeing that? What are some examples of private or civilian organizations using Red teaming?
00:16:52.760
Yeah. So Brett, before I decided to give up almost half a year of my life to go through the Red team
00:16:58.880
training program at Fort Leavenworth, I wanted to make sure that it wasn't just me that thought this
00:17:04.120
was a good idea, that others in business saw this as valuable and something that they would want to
00:17:08.800
learn and use. So I talked to several friends of mine who are in senior leadership positions with some of
00:17:15.120
the most disruptive companies in the world, companies that other companies have really come
00:17:20.880
to fear because they're so good at disrupting other industries. And I talked to folks and what I found
00:17:26.240
out was that while none of them were really aware of this type of Red teaming that the military and the
00:17:31.900
CIA and others were doing, when I described the tools and the techniques that were involved,
00:17:38.360
they were really similar to some of the ways in which these companies approach their business.
00:17:42.180
So for example, I have a friend who's in a fairly senior position at Amazon. And when I described
00:17:48.280
Red teaming to him and described some of the tools to him, he said, you know, I've never heard of Red
00:17:52.200
teaming. I don't think Jeff has ever heard of Red teaming, but what you describe is very similar to
00:17:58.660
some of the things that we do. And he said, we have an internal process that we do to constantly stress
00:18:08.080
test our own strategies, to constantly challenge our own assumptions and to really try to look at
00:18:14.120
different parts of our business and try to disrupt them, to try to look at them the way a competitor
00:18:19.540
who wanted to disrupt us would look at them so that we can disrupt ourselves before someone does it for
00:18:25.780
us. And he said, you know, a lot of times I go to conferences and stuff and I talk with executives
00:18:31.280
from other companies and they say, how can we be more like Amazon? And he said, I always feel like kind
00:18:36.360
of a jerk because the only thing I can think of to tell them is that you could start over, because
00:18:39.800
this is so intrinsic in our DNA. It's so core to what we are as a company and it's been part of our
00:18:45.400
company since Jeff started it. He said, this is the first time I've heard of kind of a systematic way
00:18:51.420
that you could teach someone to think more like Amazon. And so when I heard that, that was really
00:18:55.780
validating. And I heard that from other companies as well. I heard that from folks at Kleiner Perkins
00:18:59.480
on Sand Hill Road who said, you know, some of these techniques you're describing are very similar
00:19:03.200
to the way that we vet companies for investments, that sort of thing. And when I heard
00:19:07.600
that, Brett, I knew that this was really something that was valuable and was worth taking the time to
00:19:11.620
learn how to do, to write a book, to share with other people and to set up a company to teach people
00:19:16.200
how to do this. So some companies are just doing it by themselves, but have there been companies who,
00:19:20.880
once they learned about what the military was doing with this more systematic approach to red teaming,
00:19:24.680
they're like, let's do that in our company. Absolutely. And a lot of, you know, a lot of our clients
00:19:30.340
because of NDAs, I can't identify by name, but we've worked with companies in pretty much every
00:19:35.160
industry, you know, from aerospace, transportation, technology, telecommunications. One company that
00:19:42.660
I can talk about briefly is Verizon, which has really made red teaming a central part of its strategic
00:19:50.020
planning process and has really figured out how to use these tools in a very effective way to evaluate
00:19:55.980
every major strategy before it's approved. And it has been game changing for them. They have
00:20:01.800
changed the direction of some of their major strategic initiatives
00:20:05.720
as the result of what they've learned from their red teams. And it's been really
00:20:12.040
powerful. Another organization that has used red teaming, one of the first that we trained was
00:20:17.600
the Development Bank of Japan, which is Japan's sovereign wealth fund. And they wanted to use these
00:20:23.780
tools to look at companies as investment targets that they were thinking of investing in, to make
00:20:29.620
sure that they were really putting their money in companies that could use it effectively,
00:20:34.760
and to make sure that the ways the companies wanted to use the money they were giving them
00:20:38.660
would lead to the success that they hoped to achieve. So there are a lot of companies
00:20:43.280
that have been able to use this since the book came out, and it's really spreading.
00:20:47.360
There's a lot of interest, like I say, not just in the United States, but around the world.
00:20:50.620
I just got back from a trip to Great Britain where I talked with a number of companies and
00:20:54.820
a number of organizations over there about how to use red teaming to really kind of,
00:21:00.360
like I said, disrupt yourself before someone else disrupts you. But it's not just companies.
00:21:04.540
It's not just large organizations that can do this. Individuals can red team themselves as well.
00:21:10.800
And we'll talk about that, because I think that was the big takeaway for me personally when
00:21:14.480
I read this book: I can apply this to myself, my own life, but also to my own company as well,
00:21:19.540
my own business. We'll talk about that here in a bit, but let's talk about the power. Like,
00:21:24.140
what do you think red teaming does? And you make the case that red teaming is really powerful in
00:21:29.240
helping us overcome human biases when we make decisions as individuals as well as in
00:21:34.660
groups. So what are the most common biases or, you know, errors in thinking that humans make?
00:21:44.260
It's a great question, Brett. And it's really important to understand that the military,
00:21:49.320
the CIA didn't just come up with these tools kind of, you know, on a whiteboard somewhere. Rather,
00:21:53.900
these are really based on science, primarily on the research that's been done over
00:22:02.460
the past 40, 45 years in cognitive psychology and human decision-making by people like Dr. Daniel
00:22:09.620
Kahneman, who wrote Thinking, Fast and Slow, the Nobel laureate, his colleague, Amos Tversky,
00:22:14.680
Dr. Gary Klein, and others. And what these scientists have seen and what these scientists
00:22:21.200
have proven in thousands of experiments is that Adam Smith was wrong. And what I mean by that,
00:22:29.680
Brett, is for the better part of the past 300 years, most economists and most people have thought
00:22:36.860
that Adam Smith's notion of what he called rational choice theory was the way people made decisions.
00:22:42.960
And what rational choice theory holds, and it is a simple theory, is that
00:22:47.780
we make the best decisions possible with the information available to us. And that if we make
00:22:54.580
a bad decision, it's because we didn't have enough information to make a better decision or because we
00:23:01.560
were swayed by strong emotions like love or hate or a really unhealthy obsession with tulips.
00:23:08.680
And so economists thought, well, this is how the world works. People do their best. And if they make a
00:23:14.340
bad decision, they would have made a better decision if they had more information. But what
00:23:18.680
Tversky and Kahneman and Klein and others have proven is that that's not how people make decisions.
00:23:25.840
That's how we wish we made decisions, but people's decisions, no matter
00:23:32.160
how smart they are, no matter how well-educated they are, no matter how experienced they are, no
00:23:37.180
matter how successful they are, are shaped by an array of biases, blind spots, heuristics, which is really
00:23:44.300
just a fancy term for mental shortcuts. And that these things, which are kind of hardwired into our
00:23:50.620
brain, skew our decision-making in ways that we're just not aware of. And, you know, we could talk
00:23:56.960
about what some of those are, but it's important to know that these are things that exist for a reason.
00:24:02.260
You know, these biases and these blind spots exist, these shortcuts exist because, you know, if you
00:24:07.420
were a hunter-gatherer on the African savannah and you stopped to really think deeply about whether
00:24:13.780
the lion that was approaching you was going to eat you or not, you probably wouldn't survive
00:24:17.860
to kind of finish that analysis. So our brains are wired in such a way to help us make really quick
00:24:23.660
decisions when we need to. But the problem is that we use that same approach to deal with
00:24:29.820
today's problems, which are much more complex, much more complicated than the ones that people
00:24:34.960
were encountering on the African savannah thousands of years ago. And that's where we run into difficulty.
00:24:40.100
So what are some of these heuristics or biases that you see frequently that really cost
00:24:45.240
organizations or individuals whenever they're making a decision?
00:24:48.560
Oh, there's so many. One of the ones that I think is really kind of endemic in business is
00:24:53.360
sunk cost bias. And sunk cost bias is basically the tendency that we all have when we lose something,
00:25:02.000
when we lose money in particular, to want to recoup our loss. And that desire is so strong that it can
00:25:09.740
often make us do really stupid things that end up costing us more money in the long term.
00:25:15.960
So for instance, at the simplest level, sunk cost bias is why you see people hitchhiking out of Las
00:25:20.920
Vegas, because they don't have enough money for a Greyhound bus, because they've lost their money at
00:25:26.560
the gaming tables and they keep betting in the hopes of recouping their losses until they're left with
00:25:31.440
nothing. But it also affects companies that should know better. So you see companies do things like
00:25:37.240
build a factory, you know, open a new factory and the factory will be unprofitable.
00:25:43.480
The factory doesn't make money because the workers aren't productive enough, the equipment isn't
00:25:47.640
good enough, whatever the reason. And rather than saying, you know what, we invested $250 million
00:25:52.500
to build this factory. It's really just been kind of a boondoggle. Let's cut our losses and move
00:25:57.900
on. What do they do? They say, well, let's put $50 million more into this factory and try to
00:26:03.080
boost productivity. Okay, maybe that works. But it doesn't, so they keep pouring money in. They say,
00:26:08.040
let's do another $25 million. Let's invest another $50 million. And pretty soon, the $250 million
00:26:13.360
factory costs $700 million, and it still hasn't turned a profit. So that's one bias that is really
00:26:20.480
dangerous and that red team thinking is designed to overcome. Another one is availability heuristic.
00:26:27.520
So the availability heuristic is really simple. We are much more aware of information that we've
00:26:33.260
just been given than information that we know from the past. So if every day you are, you know,
00:26:43.520
seeing stories on the news about how great some new technology is, you're much more likely to look
00:26:50.200
at it favorably because of that recent information, despite the fact that you saw much more detailed
00:26:56.020
information two years ago that said this technology had fundamental flaws in it because it's not available
00:27:00.980
to you in your mind as readily. And there's tons of them. There's negativity bias, which is that we
00:27:05.920
tend to recall negative or painful experiences much more strongly than positive ones. So if we were
00:27:12.200
successful, for instance, in a particular business strategy for three years, and then all of a sudden,
00:27:19.820
we blew it one time, we're likely to stop pursuing that strategy because of that one bad experience and
00:27:26.980
ignore the fact that in three of the past four instances, it was incredibly successful because the
00:27:31.920
pain is stronger in our mind than the success. Like I said, it's hardwired into our brains and it affects
00:27:38.260
our thinking on complex problems as well as simple problems like that.
00:27:41.740
So those are examples of decision heuristics and biases that can trip us up on a group basis, but also on an
00:27:46.040
individual basis. But you also highlight that our thinking can change and lead us astray when we
00:27:50.740
get into groups. And one of the ones that stood out to me was the Abilene
00:27:54.860
Paradox. What's the Abilene Paradox and how can that lead us astray?
00:27:59.020
Well, the Abilene Paradox, simply put, Brett, is what happens when we say yes, but we mean no.
00:28:05.680
And, you know, it's something that anyone who's worked in an organization is probably really
00:28:10.100
familiar with. Someone will pose a question, there'll be a problem that the group is trying
00:28:14.300
to solve, and no one has a good answer. And then someone will throw out an answer
00:28:17.980
and say, well, we could do X. Now, everyone at the table thinks that X is a horrible idea,
00:28:25.680
but they ultimately do it anyways because no one comes up with a better idea. And the reason that
00:28:32.260
it's called the Abilene Paradox, and no offense to your listeners in Abilene, Texas, is that the
00:28:37.880
psychologist who first identified this several decades ago was sitting at home with his family
00:28:43.060
on a weekend. It was a Sunday, they had no idea what to do, it was hot, and they were trying to figure out how to
00:28:49.500
kill the afternoon. And they'd been sitting around for, you know, several minutes trying to figure out
00:28:55.040
how they could spend the day. No one had a good answer. And then finally, someone said, well, you
00:29:00.300
know, we could always go to Abilene. Now, no one in the room wanted to go to Abilene. Everybody
00:29:05.880
apparently hated Abilene. I've never been to Abilene, but presumably it's not a great place to visit on a
00:29:11.120
Sunday afternoon. And yet, after a few minutes, somebody else said, yeah, yeah, we could go to
00:29:17.320
Abilene. And then the next thing you know, somebody else said, yeah, that's great. Let's go to Abilene.
00:29:22.020
And then everyone's in the car, they go to Abilene, they have a horrible time. And on the way back,
00:29:26.760
everyone's grumpy and unhappy. And the conversation takes the turn that you probably expect it to,
00:29:32.820
which is somebody says, hey, why did you want to go to Abilene? And mom says, I didn't want to go to
00:29:39.020
Abilene. I only said we should go to Abilene because dad said we should go to Abilene. And dad
00:29:42.680
says, I didn't want to go to Abilene. I only said we should go to Abilene because grandpa said we
00:29:48.760
should go to Abilene. And grandpa says, well, I only said we should go to Abilene because I couldn't
00:29:53.500
think of anywhere else to go. And, you know, it's a funny story, but it really illustrates a big problem
00:29:59.100
that companies deal with all the time, which is, you know, when people kind of agree to something that
00:30:03.440
they don't really believe in. And it's very dangerous. You know, even more dangerous is
00:30:09.160
something that people are probably a little more familiar with, which is groupthink, which is,
00:30:12.820
you know, the pathology of every organization is that over time, it will start to drink its own
00:30:20.940
Kool-Aid and stop challenging its own beliefs. And everyone will start kind of aligning their
00:30:27.140
thinking. And that's really dangerous because, you know, as General Patton once said,
00:30:32.620
if everybody's thinking alike, somebody isn't thinking. And that's what red teaming and red team
00:30:38.040
thinking is designed to do is make sure that everyone is not thinking alike to kind of promote
00:30:43.640
divergent thinking so that you can converge on the best idea, regardless of where it comes from in the
00:30:50.320
organization. By the way, Brett, a lot of the tools that are involved in red teaming
00:31:00.380
were created by the military because it recognized that it existed in an
00:31:00.380
incredibly strong hierarchical culture and that the hierarchy of the culture of the military
00:31:06.020
prevented good ideas from being surfaced if they didn't come from the most senior people in the
00:31:11.120
room. So a lot of the tools we teach, a lot of the techniques we teach are really designed to
00:31:15.540
help people anonymously surface their ideas and let people evaluate those ideas independent from
00:31:25.040
who surfaced them so that the best idea wins, regardless of where it came from.
00:31:30.500
Well, let's talk about some of these red teaming techniques that organizations and people
00:31:33.780
do. And your book, Red Teaming, it's more about how businesses can use red teaming in a systematic,
00:31:41.520
large-scale way. But as you point out in the book, too, you can also use these things on an
00:31:46.240
ad hoc basis in your own small organization or with yourself as well. So let's talk about one that
00:31:52.540
stood out to me was a key assumptions check. What is that? What's the goal of a key assumptions check?
00:31:58.780
So this is a really important technique. A key assumptions check, it's a little bit of a
00:32:04.420
complicated technique, but simply put, it's a way of conscientiously and intentionally challenging your
00:32:11.720
own assumptions and making sure that they're strong enough to base your plan on. Now, it's important to
00:32:17.940
understand there's nothing wrong with assumptions. We have to make assumptions in order to make any
00:32:23.360
decision, to make any plan, to develop any strategy. Assumptions are essential to the planning
00:32:29.740
process. The problem is that a lot of people get confused between assumptions and facts. So simply
00:32:38.600
put, a fact is something that is objectively true right now that you can prove. It's not something we
00:32:45.060
hope will be true in the future. It's not something that we wish were true. It's really true right now,
00:32:50.300
and you can go out and show someone that it's true. So if I say, our company made $150 million last
00:32:57.020
quarter, that's true unless our accountants were cooking the books. You can go and get the financial
00:33:02.800
report and see that that is true. An assumption, ideally, is a fact that is not yet true, but will be
00:33:11.440
true in the future. So if I say,
00:33:16.640
we're going to make $150 million next quarter, that's an assumption. Even if I've got the most
00:33:23.460
bulletproof quantitative analysis by top folks on Wall Street that tells me this is exactly how much
00:33:29.580
money I'm going to make, it's still an assumption because it hasn't happened yet. So ideally, like I
00:33:34.480
said, assumptions are just facts that are not yet true, and we make our plans based on them. The problem,
00:33:40.080
Brett, is that too often, assumptions are really just wishful thinking. And so a key assumptions
00:33:47.800
check is about identifying the assumptions that underlie your strategy, your plan, or your decision,
00:33:53.440
and then subjecting them to a series of questions that are really designed to poke and prod them and
00:34:00.520
make sure they don't pop on close inspection. And some of those questions are things, for instance,
00:34:05.780
like, is this assumption based on biases or preconceived notions, going back to what we
00:34:11.500
just talked about, about biases and heuristics? Is this assumption based on a historical precedent?
00:34:16.880
And if so, is that historical precedent valid? Because a lot of times we make assumptions based
00:34:22.100
on our past experiences, and it's not really analogous. Other questions that we ask include things like,
00:34:27.820
what has to happen for this assumption to be true? That's something people often don't think about.
00:34:35.000
And another equally important one, if this assumption proves true, does it remain true
00:34:41.180
under all conditions? So that's the type of questions that we ask. And the important thing
00:34:45.840
is that it's not just saying, check your assumptions. It's a very systematic process
00:34:51.400
for checking your assumptions. You know, I mentioned the Development Bank of Japan earlier,
00:34:55.460
and when I taught this technique to some of the senior leaders of the Development Bank of Japan
00:34:59.080
and taught them how to do a key assumptions check, during a break, one of them said to me,
00:35:05.020
you know, this is really important because we have a written process of how we evaluate
00:35:11.040
investment targets. And one of the steps on that process is to check the assumptions that this
00:35:18.920
investment plan is based on. He said, the problem is that the way it works in practice is this.
00:35:24.580
We all sit around a nice conference table in our, in our office in Tokyo, and we get to that point
00:35:32.580
on the checklist and whoever's running the meeting says, have we checked the assumptions that this
00:35:38.060
investment plan is based on? And we all nod very earnestly at each other. And then we check that box
00:35:43.980
and we move on to the next box on the checklist. He said, the process that you've taught us doesn't
00:35:48.820
allow for us to escape that task because you have to go and ask these specific questions.
00:35:54.900
So that's why these tools are as intense as they are: to really make
00:36:01.740
people do the work, not just say they did the work. So on a personal level, you can use this too,
00:36:06.880
if you're making a decision like, should I buy a house? There are some assumptions there when you say,
00:36:11.780
yes, I'll buy a house. Assumptions like, well, I'm assuming I'm going to be able to get
00:36:15.920
home insurance. I'm assuming the mortgage will get approved. I'm assuming that I'll have a job
00:36:20.340
in the future where I can pay the mortgage. And so that sort of check will help you make sure that
00:36:24.080
you've stress tested them and you're able to plan for contingencies where those assumptions
00:36:27.820
aren't true. Absolutely. What you've just pointed out there is really important, which is that while
00:36:33.300
these tools can be done in a very formal setting, they can also be used informally. Now let's be
00:36:40.360
clear on what the difference is, though. If you do what you just said, that's valuable, and
00:36:46.660
you're much more likely to make a good decision by asking those questions that you just
00:36:51.740
asked than if you just said, Hey, I want to buy a house. Sounds like a good idea. Do I have enough
00:36:56.220
money in my bank account for a down payment? Sure. Let's do it. Pass my credit check. We're good to go.
00:37:02.440
However, I just want to be clear. It's not as good as having somebody else
00:37:07.580
look at your assumptions and ask those questions because like we talked about all of us, no matter
00:37:14.980
how smart we are, no matter how well educated we are, no matter how successful we are, can't see
00:37:20.380
what we can't see. You know, the Nobel Prize-winning economist Thomas Schelling said the one
00:37:25.620
thing that no one can do, no matter how smart they are is come up with a list of things that would
00:37:31.120
never occur to them. And that's what you get from red teaming in a group is another set of eyes that
00:37:38.600
looks at the problem, that looks at the decision and helps you see what you can't see. So that's
00:37:44.180
the difference. Still valuable, still incredibly valuable to ask those questions as an individual,
00:37:49.460
even more valuable if you can do it, even with a small group.
00:37:53.720
Well, on a personal level, one thing you can do to help you do some red teaming with someone else
00:37:57.600
is like on that house buying decision, have a personal financial advisor, consult them and
00:38:03.420
they can start saying, well, let's think about these things. They'll start picking at it and
00:38:07.200
be like, well, let's check out these things as well. Exactly. Find someone you trust to ask you those
00:38:12.400
tough questions and to help you answer them honestly. That's really what it's about.
00:38:16.160
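The question-asking process Bryce describes can be roughed out in a few lines of code. This is only an illustrative sketch of the technique as explained in the conversation; the function name and the example assumptions are mine, not from the book.

```python
# Illustrative sketch of a key assumptions check; the probe questions
# paraphrase the ones described in the interview.
PROBE_QUESTIONS = [
    "Is this assumption based on biases or preconceived notions?",
    "Is it based on a historical precedent, and is that precedent valid?",
    "What has to happen for this assumption to be true?",
    "If it proves true, does it remain true under all conditions?",
]

def key_assumptions_check(assumptions):
    """Pair every assumption with every probe question so none can be skipped."""
    return [(a, q) for a in assumptions for q in PROBE_QUESTIONS]

# The house-buying decision from the conversation:
checklist = key_assumptions_check([
    "The mortgage will be approved.",
    "I will still have a job to pay it off.",
])
print(len(checklist))  # 2 assumptions x 4 questions = 8 items to answer in writing
```

The point of pairing every assumption with every question is the one Bryce makes about the Development Bank of Japan: you can't just nod at a checkbox, because each pair demands its own written answer.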
But, you know, as we've rolled red teaming out around the world, we found that
00:38:22.300
there are companies like Verizon and like Development Bank of Japan and some of the other
00:38:26.120
large companies we've worked with who've been able to set up red teams in their organizations on an
00:38:31.200
ad hoc basis, train people in these tools and techniques, and then let them really spend several
00:38:36.860
days evaluating important strategic decisions. And that's incredibly valuable. But we found a lot
00:38:43.060
of other companies, a lot of other organizations, and a lot of individual leaders who say, I really
00:38:49.400
want to use these tools and techniques to make better decisions, but I don't have the ability to set up a
00:38:54.980
team. I don't have the ability to get, you know, half a dozen people trained in these techniques.
00:38:59.320
What can I do? So we figured out a way to modify some of these techniques and teach people how to
00:39:04.440
do them individually, or by just grabbing a couple of people, going into a conference
00:39:08.660
room and using them on a less formal basis. Like I say, it's not quite as powerful as the formal
00:39:14.340
process, but it's still effective and it's still better than doing nothing. It's still better than not
00:39:19.640
challenging your own thinking. Another technique that can be used on an ad hoc basis that I thought
00:39:25.620
could be potentially powerful is this four ways of seeing. What's that, and what's the goal
00:39:31.180
there? That is a really cool technique. And it's really one that I think is important for people
00:39:35.820
to use. And you are absolutely right. This is something you can do as an individual, even effectively.
00:39:40.100
So at the simplest level, four ways of seeing is something that the military created to deal with
00:39:47.680
the recognition that, when putting together plans, they often failed to consider how those
00:39:55.620
plans looked from perspectives other than the U.S. Army's. So I'll give you an example.
00:40:01.980
My instructor when I went to the red teaming university, Colonel Kevin Benson, was literally the
00:40:06.940
head of the team that planned the invasion of Iraq. And so he'd seen
00:40:13.040
firsthand why we needed these tools. And when we were learning this tool, four ways of
00:40:19.260
seeing, he told our class a story. He said, you know, I led the team that planned the invasion of
00:40:25.220
Iraq and we made a lot of assumptions about the Iraqi people, even though we'd never talked with any of
00:40:33.240
them. And he said, one of the biggest and most damaging assumptions we made because it was so
00:40:39.900
colossally wrong, was that we believed, and these were his exact words, that
00:40:50.760
inside every Iraqi was a little American just dying to get out. And he said, what that belief
00:40:58.700
led us to do was to decide things like it's okay that we're going to black this country out, take
00:41:05.820
out its power grid. And yes, people are going to lose their air conditioning in the desert in the
00:41:11.760
middle of the summer. Yes, their refrigerators are going to stop working and all their food's going
00:41:15.560
to spoil. Yeah. They're probably not going to be able to get clean drinking water for a while,
00:41:19.300
but they're going to be so happy that we've freed them from the boot of Saddam Hussein
00:41:26.540
that they won't care, that they'll be good with that. He said, you know, as soon as you
00:41:32.720
look at it that way, though, as soon as you say, is that really true? You realize
00:41:36.180
it's just complete BS, right? Because anyone who's familiar with Maslow's hierarchy of needs knows that
00:41:42.940
self-actualization is the top of the pyramid and things like food and water and shelter and security
00:41:51.640
are at the bottom of the pyramid. And you can't get to the top of the pyramid without a strong
00:41:55.760
foundation at the bottom of the pyramid. So in reality, if you don't have, you know, food and
00:42:02.500
water, you don't really care much about whether you're living in a dictatorship or democracy.
00:42:06.600
So they created this tool called four ways of seeing to force themselves to look at assumptions,
00:42:13.880
look at problems and look at plans from the perspective of other key stakeholders. Like in
00:42:18.580
the case I just gave the Iraqi people and the way it works, Brett is really simple.
00:42:22.540
You create a quad chart, a typical business quad chart. Though it's important to remember on this
00:42:27.960
one, there's no right box; all boxes are equal. In the upper left-hand corner, you look at, how
00:42:36.160
do we view ourselves? Now, we could be your company, or it could be you as an individual.
00:42:41.160
In the upper right-hand box, you say, how does X view X? X could be anyone. It's any stakeholder
00:42:51.840
you're looking at. It could be a competitor. It could be your customers. It could be your trade
00:42:56.080
union. It could be your boss, if you're doing this to figure out how to get a raise. And you write,
00:43:02.580
how do they view themselves? Then in the lower left-hand quadrant, you write, how do we view them?
00:43:09.440
And in the lower right-hand quadrant, you write, how do they view us? And you spend some time filling
00:43:17.540
these boxes out. And what you get from this, Brett, is a better understanding of what other
00:43:24.800
people's perspectives are, what their pain points are, what their issues are. And then you can craft
00:43:30.800
your message. You can craft your plan to address some of those if you want to make it more likely to
00:43:38.820
succeed. So if you're using this at the simplest level to figure out how best to approach your
00:43:43.660
boss and get a raise, by doing this, you might find out, for instance, that your boss is under
00:43:50.880
tremendous pressure to keep costs where they're at. However, your boss is also under tremendous
00:43:56.580
pressure to increase sales in your department by 12% this year. So you're more likely to succeed then
00:44:05.700
if you go to your boss and instead of saying, hey, I'm a hard worker, give me more money to say,
00:44:10.220
hey, look, I recognize that you're under a lot of pressure to keep costs down. But I also recognize
00:44:16.860
that you're under a lot of pressure to increase sales. I've been working my tail off. I've been
00:44:21.320
putting in extra hours to help you do that. And I'm willing to continue to do that.
00:44:25.760
But to do that, I need to get a little bit more on my end as well. So if you will give me what I'm
00:44:35.160
asking for, I'll commit to you that I will help you achieve that goal next year. And here's what
00:44:39.580
I'll do to do that. So you see, you're tailoring your message to meet their needs. You're speaking
00:44:43.900
to their needs rather than your needs. And that can be really effective. That can be a real powerful tool.
00:44:49.640
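The quad chart Bryce just walked through can be sketched as a small data structure. This is a hypothetical rendering of the exercise, not code from the book; the question wording follows the conversation, and the raise-negotiation answers are invented for illustration.

```python
def four_ways_of_seeing(stakeholder, we_view_us, they_view_them, we_view_them, they_view_us):
    """Build the quad chart; no box is the 'right' one, all four carry equal weight."""
    chart = {
        "How do we view ourselves?": we_view_us,                        # upper left
        f"How does {stakeholder} view {stakeholder}?": they_view_them,  # upper right
        f"How do we view {stakeholder}?": we_view_them,                 # lower left
        f"How does {stakeholder} view us?": they_view_us,               # lower right
    }
    # The exercise only works if every quadrant actually gets filled in.
    empty = [q for q, answer in chart.items() if not answer.strip()]
    if empty:
        raise ValueError(f"Unfilled quadrants: {empty}")
    return chart

# The raise-negotiation example from the conversation, with made-up answers:
chart = four_ways_of_seeing(
    "my boss",
    "A hard worker who has earned a raise.",
    "Under pressure to hold costs flat and grow sales 12% this year.",
    "The gatekeeper for my raise.",
    "One of many salary line items.",
)
```

Note the check that refuses an empty quadrant: as Bryce says, there's no right box, and the exercise only pays off if all four perspectives are actually filled in.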
Another technique you highlight in the book that I've used personally is a pre-mortem analysis.
00:44:56.600
This is one of my favorite techniques, Brett. It's a technique that was developed by Dr. Gary Klein,
00:45:02.080
who I've had the pleasure to work with both while researching the book and since then.
00:45:06.340
And this is a technique that is basically all about contemplating failure, answering the question,
00:45:13.040
what's the worst that could happen? But not stopping there, not just saying what's the worst that
00:45:17.300
could happen. But then working backwards and looking at what are all of the steps that would
00:45:24.300
have to happen between that colossally bad failed state that I've now envisioned and the present day.
00:45:35.700
So when we do this technique in practice, we usually put a timeframe out there. So we'll say,
00:45:40.860
for instance, the plan we're looking at, it's going to launch on January 1st, 2020. Let's assume
00:45:46.220
it's January 1st, 2022. And our plan has failed colossally. It hasn't just failed to meet its
00:45:56.340
target. It's actually caused real damage to our organization. And then you think about what does
00:46:03.120
that look like? And then you work back and say, okay, what are the steps that
00:46:08.800
would have to happen from this moment from January 1st, 2020, when we say yes to this strategy to
00:46:15.840
January 1st, 2022, when this colossal failure has occurred? The value in doing that is it shows you
00:46:22.820
the things that could lead to a bad outcome earlier in the process while there's still time to avoid the
00:46:30.760
bad outcome. So if you, for instance, find that one of the things that led to that colossally bad
00:46:36.860
outcome was hiring additional staff and not giving them proper training, well, two things can happen
00:46:43.040
as a result of that. One is you now know that you should make sure when you're, before you approve the
00:46:47.880
plan, that you include adequate training resources for the staff you're planning on hiring. You can also
00:46:53.700
put a flag post out there in the future and say, after we've hired these folks, let's find a way of
00:47:01.320
checking after three months to make sure they've got the skills they need to be effective in their
00:47:06.480
new jobs so that we make sure that they got the adequate training. So it's a way of identifying
00:47:11.700
ways that you could prevent that failure from happening more than just figuring out what that
00:47:17.640
failure looks like. So yeah, I've used that. So you could use this on an individual level. I think
00:47:22.180
a big decision that people often make is, should I quit my job and go all in on my business? And yeah,
00:47:30.360
you got to have some optimism to do that, but it also helps to think about, okay,
00:47:33.640
let me ask this question. The business has failed. What happened to cause the business to fail? And
00:47:45.320
then you can start using your imagination to figure that out, and then
00:47:50.860
create plans to prevent those possible failure points from occurring.
00:47:50.860
Absolutely. By the way, this exercise is a lot of fun too.
00:47:53.920
No, it is a lot of fun. But I think one of the dangers with pre-mortem analysis is that if you're
00:48:02.620
neurotic, it can be unproductive in a way if you're not careful,
00:48:06.620
because you start thinking like the sky's falling. You let that
00:48:12.240
negativity bias hijack your thinking. But with pre-mortem, you're thinking worst
00:48:12.240
case scenario, but also thinking of solutions to those problems you come up with.
00:48:18.420
Well, you raise a really good point, Brett, which is that when you're doing a pre-mortem analysis,
00:48:22.600
and then really when you're doing any of these red teaming tools, the point of doing this is not
00:48:28.400
because you think you have a lousy plan and that your plan's going to fail. The point of doing this
00:48:33.420
is you think you have a good plan, but you want to make it better. And that's really the point of
00:48:37.480
red teaming. The job of a red teamer is not to make a better plan. The job of a red teamer is to
00:48:44.480
make the plan better. So by doing these tools, instead of courting disaster, you're really trying
00:48:51.900
to ensure success and you have to approach your work that way, both as an individual and as a group
00:48:56.780
when you're red teaming. That raises my next question, which is, I think a lot of people
00:49:01.800
or organizations avoid red teaming because it is contrarian in nature and it brings up conflict.
00:49:07.540
So let's say you're doing this within a small group, like your small business or the group you
00:49:12.340
belong to in a larger corporation. How do you bring up contrarian views that come about through red
00:49:19.780
teaming without them being rejected and stepping on toes? Another really good question. So one of the
00:49:26.400
things that we stress, and it's so important for successful red teaming, when we teach people
00:49:32.000
how to red team, we don't just teach them these tools and techniques. We teach them how to communicate
00:49:36.460
the results effectively. Because if you don't, it doesn't matter how great the insights you come
00:49:42.200
up with are. If the people who are going to be making the decisions can't hear and act on what
00:49:47.580
you've recommended, then your whole red teaming exercise has just kind of been a collective navel
00:49:55.300
gazing excursion. So to avoid that, it's really important. It's critical, whether you're red
00:50:03.080
teaming in a group or as an individual, to approach the work of red teaming in a constructive and
00:50:08.000
collegial manner. And to recognize that there's a difference between being a skeptic and being a
00:50:14.600
cynic. There's a difference between being contrarian and being critical. So a skeptic doesn't
00:50:24.320
necessarily believe that the plan is bad. They just want to see the proof that it's
00:50:30.260
good. A contrarian doesn't want to rip things apart for the sake of ripping things apart. A
00:50:37.840
contrarian wants to look at things from different perspectives to make sure that the problem has been
00:50:43.360
examined from every facet. So when you are approaching your work as a red teamer, you need to
00:50:50.320
be approaching it with the mindset of helping the people who develop the plan. If you're working in
00:50:56.080
a group, helping the people who develop the plan, make that plan better. Not from the perspective of
00:51:01.860
showing everyone that you're smarter than them, that you are more clever than them, that you saw what
00:51:07.140
they missed. It should be constructive and collegial. If you're doing red teaming individually too,
00:51:12.720
if you're going to share your work, don't be a jerk. That's kind of our ground rule. It's real
00:51:20.080
simple. If you're going to share your red teaming work, don't be a jerk. If you're a jerk about it,
00:51:24.900
nobody will listen to what you're saying. The first rule that I learned when I was going through the
00:51:30.000
red teaming training in the army was rule number one of red teaming is don't be an asshole.
00:51:34.160
Because if you go to the group that has asked you to do a red teaming analysis and you say,
00:51:43.140
you know what, we looked at your plan and it's really stupid. You guys failed to account for
00:51:49.300
these three things. And by the way, what would happen if X happened? You didn't think about that,
00:51:54.380
did you? Well, we did. And here's what would happen. That will guarantee that A, no one listens
00:52:00.300
to your red teaming analysis and B, that you will never get a chance to do another one.
00:52:04.660
The way to approach it is to say, hey, we looked at your plan. We looked at your strategy. We think
00:52:10.180
you guys came up with a really good plan here at base, but we see some key areas where it could be
00:52:15.840
made even stronger. And here's what those are. So it's really about how you present your findings,
00:52:21.600
how you approach your work and to be constructive and collegial.
00:52:26.660
And another thing that you pointed out that I thought was useful too in the book is let's say you're a
00:52:29.840
leader and you have a group of people doing red teaming, coming up with contrarian views and
00:52:35.320
showing weaknesses. That doesn't mean you have to take their advice and put it into action, right?
00:52:44.300
Like it's just more information for you. Like you're still the leader. You still get to make the
00:52:47.580
decision. Absolutely. Red teaming does not take away any decision-making authority from leaders
00:52:55.660
because red teams don't make decisions. Red teams provide leaders, provide decision-makers
00:53:01.480
with additional information so that they can make better decisions. That's really what red teaming
00:53:06.400
is about. And it's important to understand that it's not about saying, here's what you have
00:53:13.220
to do. It's about saying, here's another option, or here's another way of thinking about this problem
00:53:19.300
that you may want to consider before you make your final decision. That's very different.
00:53:23.860
And that's about empowering leaders, not about taking away their decision-making authority.
00:53:29.020
And I think another thing too, like you have to red team your red teaming sometimes as well,
00:53:32.520
because I can see a situation where a red team makes a suggestion or they show a leader
00:53:37.740
contrarian information, but like they don't have all the information, right? There's other factors
00:53:42.460
that the leader is taking into consideration when making the decision, but the red team didn't even
00:53:46.740
think about that themselves. Absolutely. And that goes back to this point that red teaming is not
00:53:51.100
about coming up with a better plan. It's about making the plan better. So you're just simply
00:53:55.240
offering some additional observations, some additional insights. Maybe they're not valid.
00:54:00.400
Maybe they are. The point of a red team is not to be right. It's to make the organization think
00:54:06.520
more deeply. So if you look at it, you know, the Israelis, for instance, have a red teaming
00:54:12.160
organization in their military intelligence directorate called Ipcha Mistabra. I don't speak
00:54:17.040
Aramaic, but I'm told it translates into, on the contrary, the opposite is probably true.
00:54:23.620
And one of the things that's been key to the success of this organization, and I've talked with
00:54:29.140
Israeli intelligence officers who've explained this to me, that the people who are in this group
00:54:34.360
are not rated by how often they are right and the organization is wrong. They're rated on how much
00:54:41.960
their analysis gets the organization to think more deeply about its own conclusions.
00:54:48.040
Well, Bryce, this has been a great conversation. Where can people go to learn more about the book
00:54:52.560
So the book, Red Teaming, How Your Business Can Conquer the Competition by Challenging Everything
00:54:57.260
is available on Amazon or wherever books are sold. And you can come and visit our website,
00:55:03.100
redteamthinking.com. Redteamthinking.com will tell you more about red teaming,
00:55:08.060
give you more resources, and also give you information about upcoming workshops that we're
00:55:12.420
offering if you're interested in getting trained in some of these tools and techniques.
00:55:16.040
Fantastic. Well, Bryce Hoffman, thanks for your time. It's been a pleasure.
00:55:18.620
Likewise, Brett. Really enjoyed the conversation.
00:55:21.060
My guest today is Bryce Hoffman. He's the author of the book, Red Teaming. It's available on Amazon.com
00:55:25.280
and bookstores everywhere. You can find out more information about his work at his website,
00:55:28.380
brycehoffman.com. Also check out our show notes at aom.is/redteaming.
00:55:32.620
You can find links to resources where you can delve deeper into this topic.
00:55:38.060
Well, that wraps up another edition of the AOM Podcast. Check out our website at
00:55:45.900
artofmanliness.com where you can find our podcast archives, as well as thousands of articles
00:55:49.400
we've written over the years about personal finances, how to be a better husband, better
00:55:52.660
father, you name it. We've probably covered it. And if you haven't done
00:55:56.520
so already, I'd appreciate it if you take one minute to give us a review on iTunes or Stitcher. It helps
00:55:59.840
out a lot. And if you've done that already, thank you. Please consider sharing the show with a
00:56:03.300
friend or family member who you think will get something out of it. And if you'd like to enjoy ad-free
00:56:06.660
episodes of the AOM Podcast, you can do so in Stitcher Premium. Head over to Stitcher Premium,
00:56:10.320
sign up, use code MANLINESS for a free month trial, and you can start enjoying ad-free episodes of the
00:56:14.880
AOM Podcast. As always, I appreciate the continued support. Until next time, this is Brett McKay,
00:56:19.860
reminding you not only to listen to the AOM Podcast, but put what you've heard into action.