The Art of Manliness - July 31, 2025


#591: Solve Problems Before They Become Problems



Summary

So often in life, we get stuck in a cycle of reaction. We tackle the most urgent task, we deal with emergencies, we put out fires. We intuitively know we'd be better off if we figured out a way to be more proactive rather than reactive, thereby preventing fires from starting in the first place, but we can't seem to switch our approach. My guest today explores why that is, and what we can do to start solving the problems of business, life, and society before they become problems.


Transcript

00:00:00.000 Brett McKay here and welcome to another edition of the art of manliness podcast so often in life
00:00:11.560 we get stuck in a cycle of reaction we tackle the most urgent task we deal with emergencies we put
00:00:16.220 out fires we intuitively know we'd be better off if we figured out a way to be more proactive rather
00:00:20.660 than reactive thereby preventing fires from starting in the first place but we can't seem
00:00:24.480 to switch our approach my guest today explores why that is and what we can do to start solving the
00:00:28.820 problems of business life and society before they become problems his name is Dan Heath today we
00:00:33.340 talk about his latest book upstream the quest to solve problems before they happen we begin our
00:00:37.520 conversation discussing the issues that keep us from nipping problems in the bud including problem
00:00:41.480 blindness lack of ownership and tunneling along the way Dan shares insights on how to overcome these
00:00:46.140 roadblocks we then shift gears and explore how to find the best upstream solutions to problems which
00:00:50.780 requires getting as close as possible to the problem while also being able to survey the system it's
00:00:54.860 embedded in from a bird's eye view Dan explains the principles at play with plenty of real life
00:00:59.040 examples of how these tactics were used to effectively tackle big seemingly intractable social problems lots
00:01:04.500 of great insights that you can apply to solving problems in your personal life business and
00:01:07.680 community after the show's over check out our show notes at aom.is slash upstream
00:01:12.160 all right Dan Heath welcome to the show hey thanks Brett glad to be here so I've been following your
00:01:27.700 work for a long time since made to stick and you got a new book out upstream the quest to solve
00:01:33.080 problems before they happen curious how is this book a continuation of the work you've done with
00:01:38.360 your brother chip on your other books well this one was a slow burn to be honest the first time
00:01:44.440 I opened a file called upstream notes was 2009 so this has been in the back of my brain for over a
00:01:51.800 decade and there were two things that happened right about the same time that got me interested in
00:01:57.500 this topic the first was I heard a parable somewhere I can't even remember where but the parable goes like
00:02:03.660 this you and a friend are having a picnic on the side of a river and uh just as you've kind of
00:02:11.520 laid out your picnic blanket you're ready to get started you hear a scream you look behind you there's
00:02:16.340 a child in the river apparently drowning splashing around and so you and your friend just instinctively
00:02:21.940 jump in grab the kid bring him to shore and just as your adrenaline from the save starts to die down a
00:02:29.400 bit you hear another scream you look back there's a little girl in the river drowning so back in you go
00:02:35.200 you fish her out and no sooner have you brought her to shore than you look back there's two more kids
00:02:40.080 drowning in the river and you begin this kind of revolving door of rescue in and out and just as
00:02:45.540 you're starting to get really fatigued from this work you notice your friend is wading over to the
00:02:50.320 shore and steps out as if to leave you alone and you say hey where are you going I can't do this by
00:02:56.040 myself there's all these kids drowning and your friend says I'm going upstream to tackle the guy
00:03:01.920 who's throwing all these kids in the river and that story really stuck with me and then it feels
00:03:08.880 like maybe a month later I was having a conversation with an assistant police chief in a Canadian city and
00:03:15.020 he told me a story that just resonated with that parable and what he told me was a kind
00:03:22.120 of thought experiment where he said imagine you got two police officers and one of them goes downtown
00:03:29.440 in the morning to an intersection that is kind of notoriously chaotic and and just by stationing
00:03:35.860 herself there visibly she gets the drivers to be a bit more cautious to be more careful and her presence
00:03:42.300 deters accidents from happening and then the second officer goes to a different part of downtown where
00:03:48.760 there's a prohibited right turn sign and she hides around the corner and waits for people to break that
00:03:54.900 rule and then she nabs them and gives them a ticket and his question was which of these officers is doing
00:03:59.860 more for public safety and he said indisputably it's the first that's preventing accidents from happening
00:04:06.140 but he said guess who gets promoted in the department guess who gets rewarded guess who gets raises
00:04:12.240 it's the second officer because she comes back with this stack full of tickets that's the evidence of her work
00:04:19.000 and something about those two things together just got me thinking in depth about this issue of
00:04:25.920 why are we so often drawn to reactive elements of life like the cop who's reacting to people who make this
00:04:33.120 illegal turn when it theoretically at least would be desirable to do a better job preventing
00:04:39.980 problems going upstream and tackling the guy who's throwing the kids in the river rather than
00:04:44.180 perpetually saving the kids downstream so that was the birth of this well yeah as you said it's fun i
00:04:50.080 mean it's not fun but it is kind of fun to react to problems because you feel important
00:04:53.980 you feel like you're doing something then as opposed to you know thinking about well how can we make this
00:04:59.340 thing not happen at all yeah i mean there's a kind of heroism that comes from downstream response i mean
00:05:06.360 in some of the obvious ways you know a firefighter putting out flames or
00:05:10.200 a lifeguard jumping in to to save someone who's drowning there's genuine heroism there
00:05:14.660 we even have forms of it in in white collar jobs you know the the person who
00:05:19.740 stays up all night to meet the critical deadline and gets a lot of praise around the office and
00:05:25.300 and i suspect we all know some people who almost seem to live for those moments in fact i've gotten some
00:05:32.200 some emails from some people who said they suspect that their colleagues almost want the flames to
00:05:37.360 break out so they can be the firefighter and i think while certainly we should be glad there are
00:05:42.660 people around to save the day i mean my point of view is the need for heroics is usually pretty good
00:05:49.460 evidence that there's a systems problem right it's it's great that the lifeguard saves the kids in the
00:05:54.760 ymca pool but if things have been properly configured if you didn't have kids in the pool who are weak
00:06:00.480 swimmers if the lifeguard chair had been put exactly where it's supposed to be so there are no blind
00:06:04.960 spots for the lifeguard if the lifeguard had been taught better scanning techniques maybe there never
00:06:09.520 would have been a drowning incident in the first place and so that that riff on heroism like is a
00:06:15.340 hero the person who saves the day or is a hero the person who keeps the day from needing to be saved
00:06:20.900 that really got in my head so let's talk about so i mean i think we all agree we'd rather like
00:06:26.760 prevent the problems than have to deal with them when they do happen but as you start off in
00:06:31.740 the book you talk about how it's really hard to solve problems upstream because oftentimes we
00:06:36.960 can't even see that the problem exists and you call this problem blindness and in
00:06:42.120 your research you say there's a lot of things going on psychologically sociologically that cause
00:06:46.960 this problem blindness what's going on there yeah problem blindness means oftentimes we can't see
00:06:54.660 the problems around us or or even if we can see them we code them as if they're just inevitable
00:07:01.680 like let me give you an example there was a guy named marcus elliott who was a medical doctor that
00:07:07.340 was interested in sports and back in 1999 he was hired by the patriots the football team
00:07:12.860 to join their staff they'd been plagued by hamstring injuries especially from some of their skill
00:07:18.500 players a lot of wide receivers out they'd had 22 major hamstring injuries the season before and it was just
00:07:23.880 really wreaking havoc with their performance so marcus elliott comes in and in the mental model at
00:07:29.920 that time in pro sports especially football was you know look this is a dangerous game it's a violent
00:07:36.420 game people are going to get hurt that's just the way it is of course but marcus elliott had a very
00:07:42.260 different philosophy his point of view was most of the injuries that happen in pro football are actually
00:07:47.760 the result of subpar training inadequate training and so he started this brand new regimen at
00:07:53.800 the patriots where before it was like a one-size-fits-all program it's like people that were in radically
00:08:01.200 different positions you know nose tackle and wide receiver were getting generically speaking the same
00:08:07.060 kind of training they're trying to get stronger they're lifting weights and so marcus elliott starts
00:08:11.760 doing this this individualized training where first he assesses them on a variety of things and
00:08:16.720 and he's looking in particular for muscle imbalances because what often creates an injury is is when for
00:08:22.500 instance your right hamstring is significantly stronger than your left and that can show up in
00:08:26.840 ways that that end in injury and so he starts doing these kind of one-off training programs to make sure
00:08:32.140 there was balance among muscle groups that they were preparing for the kinds of of skilled maneuvers
00:08:38.160 they'd be doing during the game and the proof was in the pudding the season after he had begun his
00:08:43.100 regimen there were three hamstring injuries versus 22 and so all of a sudden marcus elliott is making
00:08:50.180 believers out of people it's like the attitude before well injuries are just part of the game they
00:08:55.520 always will be hey it's a violent game of course that happens that's problem blindness that means we may
00:09:01.340 be aware certainly we're aware when athletes get hurt but we just assume there's nothing we can do about
00:09:06.880 that and it takes in these situations someone like marcus elliott to come along and say hey wait a minute
00:09:13.180 what we're coding as inevitable what we're coding as natural is neither like we can do something about
00:09:19.880 this we can fix that problem and that's how we overcome problem blindness but how was he able to see that
00:09:25.820 there was a problem like what was it about marcus that made him different i think with people like marcus what
00:09:31.880 happens is they have an understanding of the problem that's deeper than most people i mean
00:09:38.080 keep in mind he's a medical doctor it's not often you find medical doctors who end up as trainers on
00:09:42.980 on pro teams and so you know when i'm talking to him on the phone he he no longer works with the
00:09:48.280 patriots he has his own sports training outfit and he does this this just incredibly obsessive analysis
00:09:56.080 of pro athletes i would almost describe it like an mri for the way pro athletes move i mean they'll
00:10:03.560 train multiple cameras and all these diagnostics on on nba athletes and watch how they pivot and jump
00:10:10.200 and land and and some of the stuff he was telling me i i literally could not even understand i just don't
00:10:15.240 have enough knowledge of physiology but he can get down to the granular detail of you know look at the way
00:10:21.120 you landed after that rebound and look at the tension that that's running across your knee and
00:10:26.280 based on our diagnostics people who have the kind of tension that you're experiencing right now
00:10:30.960 almost always have a knee injury within the next season or two and so when you're that close to a
00:10:38.100 problem you start spotting leverage points you start spotting opportunities for change and i think
00:10:44.400 that's what allows them to see that there's hope another issue you say that keeps people from
00:10:49.540 actually solving these problems upstream is that there's a lack of
00:10:53.380 ownership that there's no one held accountable for solving those problems why why is that a lot of
00:10:59.220 times it has to do with with silos that develop in organizations i'm sure everybody that works in
00:11:04.700 business knows exactly what i'm talking about like here's an example from expedia the online travel
00:11:10.380 site so back in in 2012 this guy named ryan o'neill who worked in the customer experience unit he was
00:11:17.940 looking at some data and and he discovered something that kind of blew his mind and that
00:11:22.620 was that at that time for every hundred people who booked you know a flight or a hotel or a car
00:11:29.800 on expedia 58 of them end up calling the call center for support 58 out of 100 now the whole point of an
00:11:38.200 online travel site is presumably self-service and yet not the vast majority but the majority of
00:11:44.220 customers using the site ended up needing intervention so he's like what what in the
00:11:48.240 world is going on here so he starts digging into the data figuring out why are customers calling us
00:11:52.680 the number one reason customers were calling was to get a copy of their itinerary 20 million calls
00:12:01.480 were logged in 2012 alone for people trying to get a copy of their itinerary and so this is kind of
00:12:06.380 one of those forehead slapping moments where you're like how could this have happened why was there not
00:12:11.360 an alarm going off when we logged like our eight millionth call for a copy of the itinerary and so
00:12:17.860 the ceo at that time worked with ryan o'neill to create a war room where they started analyzing this
00:12:23.500 question and figuring out hey we need to change things we need to start asking a different question
00:12:27.660 how do we keep people from needing to call us and the fixes were the easiest thing in the world right i
00:12:32.680 mean you give customers tools to get their own itinerary you change the way you send out the email so
00:12:37.280 they don't end up in spam filters and on and on and on the solution was not the hard part what made
00:12:42.400 this hard was that expedia was organized in a way where it was in everyone's interest to ignore or
00:12:50.420 neglect this problem so you think about the silos there's a marketing team whose job it is to try to
00:12:56.520 attract people to come to expedia instead of one of the other travel sites they get measured on you know
00:13:01.780 numbers of people coming and then you've got a product team whose job it is to make the site so
00:13:06.800 easy to use that you're just constantly pushing people toward a transaction and so they're measured
00:13:11.560 on can we get transactions closed and then you've got a web team that's measured on uptime and and speed
00:13:17.460 and then you've got the customer call center and they're measured on what how quickly can i get
00:13:22.500 somebody off the phone and how satisfied are they with the resolution so all those goals make kind of
00:13:28.780 superficial sense when you hear them but then you realize something it's literally no one's job
00:13:34.140 to stop a customer from needing to call nobody i mean it's even worse than that no one would even get
00:13:40.280 a gold star if that happened it's not on anybody's scoreboard and so this is the kind of thing that
00:13:45.340 happens in organizations where because we're constantly pushing for efficiencies and specialization
00:13:51.460 because we want to wring more productivity out of the process we start missing what might be
00:13:58.180 major major problems because they just transcend the gaps between silos and once expedia caught on to
00:14:05.100 that and once they decided to push their way upstream it turned out the solutions were actually very very
00:14:09.880 simple and those 20 million calls just vanished so another barrier is what you call tunneling what do
00:14:16.700 you mean by by that there's a great research study by this um this woman named anita tucker who
00:14:22.240 for her dissertation at harvard she followed nurses around for hundreds of hours just shadowed them to
00:14:28.080 see what their life was like and what she found was they're solving problems constantly you know they
00:14:33.700 they go to get a towel for a patient there's no more towel so they have to figure out where to get a
00:14:37.840 towel they ask for some medication from the pharmacy and they get the wrong medicine or the wrong dose or
00:14:42.820 the pharmacy's out or some equipment breaks anita tucker talks about this one day that a woman was
00:14:48.420 trying to discharge a mother who had just given birth and in the discharge process they realized the baby
00:14:54.580 doesn't have the security anklet that goes you know around its ankle they're intended to keep
00:15:00.140 children from being abducted and and so when it's missing it's a big deal they hunt around for it
00:15:04.460 it turns up in the baby's bassinet so great they can get the mother checked out then three hours later
00:15:10.400 the same exact thing happens with a different baby you know again missing an anklet this time they do
00:15:15.560 a frantic search can't find it so they had to create another way to check the mother out and honor
00:15:20.580 security and so that was that and so this portrait that anita tucker is painting is nurses are
00:15:28.500 responsive they are improvisational they're resourceful they don't go running to the boss every
00:15:35.320 time something goes wrong they can kind of own things and work around problems and when you think about
00:15:39.860 it like that it's pretty pretty inspirational it's a great portrait but then from another perspective
00:15:44.980 you look at the situation and you go this is the description of a system that will never improve
00:15:52.220 a system that never learns right because if you are constantly working around problems and you're
00:15:59.580 never solving those problems at the systems level you know why are we running out of towels why are
00:16:04.620 these anklets slipping off baby's ankles you're dooming yourself to solving those problems forever
00:16:10.960 now to be clear this is not a nurse thing i'm not picking on nurses i think anita tucker could have
00:16:16.740 shadowed probably any profession and found exactly the same thing and the phenomenon she discovered
00:16:22.540 i think is well described by some psychologist who wrote a book called scarcity and they call this
00:16:28.680 tunneling and they say that tunneling happens when we have a scarcity of of time or resources to combat
00:16:36.540 the problems we're facing and when we have that scarcity it's almost like we we give up thinking
00:16:42.640 that we can solve all the problems on our plate and we may even give up trying to prioritize them
00:16:48.160 and it just becomes this experience where we feel like we're in a tunnel you know just conjure up that
00:16:53.520 that mental image and in a tunnel all you're thinking about is god how do i get forward if
00:17:00.360 there's something blocking my way i want to get it behind me as quick as possible i just have to keep
00:17:05.040 moving you know those nurses okay i'm out of towels i don't have time to do like root cause analysis on
00:17:11.100 why there's no towels i just i got 10 patients clamoring for my attention what am i going to do i got
00:17:15.180 to go steal a towel from the unit down the floor right and and that makes sense and that's familiar
00:17:21.040 behavior and i suspect all of us can empathize with tunneling but we just have to realize that
00:17:25.600 it's a trap that if we're stuck in the tunnel we stop asking the really important questions like
00:17:30.960 are we even going the right way you know are we headed to the destination might there be an entirely
00:17:36.780 different tunnel that would get us there faster or better so this is one of the key traps that i
00:17:41.900 think keeps us downstream tunneling well i think you talk about this in the book you know one problem
00:17:47.320 that we we spend a lot of time money and resources trying to solve downstream is poverty and poverty
00:17:52.740 often causes tunneling people who are in poverty they have scarcity of time resources bandwidth
00:17:58.940 and so they're just trying to put out all the fires they don't take the time they don't have the
00:18:02.880 ability to take the time out and they go okay what can i do to not have these problems in the first
00:18:06.920 place exactly right like if you look at something like payday loans which of course we all know are
00:18:12.940 just notoriously expensive and the aprs can be hundreds of percentage points a year and it can be a
00:18:20.100 real trap but but from the perspective of tunneling it makes sense like if you've got more problems than
00:18:26.140 you can handle if you've got a sick kid and you're working two jobs and you're already on thin ice and
00:18:31.200 you can't afford to miss any more days and and child care isn't easy for you and nutrition isn't easy
00:18:37.560 and then your car breaks down and it's like to keep your life in order i mean doesn't it make sense
00:18:44.220 that you would just walk down the street to the payday loan place and get enough money to fix your
00:18:48.460 car so you can get to work that day and not get fired it's like we come from from outside with this
00:18:53.740 attitude of kind of pristine financial advice thinking and oh well you know that that's not wise
00:19:00.680 to make a decision where you have an apr that's so high but but if you're in the tunnel it looks like a
00:19:05.500 solution and i think that's why you know to make some of these upstream solutions possible we first
00:19:11.220 have to find ways to escape the tunnel yeah you got to provide some slack in your life exactly and i
00:19:16.840 think um i mean it's easy to say in the case of poverty i don't think i have the answers i wish
00:19:22.040 someone did to figure out how do we get more slack in the system in organizations i think it's
00:19:27.620 slightly more practical though still difficult like in that nurse situation you know the way i portrayed it
00:19:33.820 seem like well they're going to be stuck in the tunnel forever but i think there are surprisingly
00:19:38.580 easy ways to at least provide an escape a temporary escape like there's a bunch of health systems that
00:19:45.060 have adopted what are called safety huddles where you know every day usually in the morning they have
00:19:50.560 an all hands meeting it might be very quick 15 20 minutes where they talk about safety near misses from
00:19:58.420 the day before you know things that went wrong or almost went wrong you know maybe the wrong
00:20:03.540 medication almost delivered to a patient caught at the last second and they talk about hey how do we
00:20:08.220 keep those kinds of things from happening in the future and that's systems thinking and they look at
00:20:12.600 the day ahead of them they say is there anything really different or complex that's going to be
00:20:16.820 happening today that we should watch out for and what i love about that is that's a kind of
00:20:22.400 structured way to get out of the tunnel right that would have been the perfect opportunity for that
00:20:27.220 nurse dealing with the missing security anklets on babies to say hey this weird thing happened
00:20:32.660 yesterday twice with the anklets they're falling off and i swear we're putting them on tight like we
00:20:37.900 need to figure out what's going on here we need to deputize someone to take this over and so so even a
00:20:42.980 temporary escape from the tunnel i think can be really powerful one thing i've done personally in my
00:20:48.260 life my wife and i have done is i'm sure we all have had those experiences where
00:20:51.560 we're just getting overwhelmed everything's kind of piling up because you got sick kids got sick stuff
00:20:56.120 happened at work and you're just like i can't get this done so we have this thing called a reset day
00:21:00.520 where we just take a day off during the work week so the kids are at school we plow through all those
00:21:05.900 things have been building up and then we also take time to figure out how can we prevent this from
00:21:09.400 happening again and we do it i don't know it's just you do it when you feel like you need it but i've found
00:21:14.180 that to be like a way to provide slack and taking that day off pays dividends
00:21:19.300 down the road i love that idea and it's the perfect illustration of how slack can be the antidote
00:21:25.740 to tunneling in researching this book you know probably we're going to talk a
00:21:31.920 lot about big social issues but i've also been fascinated by how upstream thinking can make
00:21:38.000 a difference in our personal lives like i talked to this guy you know all couples have these
00:21:42.620 recurring things that they bicker about uh you know you left the toilet seat up again and that sort of
00:21:47.820 thing so his thing with his wife was the hallway light and he was always going in and out usually
00:21:53.800 to take the dog out and so he'd flip on the hallway light and he'd come back in and he'd forget to turn
00:21:58.860 it off and that just irritated his wife so it became like this little thing that they bickered about
00:22:03.560 and one day all of a sudden he realizes this thing we've been fighting about that's like a recurring
00:22:08.400 irritant in our relationship i can solve this like i can make this go away forever and so he files for
00:22:16.680 divorce i'm totally kidding he didn't file for divorce no it was a much simpler fix he went to
00:22:23.640 home depot and he bought what's called a light timer which is just like a different plate that goes over
00:22:29.040 your light switch and you can press a button that says five minutes the light will turn on for five
00:22:33.620 minutes and then it'll auto turn itself off and and i just love stories like that because how many
00:22:40.120 places in our life have we just learned to adapt to something we've adapted to a problem we've just
00:22:47.300 come to accept that well we're forever going to be bickering about that one thing when i mean for god's
00:22:53.340 sake it took one trip to home depot and a ten dollar light plate and now this irritant has disappeared from
00:22:59.760 this couple's existence forever i mean that's that's upstream that actually inspired me so one of the
00:23:05.320 irritants we have in our family is with ipads so we have like you know ipad tablets for
00:23:10.280 each of our kids and the problem is charging them like they're always running out it's like one percent
00:23:14.560 and like you only have one cable and we'd always like plug it in the computer and they'd always like
00:23:19.440 bicker about who gets it and i was like i'm tired of this it's like almost every other night and so i just
00:23:23.460 decided to buy a hub for usb chargers and i'm gonna put it in there they're gonna be able to plug in
00:23:29.480 their ipads at night both of them and not gonna have that issue anymore and i'm looking forward to it i love that
00:23:34.340 i love it and then and then you're kicking yourself like why didn't i do this like three
00:23:39.200 years ago like my favorite example i love this one because it's just so wonderfully trivial so this
00:23:46.700 woman was talking about she got transferred to a different department at work and she had to move
00:23:51.820 desks and her desk was right by one of those heavy doors that goes into a stairwell and and every time
00:23:59.880 that door was open from somebody coming up the stairs it had this just horrible squeak and it
00:24:05.480 just you know drove her crazy and so she puts up with this for like two days and then on the third
00:24:12.100 day she brings in like a can of wd-40 and just lubes up the hinges the squeak disappears and people on the
00:24:19.740 floor treat her like she is a miracle worker they are like astonished that she solved this problem
00:24:24.960 that they've probably been living with for months or for years and and that's what i mean about our
00:24:30.740 capacity to adapt to situations can almost be a curse right because we we come to just accept that
00:24:38.120 certain problems are part of our world that that really don't need to be well these three things
00:24:43.400 happened there so she saw a problem she took ownership of the problem when she didn't have to
00:24:47.480 like she had to volunteer for it like it wasn't assigned to her and then she got out of the
00:24:52.140 tunnel she did something about it exactly right and your ownership point is the one that i
00:24:58.200 really want to call out because this is a very very strange thing about upstream versus downstream
00:25:03.320 problem solving so with downstream problems you know you can almost always pinpoint when something goes
00:25:11.060 wrong whose job is it to fix you know a house catches on fire you know of course it's going to be the fire
00:25:17.200 department's going to fix that you know or or at the ymca pool someone's thrashing it's going to be
00:25:22.500 the lifeguard's job to fix that but when you flip it around and you say whose job is it to prevent
00:25:29.040 fires from happening all of a sudden the ownership gets very diffuse right well the homeowners have some
00:25:36.320 responsibility and the builders have some and the people who write building codes have some and
00:25:41.300 the fire department in the sense of education has some and all of a sudden when a problem doesn't
00:25:47.980 have an owner the chances are it's not going to get fixed and so this leads to a very odd conclusion which
00:25:53.680 is even though downstream activity is obligated and almost mandatory like of course if there's a
00:26:00.980 problem we're going to fix it upstream activity even though the stakes can be enormous is often voluntary
00:26:08.360 i mean it's it's often chosen rather than demanded so upstream activity starts when somebody somewhere
00:26:16.020 says i didn't create this problem but damned if i'm not going to be the one who fixes it and sometimes
00:26:22.620 that's in trivial things like being the person who brings in the wd-40 to work sometimes it can be huge
00:26:27.980 things like i tell some stories in the book about homelessness and substance abuse and others where
00:26:32.860 it was a group of people who voluntarily put on their shoulders the burden of solving that
00:26:38.180 problem we're going to take a quick break for your word from our sponsors and now back to the show
00:26:43.260 well this leads nicely to my next question so we talked about the barriers to solving
00:26:49.260 problems upstream once you get past those barriers you have like seven questions that people
00:26:55.700 should ask themselves as they're trying to fine-tune what the problem is and work on a
00:26:59.660 solution for that problem and the first question you found that's useful is how do you unite the right
00:27:05.020 people because as you just said upstream problems the responsibility is diffuse it might be a whole
00:27:09.660 bunch of different people who are responsible that can solve that problem so how do you find
00:27:15.180 those people who can actually help solve the problem before it happens this was one theme that i
00:27:21.700 noticed again and again in very different looking kinds of problems is that people learn to do what
00:27:27.200 i call surrounding the problem so i'll give you an example from there's a city called rockford in
00:27:32.820 illinois it's actually the second biggest city behind chicago and they had a problem as do many
00:27:39.580 cities this is a kind of a former factory town that had fallen on hard times after the great
00:27:45.060 recession and so they had a homelessness problem there and the mayor was a gentleman named larry morrissey
00:27:51.180 he was in his third term he'd been working on homelessness for nine years and by his own
00:27:56.220 account he said they'd made no progress and maybe even the problem had gotten worse so he'd become a bit
00:28:02.340 jaded about the issue of homelessness and around that time one of his colleagues challenges him to
00:28:08.120 take what was called the mayor's challenge which was an initiative sponsored by the federal
00:28:12.680 government to encourage communities to try to end veteran homelessness in their cities and and so
00:28:18.020 morrissey is kind of skeptical he's like what's going to change we've been doing this for nine years
00:28:21.780 we've gotten nowhere he agrees reluctantly to take this challenge and about 10 months later
00:28:28.580 rockford becomes the first city in the united states to eliminate the problem of veteran homelessness
00:28:35.420 and so you just look at kind of the the bookends of that story and you go what in the world happened
00:28:40.980 in 10 months that didn't happen in nine years and it has to do with basically two things they
00:28:47.040 changed a bunch of their strategic and systems work which we can unpack later but to me i think
00:28:53.000 the fundamental thing that they did was they changed the way they collaborated so the first thing they
00:28:58.840 did was you know back to that example of whose job is it to keep fires from happening the question here is
00:29:05.300 whose job is it to keep homeless people from being on the streets and you could make a case for about a dozen
00:29:10.600 different parties the va has part of the ownership and the police department the health care
00:29:18.160 system the homeless shelters social services and so it's one of these situations you know back to the
00:29:24.100 expedia example where things are heavily siloed there's lots of gaps between organizations so in
00:29:28.940 rockford they start getting together you know meeting as a group with all those constituents i just
00:29:34.100 described and others they meet around the same table and so that's part one of the story is you got
00:29:39.080 the right group of people together people with all the different facets of the problem and then the second
00:29:44.740 thing is they changed how those parties collaborated so one thing that had plagued homelessness up to
00:29:51.580 that point is that they lacked useful data like the federal government requires cities to do what's
00:29:58.520 called a point in time census where you kind of go out one night every year and you do your best
00:30:03.880 to count all the homeless people and then you wait until the next year and you do another one and they
00:30:08.120 realize that's just totally inadequate for trying to manage a problem in real time and so what they
00:30:14.620 created on their own you know back to that notion of upstream work being voluntary they created a
00:30:19.660 by name list of every person in the community who's homeless i mean literally i saw it it's a google doc
00:30:27.100 and you go down and you're like well there's steve and there's michael and there's david and and for
00:30:31.400 every person there's a description of what their situation is and how's their health and roughly how old
00:30:36.160 they are and who's talking to them and so these meetings among the different organizations who are
00:30:42.780 surrounding homelessness their meetings would be conducted name by name you know instead of talking
00:30:50.160 theoretically about hey what can we do about this horrible systemic problem of homelessness no no it
00:30:55.400 was steve okay who's seen steve in the last week where is he well he's still got his tent set up under
00:31:00.500 the bridge but he's been coming into the homeless shelter a bunch to eat lunch we want to let steve know
00:31:05.600 that we have housing for him when he's ready to move in you know we're ready to get him off the street so
00:31:10.320 who's going to talk to steve this week and it became practical it became human it became tangible
00:31:16.960 and it was also easy to score victories because you start seeing hey last week steve was on the street
00:31:22.860 this week steve is in supported housing and larry morrissey the mayor said that that kind of approach
00:31:29.900 was transformative that these meetings used to be bitch sessions he said and now it was
00:31:36.120 like they had a goal they had a tangible mission and so that's how you can spin your wheels for nine
00:31:43.800 years on homelessness and accomplish nothing and then in 10 months you can become the first city on record
00:31:48.460 to end veteran homelessness and i think that's such a powerful illustration of how we may have more power
00:31:56.540 than we're even aware of that just by changing the way we collaborate and who collaborates and how we
00:32:02.760 measure our progress we can make a big difference in problems that we might have thought intractable
00:32:07.860 so the next question you think is useful to ask when you're trying to solve these
00:32:11.960 problems upstream is how to change the system and that's a big question because systems like
00:32:17.300 that can be hard to change they're embedded in bureaucracy they're calcified with tradition
00:32:22.420 so it can seem like it's impossible to change the system and i think that question can
00:32:29.700 lead people to be like well there's nothing we can do right there's that problem blindness this is how it
00:32:33.880 always is so what does that look like in action when people ask how to change the system yeah changing
00:32:39.300 the system can be a very big deal for the reasons that you said you know systems can be big
00:32:44.620 they can be bureaucratic they can be slow to change but i think the saving grace is that small changes in
00:32:51.620 big systems yield big changes so that's why they're worth fighting for like one of my favorite stories in
00:32:58.720 the book is about this guy named darshak sanghavi who is working in the health care system in the
00:33:04.400 federal government and his job is to look for prevention programs that deserve the support of
00:33:12.140 medicare and medicaid that you know could receive funding from those two and so he's scanning and he comes
00:33:17.400 across this program called the diabetes prevention program the dpp which is a well-known program in
00:33:23.360 health care it's been proven again and again to stop some people that are at risk of developing
00:33:30.760 diabetes from developing it so that's a big deal because you know diabetes is a chronic disease it's
00:33:36.560 very expensive to treat it causes a lot of harm for the individual and so to get medicare
00:33:43.380 and medicaid funding for dpp sanghavi has to establish two things number one that this program makes a
00:33:49.280 difference in people's health and so check there's a ton of evidence for that second thing he has to
00:33:53.920 prove is that the program will save the government money well it seems like there's a slam dunk case
00:34:00.060 there too because if you can stop someone from developing a chronic disease chronic diseases cost
00:34:05.600 a ton to treat surely you stand to save a bundle so sanghavi you know puts his evidence together he thinks
00:34:12.580 this is going to be my first real widespread victory in this role he takes it to the government
00:34:18.360 actuaries who are the people who can certify it as a cost saving program it's the last step before
00:34:23.740 expansion and the actuaries say no we can't certify this as cost saving and the reason is
00:34:31.400 that this dpp program is extending people's lives and when you extend people's lives their health care
00:34:39.080 costs more so just sit with that logic for a second and think about that that is not a sick joke
00:34:45.920 i'm making that was the actual official logic of the federal government which is of course the biggest
00:34:51.020 payer in our health care system and so sanghavi is just sitting there just stunned like really
00:34:56.940 the success of this program is going to be the force that brings it down and so he and his boss a guy
00:35:05.040 named patrick conway write an appeal to the chief actuary and then this is my favorite part of the story
00:35:11.680 this is in late 2015 and just three days before christmas something remarkable happens one of the
00:35:18.400 actuaries who reports to the chief actuary a guy who's on the cusp of retirement sends
00:35:23.860 this memo and in the memo he makes this impassioned case that this ruling is just wrong it's morally
00:35:32.580 wrong and in the memo he envisions what would happen if the media got wind of this and the kind of
00:35:38.180 headlines that they would write you know medicare saves seniors die and and the press storm that
00:35:46.900 would come out of this but but fundamentally the case he's making is that we should not penalize
00:35:53.880 programs that help people live longer that is an abuse of what actuaries stand for and he says
00:35:59.780 you know actuaries have a special responsibility because while doctors at their worst might only harm a few
00:36:06.540 people actuaries at their worst could harm millions and he says calculators should play a role in
00:36:13.340 determining how much we should reimburse hospitals and doctors but calculators should not play a role
00:36:18.720 in determining how long people should be allowed to live you can almost hear like the trumpets playing
00:36:25.080 in the background you know with this with this memo and justice prevails in this situation the chief
00:36:31.000 actuary reverses the decision dpp gets funded there is now a change in a federal register somewhere that
00:36:38.300 says a prevention program's ability to extend lives cannot be held against it in computing its costs
00:36:46.200 and so back to this idea of systems change that's what systems change looks like i mean it's boring
00:36:53.880 on the merits i mean if i showed you the wording of that legal clause in an actuarial guidebook somewhere
00:37:01.680 i mean you would yawn it's not satisfying like a rescue is or a gunfight or a life-saving
00:37:09.320 expedition but boy does it matter i mean that one little tweak to the federal rule book is going to
00:37:16.380 save or extend thousands of lives there are people that will never develop diabetes who otherwise would
00:37:23.240 have in a world where that one little cluster of sentences hadn't been changed and so that's why
00:37:28.940 systems change is worth fighting for even though it can be quite difficult because it can make all the
00:37:34.120 difference upstream right it was a small change it was difficult to get through and i think the next
00:37:39.360 question you think is useful to ask is like look what's the leverage point and in this situation
00:37:43.500 it was a small change like this was like the leverage point this was the point where change could happen
00:37:47.220 and so the question what's the leverage point can help you find out what's one thing you can change in the
00:37:52.680 system that will have all these cascading effects downstream yeah that's the thing is when
00:37:58.920 you're dealing with some big problem it can be paralyzing like where do you start if you
00:38:04.700 got interested in hunger in your community my god what would you do in the first week
00:38:09.960 and i think the best advice that i've gotten from the people that i've studied is get closer to
00:38:15.780 the problem to immerse yourself in it if there's any people listening who are kind of
00:38:20.760 operations gurus lean or six sigma you know this phrase go to the gemba you know get close
00:38:27.020 to where the work is happening so you can observe the problem and i'll give you an example from a
00:38:31.840 really big issue in chicago you know there have been recurring crime waves in chicago and during
00:38:39.060 one of those some academics formed what's called the university of chicago crime lab which is
00:38:44.740 this it's almost like a bridge between academia and police and government practice they were trying to
00:38:50.660 find you know evidence-based ways to reduce crime and at this time there had been just an absolute
00:38:56.860 wave of youth homicide so lots and lots of young men were being killed and the lore at that time
00:39:04.700 was it's gang activity you know it's gangs they're fighting the you know struggling over turf people are
00:39:11.960 getting killed well these academics said you know we don't know much about gang activity but we do know
00:39:17.840 how to study problems and so what they did was they went to the medical examiner's office and they
00:39:23.900 asked to examine the records and reports for the last 200 young men who had been killed and they went
00:39:31.180 through those reports and what they found was a very different picture yes there were some deaths
00:39:36.400 related to gang violence but what they found more commonly was a situation like this and this is
00:39:42.740 essentially a streamlined version of a real case that a couple of groups of young men got into an
00:39:49.860 argument on the street and one of the groups was arguing that a guy from the other group had
00:39:55.200 stolen one of their bikes and it started as an argument and it escalated and one of the guys
00:40:02.540 started to walk away and the other group took offense to that found it disrespectful and shot the
00:40:08.100 guy in the back and so what you found when you got close to the problem was that
00:40:16.740 the kind of dumb arguments that teenage boys all over the world get into were escalating to gun
00:40:23.560 violence and so harold pollack was one of the academics involved and he was studying these reports
00:40:28.580 he said you know at university of chicago we have to have equations and so our equation after reading all
00:40:34.140 these reports was young guys plus impulsivity plus alcohol plus guns equals a dead body and so you
00:40:42.220 think about that just think about that let's zoom out of this situation and think about what
00:40:46.600 they're doing they go you know without bias to just study and look close at what happened in these
00:40:54.660 situations that went awry and then they come out with this kind of equation which admittedly is
00:41:00.920 simplified but what it says is those are all independent leverage points right you could try
00:41:07.180 to intervene to reduce access to alcohol you could try to intervene to reduce access to guns you could
00:41:13.020 try to combat impulsivity somehow those are all independent ways to try to get some progress on this
00:41:18.560 problem and in this case they chose the leverage point of impulsivity and we can talk more about the
00:41:25.760 solution they ended up funding if you like but i think the important thing for our perspective
00:41:31.220 is their instinct to get closer to the problem was what unlocked the potential leverage points that
00:41:37.320 could be used to find a solution well yeah we can talk about that because i thought it was really interesting
00:41:40.680 they decided to do the impulsivity thing and basically one guy developed this after
00:41:46.320 school program for boys where they learned how to control their emotions it's a fascinating program it's
00:41:52.760 called becoming a man bam that's what they call it for short and it was invented by this guy named
00:41:57.780 tony d who had kind of a rough upbringing but found himself as a young adult discovered that he loved
00:42:05.800 psychology and that he wanted to help young men grow up with male mentors and to teach them
00:42:12.860 how to be a man and how to live with integrity and so he figures out this program which is just
00:42:19.120 utterly unique it's like imagine 10 or 12 16 year old boys in high school who are brought
00:42:27.480 together they put their chairs in a circle and they start these sessions with a check-in
00:42:32.800 which is just sort of like what's on your mind how are you feeling today how are you feeling
00:42:37.040 psychologically physically spiritually and at first as you can well imagine i mean these kids are
00:42:44.500 it's like crickets no 16 year old boy is just gonna make themselves vulnerable in a
00:42:49.640 situation like that and so for the first session or two tony d has to basically claw it out of them
00:42:55.380 will you at least tell me whether you're mad sad or glad today but eventually they come to trust each
00:43:01.560 other and they come to open up and what tony d does with them is like a combination of a support
00:43:08.260 group and tough love and male mentorship but there's also an important part of the program that's
00:43:15.720 about self-control where tony d contrasts warrior energy with savage energy and it's
00:43:24.840 fundamentally about how anger is a natural state i mean especially for a teenage male anger is going to
00:43:32.160 happen but it's a question of how do you use anger do you let it be a destructive force
00:43:36.980 or a constructive one can you be the person when that argument breaks out in the street about the
00:43:42.400 stolen bicycle can you be the one that takes three seconds to just reflect you know how badly could
00:43:48.360 this go and is that what i want do i want to live with the consequences of escalating this
00:43:53.620 and so tony d had created this program called bam this kind of fascinating program and when the crime lab
00:44:01.000 people found out about it they said ha like what if what tony d is doing is operating on this leverage
00:44:09.040 point of impulsivity in other words what if his program is teaching people to rethink the
00:44:16.360 instinct to escalate to get violent and so they end up funding the bam program they do a randomized
00:44:22.560 control trial to test whether it really works and the results come back and they kind of astonish
00:44:27.780 everyone like it takes almost a year for them to finish crunching the data and they've had to get
00:44:34.240 the police department involved because they're trying to cross reference arrest rates and so forth
00:44:38.080 and they have this unveiling and and harold pollack the guy i mentioned earlier who had examined the medical
00:44:43.520 examiner reports he tells the people involved among the students who participated in this bam program
00:44:49.640 arrests were down 28 percent versus the control group and violent crime arrests were almost cut
00:44:55.880 in half and everybody's jaws dropped and pollack said it was one of the greatest moments of his
00:45:02.280 entire career and that's the kind of thing that can happen with a problem that is as
00:45:10.260 seemingly complex and unsolvable as a crime wave in chicago they took it apart they got close to it
00:45:19.080 they found a potential leverage point they found a program that would act on that leverage point and in this case
00:45:25.020 it worked and in this case they were able to measure success but one tricky thing about solving problems
00:45:31.080 upstream is sometimes it can be hard to measure right sometimes like how do you measure a problem
00:45:36.020 that didn't occur right i mean you can say well it could have been worse it's like well could it have
00:45:40.880 been worse i don't know you have nothing to judge it against exactly right and if you go back to
00:45:46.260 at the very beginning i was talking about the two police officers and one of them you know stayed in the
00:45:50.860 busy intersection and kept accidents from happening you ask yourself how do you prove as you said when
00:46:03.040 something doesn't happen you know maybe that morning the police officer's presence saved a guy who was
00:46:03.040 commuting to his job downtown he would have been killed in an accident that morning had it not been
00:46:08.320 for her presence but he'll never know and she'll never know and so the only thing that you can rely on in
00:46:14.920 a situation like that is data you know you keep logs of the accidents that happen before and after you
00:46:21.240 positioned a police officer there and if you do your proper statistical analysis and the number of
00:46:27.060 accidents go down maybe you can attribute that to your work i mean that's what upstream success
00:46:31.660 looks like it's like numbers moving on a page it's not as tangible and it's not as satisfying
00:46:37.680 as fishing a kid out of the river but then having said that the data is the scoreboard for this
00:46:44.420 kind of work it also opens up a whole can of worms like i think about it this way imagine that
00:46:50.360 in whatever town or city you live in the chief starts touting that crime has gone down 20 percent
00:46:57.200 you know over the past few years and that on the surface is a huge victory everybody should be
00:47:02.760 happy about that and that's an example of measuring upstream progress with data crime did not
00:47:08.740 happen okay so hooray but then you start having to poke at that data in different ways
00:47:14.240 for instance what if you found out that the crime had gone down by 20 percent everywhere in the u.s
00:47:20.320 now how would you feel about the police chief's genius and approaches what if you found out that
00:47:26.240 the crime was down 30 percent nationally and 20 percent locally now how would you feel about the
00:47:31.960 intervention success or failure what if you found out that the chief had been so vigorous and
00:47:38.620 aggressive about pushing for you know a reduction in crime that you discovered that officers at the
00:47:44.300 street level were kind of burying crimes like looking the other way in certain situations or doing
00:47:50.020 what's called downgrading where very serious crimes like rape are often you know scaled down to something
00:47:57.520 like sexual assault because nobody wants to have a rape number show up on their record and so all of a
00:48:04.620 sudden what looked like just plain and simple statistical proof that you've done a good thing
00:48:10.100 raises a lot of questions like could you have actually made things worse even as the numbers
00:48:16.560 suggested you were making things better and so that's a theme in the book is number one we're almost lost
00:48:25.140 if we don't have some kind of data to use to orient us and to guide us so data is essential but data is
00:48:31.840 also just a minefield of potential problems that we have to be aware of and we have to constantly
00:48:37.340 be trying to root out well the last question i want to talk about is you know when you start
00:48:42.460 messing with systems systems are complex so you can make one change in one place and like you can see
00:48:48.020 the change you want and maybe it has the desired effect but then it causes a problem somewhere else
00:48:54.500 so how do you ensure the changes you make aren't going to actually cause more problems for you
00:48:59.980 this is another one of those cautionary themes in the book because we're intervening in systems
00:49:07.240 we've got to be very very clear on what the ripple effects of our intervention are like here's an
00:49:13.240 example from new york city so about 10 years ago there was a young google engineer a young man he was
00:49:20.840 just walking through central park and this freak accident happens he was struck by a falling branch from
00:49:28.100 an oak tree and it caused brain injuries and paralysis it was just a horrible thing total fluke
00:49:34.000 injury except that a bit later the comptroller of new york city a guy named scott stringer
00:49:39.560 he starts analyzing the claims that the city is paying out like this engineer's lawsuit
00:49:46.040 eventually got settled for 11 and a half million dollars and what scott stringer realizes is
00:49:51.200 there's actually a whole rash of settlements coming from falling branches and he's like
00:49:56.720 what the hell's going on here and he keeps investigating and finds out that a couple of
00:50:03.100 years earlier the city's pruning budget had been cut in an effort to save money and so one of his
00:50:10.540 assistants david saltonstall said you know whatever money we thought we were saving on the pruning side
00:50:16.620 we were paying out and then some on the lawsuit side and so they start studying the city's claims
00:50:22.940 and they start finding these examples you know there's one playground in brooklyn that was
00:50:28.900 responsible for multiple lawsuits like five different kids had broken their leg on this playground sued the
00:50:33.820 city because a swing had been hung a little bit too low and that's an example of what we're talking
00:50:39.560 about where depending on how you frame the situation yes the parks department saved
00:50:46.620 some money by cutting the pruning budget that looked good for them for that piece of the system
00:50:50.860 but if you zoom out and look at the system as a whole that was a bad decision it cost the system as a
00:50:56.420 whole money because it number one cost more in lawsuit payouts leaving aside the obvious human
00:51:03.440 misery that came from the falling branches and i think a good caution is there's this systems
00:51:10.980 thinker named donella meadows who i quote a lot in the book and she said you know when you're thinking
00:51:17.000 about a system you've got to figure out a way that lets you see the system as a whole not just the part
00:51:23.980 that might have drawn your attention to begin with and i think that's a great caution a lot of times
00:51:29.820 we're coming into problems with some angle you know there's some part of the income statement in a
00:51:35.660 business there's some part of our scoreboard that we're operating against that's kind of provoked us
00:51:39.820 to action but we got to be careful what are the parts that are linked up to that and might making
00:51:45.420 problem a better actually make other problems worse in a way that brings down the system as a
00:51:50.700 whole and this is where having the right people like multiple different groups involved can help solve
00:51:56.040 that because you can see how this could affect different parts of the system that's a really
00:52:00.620 good point yeah exactly so you know the the parks department is making this budget decision unilaterally
00:52:06.500 but if they'd really thought it through and if they'd brought in some colleagues from other places maybe
00:52:10.460 they could have discovered that in advance maybe they didn't have to learn this the hard way
00:52:14.020 so let's say someone's listening to this and they're part of an organization it could be their
00:52:17.680 business non-profit church or whatever and there's a lot of like downstream problems and
00:52:23.520 they're like i want to make the sell that we're going to start solving these problems
00:52:27.400 upstream do you have any advice on how to make that sell because it can be a hard sell to take
00:52:32.680 time off to solve these problems when it might take a while to solve the problems
00:52:38.720 how do you think you can make the sell yeah it can be tough i mean this was
00:52:46.140 not intentional but there's kind of two meanings of the word upstream you know upstream in the sense of
00:52:50.020 preventing problems and then there's the sense of upstream as in swimming upstream and i think
00:52:55.260 it's no accident that the word means those two things it's hard to break out of this pattern
00:53:00.660 and i think maybe your best strategy is to do two things one we've already talked about which
00:53:07.060 is to get close to the problem and really be able to describe it up close you know to look at those
00:53:12.000 medical examiner reports or to get up close to the factory lines and look at where errors are
00:53:18.540 happening and then the second part that we haven't talked about is find a way to show what you've
00:53:24.660 discovered because i think showing is as our you know eighth grade english teacher always told us
00:53:30.540 showing is a lot better than telling and i'll give you an example i was working with dupont years ago
00:53:35.360 and one of the managers told me the story that they were trying to get one of their factories to
00:53:40.840 take waste more seriously and you know any factory these days after you know 50 years of
00:53:47.680 quality improvement is going to be pretty good just from the get-go and so we're talking about
00:53:51.880 you know round off error type waste probably one to five percent or something but it's still a problem
00:53:57.680 we're tackling but if you're somebody who works on the front line of a factory like how do
00:54:01.580 you get excited about going from three percent waste to half a percent or something and so there wasn't a lot
00:54:06.580 of motivation around this move to reduce waste and so the manager is trying to figure out like
00:54:12.180 how do i get people to want to do something different and he realized that he had to stop
00:54:17.660 talking about it at the process level and he had to be able to help them visualize the harm
00:54:24.580 and so one day they show up to work and they're expecting to you know get on the factory line and instead
00:54:28.780 he kind of ushers them all into a van and drives them to the landfill they used and there was one
00:54:37.260 particular section of the landfill that was kind of dupont's area and he brought them out there
00:54:43.460 and he showed them just this vast expanse of trash and waste and you know you see something like
00:54:51.160 that and it just hits you at a visceral level not an intellectual level just the horror of how much
00:54:57.340 stuff you have just dumped onto the earth because of your work and he said what you're seeing right
00:55:02.920 now is the reason why i want us to start taking waste more seriously he said all of a sudden it
00:55:09.140 made a night and day difference that people started getting with the program they started helping him
00:55:13.500 iterate the processes and i love that story because i think it's a good inspiration for the rest
00:55:20.620 of us that that as we get closer to the problem as we turn up those leverage points we don't want to
00:55:26.180 talk about those things we don't want to intellectualize about them we want to show
00:55:30.320 people what we found you know we want to show people that medical examiner report about the two
00:55:36.740 groups of kids that ended up in a gunfight over an argument over bikes because those are the things
00:55:41.680 that are going to motivate action when other people can see what we see well dan this has been a great
00:55:46.280 conversation where can people go to learn more about the book and your work if you're interested in
00:55:50.360 learning more about the book just go to upstreambook.com and it will have all the details you
00:55:55.680 could ever want fantastic well dan heath thanks for your time it's been a pleasure thank you enjoyed
00:55:59.640 it my guest today was dan heath he's the author of the book upstream the quest to solve problems
00:56:03.880 before they happen it's available on amazon.com and bookstores everywhere you can find out more
00:56:07.560 information about his book at his website upstreambook.com also check out our show notes at
00:56:11.640 aom.is/upstream where you can find links to resources where you can delve deeper into this topic
00:56:15.920 well that wraps up another edition of the aom podcast check out our website at
00:56:26.420 artofmanliness.com where you can find our podcast archives as well as thousands of articles that
00:56:30.040 we've written over the years about pretty much anything you can think of and if you'd like to
00:56:33.100 enjoy ad-free episodes of the aom podcast you can do so on stitcher premium head over to
00:56:36.340 stitcherpremium.com sign up use code manliness for a free month trial once you're signed up
00:56:40.420 download the stitcher app on android ios and you can start enjoying new episodes of the aom podcast
00:56:44.440 ad-free and if you haven't done so already i'd appreciate it if you take one minute to give us
00:56:47.660 a review on apple podcasts or stitcher it helps out a lot and if you've done that already thank you
00:56:51.440 please consider sharing the show with a friend or family member who you think would get something
00:56:54.540 out of it as always thank you for the continued support until next time it's brett mckay
00:56:57.800 reminding you to not only listen to the aom podcast but to put what you've heard into action