#209 ‒ Medical mistakes, patient safety, and the RaDonda Vaught case | Marty Makary, M.D., M.P.H.
Episode Stats
Length
1 hour and 45 minutes
Words per Minute
175.27
Summary
In this episode, Dr. Marty Makary talks with Dr. Peter Attia about the importance of patient safety in the medical field, including the recent case of a nurse who made a medical error that resulted in the death of a patient in late 2017.
Transcript
00:00:00.000
Hey, everyone. Welcome to The Drive podcast. I'm your host, Peter Attia. This podcast,
00:00:15.480
my website, and my weekly newsletter all focus on the goal of translating the science of longevity
00:00:19.800
into something accessible for everyone. Our goal is to provide the best content in health
00:00:24.600
and wellness, full stop. And we've assembled a great team of analysts to make this happen.
00:00:28.880
If you enjoy this podcast, we've created a membership program that brings you far more
00:00:33.280
in-depth content. If you want to take your knowledge of the space to the next level at
00:00:37.320
the end of this episode, I'll explain what those benefits are. Or if you want to learn more now,
00:00:41.720
head over to peterattiamd.com forward slash subscribe. Now, without further delay, here's
00:00:48.080
today's episode. My guest this week is Marty Makary. This name, of course, sounds familiar to you as
00:00:54.280
Marty has been on this podcast a number of times. Most recently, we've spent a couple of episodes
00:00:58.300
discussing COVID, which I can assure you barely comes up in this episode. By way of background,
00:01:02.340
though, Marty is a professor at Johns Hopkins, which is where we met many years ago. He's professor
00:01:06.360
of surgery and also a public health researcher. He's a graduate of the Harvard School of Public Health and
00:01:10.260
served on the faculty of Hopkins for the last 16 years. He's also served in the leadership at the
00:01:14.360
WHO. He's a member of the National Academy of Medicine and serves as the editor-in-chief of the
00:01:19.180
second largest trade publication in medicine called MedPage Today. He writes quite regularly for the
00:01:23.600
Washington Post, the New York Times, and the Wall Street Journal. He's also the author of two New
00:01:28.220
York Times bestselling books, Unaccountable and The Price We Pay. In this episode, we talk about
00:01:33.460
patient safety. But of course, the real impetus for this is the recent case of RaDonda Vaught. In case that
00:01:39.000
name doesn't sound familiar, RaDonda Vaught is a nurse, or a former nurse, at Vanderbilt Medical Center who
00:01:45.080
made a medical error that resulted in the death of a patient in late 2017. This case has garnered a lot
00:01:52.560
of attention lately for reasons we will get into. But the headline is that for probably
00:01:58.320
the first time, certainly to anybody's recollection, a mistake of this nature was prosecuted
00:02:04.160
criminally. And the implications of this are pretty significant. We talk about that a lot. But we really
00:02:08.300
begin the discussion by talking about the culture of patient safety. What is the risk to a patient when
00:02:12.860
they walk into the hospital? What are medical errors? How do they take place? And how big a problem
00:02:17.520
is it? We also talk about how much has changed in the last 20 years. And I think Marty and I were
00:02:22.360
pretty lucky to train in an era that actually witnessed the transformation or witnessed the
00:02:27.180
changing of the guard in terms of the attitude towards this. So one thing to note about this
00:02:31.700
podcast is that in an effort to get it out as quickly as possible, it's going to be an audio
00:02:35.460
only episode and the show notes will be relatively sparse. So without further delay, I hope you enjoy
00:02:40.460
my conversation about this very important topic with Marty Makary.
00:02:43.840
Hey, Marty. Awesome to be talking with you about this today. I'm kind of bummed that we can't do
00:02:53.780
these in video, but I guess that's the nature of your other life. But anyway, no one else gets to
00:02:57.880
see your beautiful face except me right now. Good to see you, Peter.
00:03:01.440
This is a topic you and I have been talking about privately for about two months now. I think we
00:03:07.260
decided that it was an important enough subject that we should actually bring it to the larger
00:03:11.260
sphere and talk about this publicly. And that's the issue of patient safety, something that's near
00:03:15.480
and dear to your heart. You've worked on this tirelessly for almost as long as I've known you.
00:03:21.120
And I think we met in 2003. 2002, actually. Yeah. And this has been something that you,
00:03:28.180
along with many of your colleagues, people who I knew like Peter Pronovost have also taken up the
00:03:33.080
mantle on. And when I think back to my medical training, Marty, when I think of the beginning
00:03:37.980
versus the end, it's a five-year stint, a lot of changes actually happened. Something as simple,
00:03:43.680
quite frankly, as a timeout was not something that existed before I entered my residency. So when I
00:03:49.600
was an intern, there was no such thing as a timeout, a surgical timeout in the operating room. We can
00:03:53.480
explain what that means to people. Yet by the time I left my residency, you couldn't do an operation
00:03:58.420
without a timeout. So clearly the culture of medical safety is something that the field of medicine
00:04:05.120
has been struggling with for a couple of decades. Can you give us just a little bit of a history of
00:04:10.260
that with more color than my sort of clumsy approach at it? When we were residents at that
00:04:16.200
time, it was around then that we entirely blamed the individual. I specifically remember
00:04:24.080
at one of the M&M conferences, I don't know if you- Tell people what M&M is. I think I've talked
00:04:28.600
about it in the past. It's very important for people to understand, especially in surgery,
00:04:32.040
what M&M is. So M&M stands for morbidity and mortality, and it's a weekly or in some smaller
00:04:38.420
hospitals, a monthly conference where things that go awry or any death on that department service
00:04:47.240
will be reviewed and discussed. And it's part of internal quality improvement. It's legally protected
00:04:53.200
under a special clause, so it's under quality improvement so that it's not discoverable in
00:04:59.240
court. And we can have the liberty to discuss things honestly. And it's an amazing conference.
00:05:05.500
My favorite conference as a resident, truthfully.
00:05:08.840
Well, sometimes it was also entertaining, but your eyes would pop out. I remember as a medical student,
00:05:14.880
you listen to these stories. I mean, the Swiss cheese effect of medical errors where everything
00:05:21.340
goes wrong. The perfect storm of how you could, this happened and that happened and the patient
00:05:26.860
ultimately was hurt by it or a near miss. And you see yourself in these situations like, gosh,
00:05:32.740
I could have done that. And I remember as a medical student, my eyes popped out of my head. I'm just
00:05:38.820
thinking, oh my God, can you believe that just happened? And M&M after M&M conference, I would just be
00:05:45.800
blown away. But I was also exhausted. And then as a fourth year medical student, my eyes popped out a
00:05:52.960
little less. And then as an intern, I was just so tired. I would just kind of like raise my eyebrows
00:05:58.740
as I'm half trying to get a nap in. And then as an attending, you'd be totally numb to it, completely
00:06:05.720
numb to, yep, that stuff happens and we should try to do better. M&M is an incredible conference
00:06:11.480
because you hear the discussions of what we could have done better.
00:06:15.240
Well, I think the point that it's not discoverable in a court is also important,
00:06:18.540
at least where we trained. I think at Hopkins, I always felt that it was a very honest conference,
00:06:23.780
meaning I felt that people really went up there and shouldered the blame for mistakes.
00:06:29.800
Without that, it becomes, if this were a conference that were done in a court,
00:06:33.780
you could never get to truth and reconciliation.
00:06:37.320
What I loved about the conference was the intense humility that you would see exerted by these
00:06:43.900
powerful names in American surgery. I mean, giants in the field say, with all honesty,
00:06:49.880
I didn't look carefully enough at the CAT scan before the case. I should have recognized that
00:06:56.680
there was an aberrant artery in that location that I ended up getting into trouble with. I feel bad.
00:07:03.880
I spoke with the patient and you'd hear these incredible moments of honesty. And I thought,
00:07:09.340
that's healthy for the field. When we were residents, the resident presenting, who was
00:07:15.020
often just completely getting fried for things that were out of their control, right? You don't
00:07:19.800
want to blame a nurse. You don't want to blame a colleague. You don't want to blame your weak medical
00:07:23.180
student who dropped something. You try to present it in a neutral way and you jump on the grenade for
00:07:28.540
the team. And I remember specifically, there was a trauma patient who died. And this guy
00:07:35.060
basically was dead on arrival. And there's nothing that we could have done medically.
00:07:40.120
And when I say we as a profession, I was not in the case. But the chief resident felt bad and
00:07:45.380
basically just said, I should have pushed harder. I should have just pushed everybody harder. And I'm
00:07:51.880
thinking, yeah, we need to do our best. But you're beating yourself up in this spirit of it's all about
00:07:57.540
the individual's responsibility. Now we've matured to recognize we need to have safe systems,
00:08:03.440
right? We need to have the chest tubes in the operating room or in the trauma bay so you can
00:08:08.120
get to them quickly. We need to value non-technical skills as doctors, not just the technical skills of
00:08:14.460
doing procedures, but effective communication and inspiring confidence in people around you
00:08:19.820
and organizational skills. We just generally haven't valued that kind of teamwork and communication
00:08:26.240
skills. We've matured now to recognizing that when something goes tragically wrong, we need to ask,
00:08:31.520
how can we do better? But how can the system, how can the hospital be set up differently? How can the
00:08:35.940
NICU be moved to be closer to the labor and delivery ward? How can the elevator be held for
00:08:42.980
the trauma team before they get there so they don't have to wait for the elevator to come down?
00:08:46.800
That's a systems approach. And that is entirely novel in the last 20 years of medicine.
00:08:52.460
What was the impetus for this, Marty? When you think back to before you and I trained,
00:08:56.900
was there a single catalyzing event that somehow just finally took hold literally during the time
00:09:04.920
we were training? Or was it no single event, but rather a gradual progression? And I'll give you an
00:09:11.460
example. I think everybody's familiar with the story of how the 80-hour work week came to be a manner
00:09:17.940
in which residents trained. And that really came out of a singular event. I don't remember the woman's
00:09:21.520
name, Libby something. Libby Zion. A young woman who, God, I don't even remember the story. Other
00:09:26.680
than I think she went to a New York hospital. She was in an ER and a resident took care of her,
00:09:32.260
prescribed her a medication without realizing she was on another medication. I believe it was a
00:09:36.240
psychiatric medication. There was a huge contraindication to this. And I think she died
00:09:41.300
of hyperthermia or something like that. I mean, she had a tragic outcome in the ER as a result of,
00:09:47.440
unfortunately, probably poor supervision on the part of the resident as opposed to fatigue. But
00:09:52.700
it became a rallying cry around residents working too hard and not getting enough sleep. But really,
00:09:57.780
I believe that the family of Libby Zion carried the torch on this. And many years later, obviously,
00:10:02.400
that resulted in the changes with the ACGME. Was there a similar event that precipitated the push
00:10:08.320
to safety? Or was it more an accumulation of events? I would say it was that Libby Zion case. And
00:10:14.940
it happened in 1984. It came to light subsequently. But it was her father who happened to be a New York
00:10:22.000
Times reporter. And it showed for the first time to the world what many of us had known. And that is,
00:10:29.700
you can die not just from the illness that brings you to care, but you can die from the care itself.
00:10:36.200
And that can occur at a rate that may be higher than we appreciate. She was given a medication that
00:10:42.800
should not have been given to her. She had an interaction that should have been recognized.
00:10:48.040
And out of that case came a ruling that you can't have people working 48 straight hours because that
00:10:55.820
was credited to be a root cause. And around that time in the 1990s, when there was attention around
00:11:01.980
this, New York State set up a commission to make sure you don't have people completely sleep
00:11:08.860
exhausted doing procedures and making critical decisions. Then the Institute of Medicine put out
00:11:14.300
a report. In 1999, they issued a groundbreaking report where they essentially reviewed records
00:11:21.720
independently and found that about, in their estimate, 44,000 to 98,000 people a year in the United
00:11:29.800
States die from preventable medical mistakes. Sometimes it was sloppy handwriting. Sometimes it was
00:11:36.840
ordering something that should have been done on another patient. Sometimes it was forgetting
00:11:42.020
something. Sometimes it was the patient falling through the cracks. But they identified what is
00:11:46.680
now known as a preventable adverse event, also known on the street as a medical mistake.
00:11:52.280
And people were blown away. I remember as a resident being told, hey, this report just came out from this
00:11:57.900
giant institution, highly respected Institute of Medicine, now called the National Academy of Medicine,
00:12:03.060
saying that maybe up to 100,000 people a year die, not from the disease that brings them to care,
00:12:08.700
but from the care itself. And there was protest and anger. And the residents were like, this is BS.
00:12:15.100
And within a couple of years, thanks to some big national names, including a pediatric surgeon
00:12:20.580
who's made his career on this topic, and who with a lot of humility talked about how he made mistakes as a
00:12:29.100
surgeon. This really quickly became adopted. Doctors resonated with it, the public. It's almost as if
00:12:35.460
everybody had or knew of a story. And quickly, this Institute of Medicine report put into stone the
00:12:42.380
idea that dying from medical mistakes, if it were a disease, would rank as the eighth leading cause of
00:12:47.740
death. Now, what's interesting is Lucian Leape, one of the co-authors, wrote a dissenting commentary
00:12:53.060
afterwards where he said, look, it's much higher. Look at the methodology that we used.
00:12:57.540
We're just reviewing charts. Not every mistake is documented. And he actually thought it was an
00:13:03.180
underestimate. Let me push back a little bit, not for the sake of pushing back, but just to sort of
00:13:08.280
ask the question in a probing way. So let's say it's 100,000 people die a year in hospitals because of
00:13:15.600
medical errors. Is there any way to determine how many of those are deaths in people who were going to
00:13:22.900
probably die during that admission anyway? So these were accelerated deaths versus people like
00:13:28.700
Libby Zion, who was a young, otherwise healthy woman with an isolated psychiatric illness,
00:13:33.640
probably was not going to die anywhere near that hospital admission. And therefore that admission,
00:13:39.400
so think of it as like people who are on the edge of the cliff for whom the medical error pushes them
00:13:43.860
over the cliff versus people who are 30 feet away from the cliff for whom the medical error picks them
00:13:48.460
up over the fence and shoves them over the cliff. It's a great point. And the study did not distinguish
00:13:53.680
the two. And it's true. Many times, like if you look at the people in the hospital, they tend to be
00:13:57.720
older. And many times the medical error hastened the death, but was really not the primary cause of
00:14:05.100
death. But any medical error that resulted in death, even if it hastened an imminent death, was counted as
00:14:11.640
a medical mistake. And that's, of course, very difficult, as it should be. As it should be. I think back to
00:14:16.620
all of the ones that I saw. I'll just tell one story. I may have even told this on the podcast
00:14:20.400
before at some point, and you'll appreciate it because it was during my intern year very early.
00:14:25.260
So August, I'm guessing, maybe July. It was the first or second month of internship. We were not
00:14:32.240
at the mothership. We weren't at Hopkins. We were out at Sinai, which is one of the satellite hospitals.
00:14:37.740
And you don't necessarily have the same quality of the support staff there. That becomes relevant in
00:14:43.740
the story. So a resident wrote an order for a patient who was in the ICU. She was in the ICU,
00:14:49.900
but going to be transferred out. So she was not ventilated, basically just waiting for a bed
00:14:54.980
to move to the floor. And she was having a hard time sleeping. So he wrote for one gram of Ativan.
00:15:05.600
Exactly. Ativan is a benzodiazepine that would normally be dosed somewhere between
00:15:09.560
half a milligram and maybe two milligrams, maybe five milligrams, right? And that would be in someone
00:15:16.400
who's used to a lot of that medication. What he meant to write was one milligram and not one gram.
00:15:22.040
So he wrote for 1,000 times the dose. So that was mistake number one. Now, almost without exception,
00:15:30.500
any nurse, because this is back in the day when you wrote an order on a paper chart,
00:15:34.580
any nurse would immediately recognize that as an error. But this just happened to be a brand new
00:15:40.960
nurse. And so she took the order from the chart exactly as it was written, as one milligram,
00:15:46.600
pardon me, as one gram, and transmitted that order directly to the pharmacy. Which again,
00:15:52.580
any pharmacist with any experience would recognize that's a supraphysiologic dose that would be enough
00:15:59.100
to kill, I don't know, an entire stadium of people. But again, the pharmacist was also brand
00:16:07.360
new. It was a night shift, maybe putting the new person on at the night shift when there's less
00:16:11.600
action. So the pharmacist sent up all of the Ativan that he had in the system, which was tens of
00:16:18.960
milligrams, and said, look, I'll get the rest later. I have to reach out to another hospital to get it.
00:16:24.700
But whatever it was, here's 20 or 30 milligrams of Ativan, and I'll get the other 970 milligrams
00:16:31.280
later. Which again, should have been a red flag, but it wasn't. And then of course, the nurse
00:16:35.720
administered this dose of Ativan to the patient, who very shortly after stopped breathing. Now,
00:16:42.340
fortunately, this happened in an ICU, and therefore, the nurse was able to see that the patient had
00:16:47.760
stopped breathing, called the doctor, they intubated the patient. And the next morning,
00:16:51.820
she was ultimately extubated and fine. This was a near miss. It's a huge medical error,
00:16:57.020
but it did not result in death. Though had this occurred on the floor, it would have resulted in a
00:17:00.760
death. That story illustrates exactly what you spoke about earlier, which is the horrible Swiss
00:17:06.680
cheese effect of how many pieces can you line up and still fit a pencil through.
00:17:13.860
That's exactly what the Swiss cheese model of medical errors is. It shows how, when we look back and review
00:17:20.620
these catastrophic errors, oftentimes, every single thing is a little off. And what happens is,
00:17:30.380
sometimes we refer to it as a comedy of errors, sometimes we call it the perfect storm,
00:17:34.320
but it happens. So the terminology we're using now is: if it avoids patient harm,
00:17:39.900
it's a near miss. And if it involves patient harm, it's called a preventable adverse event.
00:17:44.420
Let's talk about how things advanced going from the early 2000s until where we are now. What have
00:17:51.300
been some of the biggest advances? And do we have metrics to objectively talk about whether or not
00:17:57.560
improvements have come along? Well, it's amazing. You sit in those M&M conferences and you just start
00:18:02.540
thinking, gosh, that mistake, that lab test was never conveyed to the intern. That patient was
00:18:10.340
getting tube feeds into a tube that did not go to the stomach. It's almost as if you can't reproduce
00:18:16.940
it. You hear something that's insane and you think, gosh, you can't make this stuff up. And every error
00:18:23.880
appears to be unique, but there are certain basic root causes. Oftentimes, the physician is
00:18:28.940
exhausted, burnt out, didn't have the support or help they needed, may have had what we call alert fatigue.
00:18:34.920
Meaning they're being pinged with a lot of unnecessary alerts. Yeah. Such that when a real
00:18:40.320
alert comes along, it's easy to ignore. Exactly. Like you feel like the pharmacy is crying wolf when
00:18:44.880
every time you prescribe something, there's some alerts. So people just click through them. Sometimes
00:18:48.600
you're actually prescribing a therapy for someone and you have to override five or six alerts just to
00:18:55.660
prescribe one treatment path. In 2006, our friend, Peter Pronovost, who was at Johns Hopkins,
00:19:03.900
tackled one form of preventable adverse event, which were central line infections. And basically,
00:19:10.700
we had known for a long time that there was a protocol that, if you use it, reduces the risk of infecting the
00:19:16.440
central line. You put a full length gown on the person, you wash. And just for folks to understand,
00:19:21.980
a central line is an intravenous catheter, but it goes into one of the very deep veins. So typically
00:19:27.020
either a deep vein in the neck called the jugular, a deep vein under the collarbone called the subclavian, or a deep
00:19:31.700
vein in the pelvis or in the groin called the femoral vein. And it's a big deal procedure, both from the
00:19:36.820
risk of infection and the risk of hitting an artery or puncturing the lung. And we saw so many central
00:19:42.740
line complications when we were residents in medicine nationally, not just at Johns Hopkins. Nothing was
00:19:49.600
unique there about it having a higher rate of central line complications, but the lines would get
00:19:55.040
infected, they would get clotted, you had to change them frequently, many people had lines they did not
00:20:00.500
need. So a protocol was developed by Peter Pronovost and the nurses in the ICU to say, look, try to avoid the
00:20:09.140
groin whenever possible, because those femoral lines are more likely to get infected just being down there in the
00:20:15.440
groin, use a full length drape, wash your hands extensively, use sterile technique, wear a mask and
00:20:22.060
face shield. And so we saw this protocol rigidly adhered to and a dramatic reduction in central line
00:20:29.900
infections. And then Peter had a relationship with the Michigan Hospital Association, which then adopted
00:20:35.600
it broadly in an ICU collaborative of dozens of hospitals. And they basically got the median central
00:20:42.760
line infection rate down to just below 0.5, which if you round down is zero. And this news that...
00:20:50.320
And what was it before, Peter? Because I remember it was a huge reduction.
00:20:58.500
A log fold reduction. And it's consistent with what we saw when we were interns. Gosh, taking care of
00:21:04.340
infected central lines was routine. This was celebrated as a major milestone in patient safety. Here is one
00:21:11.400
form of preventable harm. Granted, it's less than 1% of all the preventable harm in healthcare, but we
00:21:16.800
succeeded. We standardized something. We got broad compliance. It was a rapid adoption, not the typical
00:21:23.520
17-year lag between evidence and broad adoption in practice that we see with other things introduced.
00:21:30.340
But I want to add, Marty, there was another change that took place, at least during my tenure, that I have
00:21:36.280
to believe had a significant impact. So when I was a cocky, aggressive medical student, I don't know how
00:21:43.100
I got away with this, but it was the Wild West back then. I was putting central lines in people
00:21:47.800
as a medical student. Now, I never did it without supervision. So I always had a resident supervising
00:21:52.120
me. But it's pretty unusual, I think, for a third or fourth year medical student to be putting in central
00:21:57.240
lines. By the time I finished medical school, I'd probably put in 25 central lines. And in part,
00:22:03.640
that was because I did a stint at the NIH, and I got some expert experience there in the clinical
00:22:07.980
service in oncology. So I was pretty good at putting in a central line. I show up as an intern.
00:22:12.220
I mean, I probably put in 100 central lines as an intern, unsupervised. Again, I'm very fortunate. I
00:22:18.700
probably put in 400 central lines in all of residency. I had one hemopneumothorax that showed up four
00:22:25.740
days afterwards. We never saw it on the original x-ray, so that was my bad to miss that. But I remember by the
00:22:31.520
time I was in my fifth year, that Wild West was gone. Interns were not allowed to put in central
00:22:37.880
lines. Only the second year resident in the ICU was putting in central lines, and they were doing
00:22:44.020
it under the supervision of the ACS in a fluoro lab such that you could immediately get an image
00:22:50.020
right after. Now, I don't know if that procedure stuck, and it sure seemed a little aggressive given
00:22:56.340
how I came up, but I got to believe it was for the best. Do you know how that sort of played out,
00:23:01.360
and what the protocol is now at a teaching hospital for central lines? We used to joke that I'd put a
00:23:06.080
central line in somebody in the parking lot if they looked at me wrong. Well, it's amazing how good you
00:23:11.140
could get at that technique, and the students that had a broad experience before they started their
00:23:16.060
residency were definitely more efficient. They would get the job done more reliably. The fact that you did
00:23:21.520
so many before your residency probably explains why you were one of the most highly sought after
00:23:26.680
and regarded residents in our Hopkins program. I had no idea what my line infection rate was.
00:23:32.720
All I knew is I wasn't causing pneumothoraces, but I guess my point is really, I would guess that
00:23:38.100
complications like pneumothoraces also went down with this change. So it wasn't just that the work
00:23:44.820
that Peter and others did fixed line infections. I think it brought a greater appreciation to the
00:23:50.860
seriousness of this procedure, and I'm wondering, did anybody follow up and say, hey, guess what?
00:23:55.860
The rate of pneumothorax went from 1% to 0.1%. Do we have any insight into that?
00:24:00.980
So what you're describing is the move towards dedicated teams. By the time we finished our
00:24:07.340
residency, they had a central line team, and then that matured into a
00:24:13.180
dedicated team. And then it turned into a rule that you really were not supposed to put in central
00:24:17.540
lines at all. You were only supposed to have the dedicated team do it. And they were so freaking
00:24:21.740
good because they were doing just that. And then they started using ultrasound. So it's not like you
00:24:27.160
take 10, 20 pokes until you aspirate blood. And so that usurped all the success of this protocol
00:24:34.280
that Peter Pronovost introduced. And that team, of course, used the protocol. So it wasn't for naught.
00:24:40.360
I mean, the fact that we could conquer something like central line associated infections that nobody
00:24:46.700
thought you could ever tackle and the cost savings and the avoidable harm associated with that, that
00:24:51.720
was a major milestone in patient safety. And then two years later, after the Pronovost publication on
00:24:59.060
the central line toolkit or bundle, which within three years was becoming standard in many ICUs around
00:25:04.980
the country, Medicare decided they're not going to pay for a catastrophic medical mistake, what we call
00:25:10.240
a never event. Something that should never happen regardless of the circumstances. You should never
00:25:16.240
leave an instrument or a sponge behind during surgery unintentionally.
00:25:20.860
And what year was that, Marty, that Medicare said that?
00:25:23.060
2008. That was a major step. Up until that time, the financial system in medicine was, in effect, rewarding it:
00:25:29.780
if you had to go back and do an operation to remove a retained foreign object, you were paid
00:25:35.120
for that procedure as well. And Medicare basically said, why are we incentivizing? Why are we rewarding
00:25:40.280
this financially? Let's just agree to not pay for this stuff. And then other insurance companies
00:25:45.340
started to follow, and now it's accepted. That's on the hospital if you have a catastrophic medical
00:25:51.020
mistake. And then in 2009, the WHO organized a committee to address patient safety. At this time,
00:25:59.260
patient safety was the hottest thing in healthcare. And we were recognizing, again, people die from the
00:26:04.480
care, not just from the disease. I had just published at Hopkins the surgery checklist with
00:26:10.980
Peter Pronovost as my mentor, and we put out a bunch of articles. And the WHO basically said,
00:26:16.360
we're convening people interested in patient safety. We'd like to invite you to present about
00:26:20.580
your checklist. I presented it. Atul Gawande was chair of the committee. He was not that interested in
00:26:26.780
the idea initially, warmed up to it. Wait, didn't he write a book called The Checklist Manifesto or
00:26:31.640
something like that? He eventually saw the great story in the checklist and wrote a book. But
00:26:36.500
initially, he actually presented a competing idea at the WHO, which was something called the Surgical
00:26:42.760
Apgar Score, which nobody adopted. It was discarded. People thought it was a dumb idea. It was not risk
00:26:49.380
adjusted. Baby's born, and you do a rapid test that predicts the baby's survival. And this
00:26:54.660
has been an old school score that used to be done on babies. And it was the idea that you could do a
00:26:59.600
rapid assessment in a matter of seconds and assess a prognosis. And he just loved that concept. It was
00:27:05.020
a great story. And he thought we should do that for all surgical patients. Well, people were saying,
00:27:10.100
hey, if you have a breast biopsy, it's different from having a heart bypass. Like, you've got to adjust
00:27:15.040
for the severity of the surgery. But he just loved the idea of a rapid assessment. The committee voted
00:27:20.880
unanimously against his proposal. I remember Atul was frustrated. And he thought, I don't know if I'm
00:27:26.980
going to continue as chair of this committee. And then the committee said, you know, Atul,
00:27:32.100
the checklist is simple. Pilots already do it. Look at the success in aviation. It's low budget for our
00:27:38.940
WHO committee to adopt a checklist to go up on the operating room wall. And the committee loved it.
00:27:45.960
So it became the initiative. Our checklist became known as the WHO surgery checklist. And to this
00:27:52.020
day, it hangs on the operating room walls of most operating rooms in the world. And eventually he did
00:27:57.340
a study. I was a part of the study showing how it reduced adverse events and had an impact. So that
00:28:04.580
was another moment in patient safety, in 2009. What year was that? Not to derail us, but a story that
00:28:12.760
got a lot of attention, which was a heart transplant at Duke, where they failed to do a cross-type
00:28:19.580
or crossmatch? Yes. Do you remember that story? Yep. So that was a major milestone in patient safety. And a lot
00:28:26.160
of good came out of the lessons learned there. So they were doing a heart transplant on a young girl
00:28:30.980
at Duke University. Which is just to put in perspective, we're talking top five places in
00:28:36.280
the world you would ever have a heart transplant would be at Duke University. Yeah. Cardiac transplant,
00:28:40.860
they're definitely top three or four, in my opinion. So they did not check a cross match,
00:28:47.220
which just for listeners, you take the blood of the donor and the blood of the recipient,
00:28:51.540
and you see what happens in the lab. Does it result in sort of this hyper accelerated allergic
00:28:56.560
reaction, if you will, that you can see in the lab? If so, that's what we call a hyper acute rejection
00:29:02.540
signal. You abort the transplant. It's done routinely. It's a standardized procedure. Somehow-
00:29:08.480
It's done before even a blood transfusion. That's right. So in other words, I guess people should
00:29:12.760
understand this. If you were getting a blood transfusion and your blood type is A positive,
00:29:17.400
it's not enough to go to the blood bank and say, just give me any old bag of A positive. They still
00:29:22.280
have to do a cross match. They still have to take a bit of that blood that should match with yours and
00:29:27.880
do that test, shouldn't they? I don't honestly know the protocol for blood transfusions,
00:29:33.660
but certainly organ transplants, something I'm very close to. Absolute 100% routine. As you can
00:29:40.460
understand why, I mean, there's nothing worse. I've never seen it, but I know of surgeons who have,
00:29:45.280
you put an organ in a recipient, you sew it in, and all of a sudden the organ fails right in front
00:29:51.580
of your eyes. You see swelling, you see this sort of ischemia. That's why we do a cross match. Well,
00:29:57.280
the cross match was not checked. Unfortunately, the heart failed. The university, the hospital, the
00:30:03.120
doctors, of course, felt terrible, did everything they could to prioritize her as a status level one,
00:30:10.220
high priority, the highest priority to get a second heart transplant. They attempted a second
00:30:15.400
heart transplant. The transplant failed, and it was a tragic case all the way around.
00:30:20.900
It's insanely tragic because the woman died, and then you could effectively argue two other people died
00:30:26.640
who didn't get a chance to have the heart that would have worked for them.
00:30:30.420
That's right. The opportunity missed. So it was over, what? What kind of cost are we talking about?
00:30:37.620
Well over a million dollars, but even more concerning the lost years of life in a young,
00:30:45.160
promising human being. So you had now this recognition.
00:30:49.780
And how did the mistake happen, Marty? What did the autopsy show? Like, obviously,
00:30:53.900
this is something that gets done before every transplant and presumably almost never gets
00:30:58.980
missed. Where was the Swiss cheese on that one? What went wrong to prevent that cross match?
00:31:03.940
It turns out that a nurse in the operating room sensed something was not right. There was this feeling
00:31:12.920
within this nurse that something just was not correct. And what she was alluding to was that
00:31:19.800
they had not done that cross match. And she did not feel comfortable speaking up. She had the
00:31:26.500
thought, hey, wait a minute. What was the cross match? Did not voice that concern or voice it to
00:31:33.160
the appropriate head of the operating room, the surgeon, and felt terrible about it. And it created
00:31:39.280
this notion that- Do we know why it wasn't done? I can understand that maybe her spidey sense tingled,
00:31:45.080
but like, why wasn't it done? Something that is so routinely done, do we know why it wasn't done?
00:31:51.100
I don't know if it's known, but I've certainly seen patients go to the operating room in my career
00:31:56.780
where something should have been done beforehand and it wasn't. And I'm sure you've seen that.
00:32:01.500
So the idea of creating a culture of speaking up or an atmosphere in the operating room where people
00:32:07.880
feel that there's collegiality, teamwork, and they would feel comfortable voicing a concern,
00:32:12.900
that no longer was a soft science. It now undermined a gigantic operation in a young girl
00:32:21.620
and had catastrophic consequences. So all of a sudden, standardizing what we do became more of
00:32:28.060
a science. That was another major step. And then the ubiquitous nature of medical errors got documented
00:32:34.740
in a 2014 Mayo Clinic study where in a survey of 6,500 doctors, 10.5% of doctors surveyed say
00:32:45.000
that they had made a major medical mistake in the last three months.
00:32:51.220
10.5% of US doctors report that they made a major medical mistake in the last three months.
00:32:58.280
Now, I might have felt like that in the lowest points of my residency, but I was surprised when
00:33:04.460
I saw that. Now, it may have been caught, what we call the near miss, but it did sort of democratize
00:33:10.640
the idea of, hey, if you felt like you've done something like this, you're not alone. It's actually
00:33:16.000
sort of part of this crazy life that we have as doctors where you're getting pulled in all these
00:33:20.440
directions and there's pressure and stress. And that's assuming everything at home is fine.
00:33:25.200
People are dealing with external pressures. And then in 2015, Mass General Hospital had a study
00:33:32.620
done by researchers there, which Mass General is embarrassed by. They've taken down the study
00:33:38.280
from their website. Most of their studies get put out in a communications sort of press release.
00:33:44.420
The link doesn't work anymore, but the study showed that about one in 20 medications administered
00:33:49.720
in the operating room involved an error. And that meant that about 50% of operations had a medication
00:33:57.920
error. So every other operation has some medication error, most caught, but they did this sort of in-depth
00:34:05.220
analysis of 277 operations at Mass General, not a small shabby chop shop.
00:34:12.140
One of the three best hospitals in the world, certainly in the United States.
00:34:15.220
As they like to call it, man's greatest hospital. I would say to sort of round out the history of
00:34:22.000
patient safety, the modern history. In 2016, we put out a report from my Johns Hopkins research team
00:34:27.980
that said, okay, we've been citing this 1999 Institute of Medicine report that about 100,000 people a year
00:34:34.780
die from medical mistakes. Has that number changed in the last 25 years? What's the updated number?
00:34:42.460
Let's look at all the more recent studies. So we did a review and we showed that the number had a broad
00:34:49.220
range, and the median point of that range was 250,000 deaths, which would just surpass the current number three
00:34:56.840
cause of death, stroke, and would put it after cancer and heart disease, which are far higher, 650,000 a year.
00:35:04.620
Medical error, if it were a disease, would rank as the number three cause of death using this estimate.
00:35:09.820
Now it's not a perfect estimate. We didn't do autopsies on every death. We don't have good numbers,
00:35:14.900
but we basically said, look, if we were to update the number, it might even be higher,
00:35:19.360
but the CDC does not collect vital statistics on medical errors because you cannot record a death
00:35:26.680
as a medical error because there's no billing code for error. And that's how we record our national
00:35:32.100
vital statistics. They use the billing code system.
00:35:34.440
Now, but it could also be a lot lower. So how would we put a 95% confidence interval around that?
00:35:40.480
125,000 to 350,000. Now, John James put out a study just before that said it was 400,000. Now that was
00:35:47.740
the most highly cited study up until that time. And he would argue that our estimate was low. We didn't
00:35:53.620
do any original research. We basically pooled together the existing studies, which are not perfect,
00:35:59.200
but we're just trying to bring attention to it. And as Don Berwick commented on the study,
00:36:03.340
whether it's the third leading cause of death or seventh or ninth, it's a major problem.
00:36:08.660
So there was a lot of discussion, heated discussion about this estimate, this review article. A survey
00:36:15.940
was done. A third of doctors believe the estimate. A third didn't know, and a third didn't believe it.
00:36:23.080
And we had this kind of heated discussion or spirited discussion for about a year. And then
00:36:28.640
the opioid epidemic hit. And opioids emerged as the number one cause of death in the United States
00:36:35.800
among people under 50. Opioid deaths were a form of medical error when they were prescription opioids.
00:36:42.700
I'm guilty of it myself. I gave opioids out like candy. I feel terrible about it. That is a form of
00:36:48.740
medical mistakes. We just, this year, surpassed 100,000 opioid deaths in a trailing 12-month period
00:36:57.700
for the first time. Yeah, 107. So are you saying that that estimate of, call it 300,000 deaths,
00:37:05.160
is including that 100,000? No, this was prior. But the 107,000 deaths in the last 12-month period
00:37:12.880
were any opioids. So a lot of that now is fentanyl. Heroin, it would be included. That's right.
00:37:19.880
So we don't know the estimate of prescription opioids. We think it's down because we've gotten
00:37:25.360
smart. Prescription opioid abuse is probably way down because it's more regulated. And fentanyl-laced
00:37:32.020
products are driving a lot of the opioid deaths now. But we were prescribing, Peter, let's say,
00:37:37.860
mid-career for me. One opioid prescription for every adult in the United States. That's how much
00:37:45.280
we were giving out opioids. People didn't need it. It was the medicalization of ordinary life
00:37:51.120
for some people with mild pain. A lot of countries said, look, we only give opioids to people with
00:37:59.000
end-of-life cancer and acute major surgery in the perioperative period. I remember giving a talk in
00:38:06.820
Lebanon. And I remember offering, hey, we're doing a lot of work on reducing opioid prescribing. I'm
00:38:12.020
happy to give the opioid talk at this conference. And they're like, what are you talking about? That's
00:38:16.880
an American problem. We've never prescribed opioids outside of extremely narrow scenarios.
00:38:23.480
Complications of unnecessary medical care, normal complications of unnecessary medical care, are a form
00:38:30.160
of medical error. And that's where we really tried hard to say, let's broaden the idea that people
00:38:36.200
don't just die from disease. They die from the care itself. So that's a bit of our journey in
00:38:43.200
patient safety, which really encompassed our residency, Peter, up until recently. That has been
00:38:49.320
the modern era of patient safety. Before then, no one would ever talk about it. Now, when you're on
00:38:55.260
rounds, people say, you know, we could give a blood transfusion, but the patient is kind of a borderline
00:39:01.900
indication in terms of whether or not they should get a blood transfusion. But we have to consider
00:39:07.940
the fact that one in 80,000 blood transfusions can result in the wrong blood type being passed on from
00:39:17.000
the lab and hurting a patient. We never considered the role of human error in the care of our patients.
00:39:24.160
But now we're like, hey, do we need to keep people in the hospital for a week after surgery?
00:39:28.900
There's the added risk of falling, a new environment, tripping over your gown,
00:39:34.920
wearing slippers that are very slippery and they don't make sense, they're uncomfortable,
00:39:38.540
and getting an infection in the hospital. There are risks to being in the hospital that we have to
00:39:43.000
weigh against the benefits. Where does nosocomial or hospital acquired infection rank in the causes of
00:39:50.280
medical errors? Nosocomial infections specifically you're talking about? Yes.
00:39:54.920
It's difficult because some people consider any infection after surgical care to be a nosocomial
00:40:02.920
infection, but not all are preventable. So there was the study out of...
00:40:07.420
So like even a wound infection after surgery would be considered nosocomial?
00:40:11.600
That's right. It would be, but it's not necessarily preventable. So nosocomial,
00:40:16.420
meaning you're getting it from the hospital, may not necessarily be preventable because we're not
00:40:20.880
going to eradicate bacteria from planet earth. And there's a feeling that say with knee replacements,
00:40:25.700
we're pretty darn good. You get a knee replacement, the risk of an infection is eight
00:40:29.760
tenths to nine tenths of 1%. Pretty darn good. Now, what should it be? We don't know. We may be at the
00:40:36.560
baseline level. They're wearing spacesuits doing the procedure and using sterile technique. I mean,
00:40:42.220
they're even using glue over the closure to seal the incisions now. So maybe this is the level that we
00:40:48.420
have to accept. Now there's a debate in patient safety. Some people say we have to achieve zero harm
00:40:54.480
and you'll hear that model a lot. I worry about that. That sounds a little bit like zero COVID to
00:40:59.760
me, which is... Trigger word. It creates sort of an unrealistic expectation. It might detract from
00:41:05.880
focusing on bigger things. Your example of knee replacement is a good one. Orthopedists have really
00:41:11.340
figured out how to do joint replacement in the most sterile manner imaginable. I'm kind of curious as to
00:41:16.840
what the bigger opportunities are that are away from this. Patients falling through the cracks,
00:41:22.000
normal complications of interventions they don't need. There still are medication errors,
00:41:27.380
but they're not from sloppy handwriting anymore. They're from a lack of visual cues in the patient's
00:41:32.960
chart. So now you're entering an order. You don't have a binder in front of you with the patient's
00:41:39.380
name and you know exactly whose chart you're in. You're flipping screens. You're in different tabs and
00:41:45.120
you write an order for somebody who didn't need it or the wrong person or something like that.
00:41:50.340
This happened to me actually about a month ago. So we use an electronic medical record in our
00:41:55.240
practice and I was in one patient's chart looking at a bunch of labs and looking at a bunch of things
00:42:02.220
and we had just switched to a new EMR. We used one EMR for many years and then we just switched to a new
00:42:09.520
one which has a completely different look. And when you switch to a new patient, it's not entirely
00:42:15.780
obvious. Yeah. Scary. Again, nothing came of it because I wasn't there to prescribe a medication,
00:42:21.280
but I was blown away at how long it took me to recognize that I was in another patient's chart.
00:42:30.020
Yeah. And you do prescribe through this EMR. So there is a scenario by which I could have said,
00:42:36.220
and again, in our practice, this is pretty low stakes because we're not prescribing that many
00:42:40.660
things. But I take your point, which is you always used to know what Mrs. Smith's chart looked like
00:42:46.140
because it was the biggest one and you recognized your handwriting in it and you had all of those
00:42:50.320
other cues that told you where you were. I mean, that was a pretty miserable system and had a lot
00:42:55.400
of problems with it, but that's one thing it had going for it over an EMR.
00:42:58.580
And it was cyber secure. The old fashioned docs who had very good handwriting, you can think of
00:43:07.020
probably one that we know, Charlie Yeo. They are basically saying, look, we have a good system.
00:43:12.280
People need to write more effectively. Another healthy movement that came out of this patient
00:43:17.520
safety endeavor has been the idea that sorry works. And what drives malpractice claims is your
00:43:24.920
honesty with patients, not whether or not you make a mistake. And I found that to be true in my
00:43:30.520
practice, that if you're very honest with people, they're incredibly forgiving. I remember ordering
00:43:35.760
a CAT scan. I was busy. It was between operations. It ended up getting done on a wrong patient, similar
00:43:42.780
name, done on the wrong patient. I don't know if I mixed up the names or the nurse made a clerical
00:43:48.420
mistake in entering the order because we do a lot of verbal orders, you know, as attending physicians.
00:43:53.420
And this patient was already angry at me. They had a pancreatic leak. They just were frustrated with
00:44:00.940
their care. I think their expectations were unreasonable, but of course, you got to be
00:44:04.580
polite. So this guy was already pissed at me. I figure, great. He just got a CAT scan he didn't
00:44:10.600
need. It's very obvious he didn't need it. He was recovering. Now he's going to sue me or something.
00:44:15.740
I immediately hear about this. I run up to the patient's room and I say, look, sir, I want to tell you
00:44:21.240
something. You got a CAT scan you did not need. It was not intended for you. I'm not going to sugar
00:44:26.700
coat it and say we wanted to make sure and look for something. It was a clerical mistake. I take
00:44:31.860
full responsibility. I'm sorry. If you want the results, I haven't even seen it yet. I just heard
00:44:37.600
about this and I wanted to tell you first, I'll get the results and share the results with you.
00:44:41.960
This guy who had a pissed off look on his face as I walked into his room,
00:44:45.940
smiled and looked at me and said, doc, thank you for being honest with me. I really appreciate
00:44:53.440
that. And our bond grew. He developed trust in me. And I'm proud to report that guy and I are
00:45:02.360
Facebook friends today because he never sued me. And people are hungry for honesty. We saw it during
00:45:08.680
COVID. We see it with so many aspects of medicine. Let me share with you a counterpoint to that story.
00:45:14.180
I'm not telling something that isn't already publicly known. So a very close friend of mine
00:45:18.900
here in Austin, his name is Eddie Margain, a wonderful guy. And I've gotten to be very close
00:45:24.240
with Eddie. And one night over dinner, we were talking about this and somehow he brought up the
00:45:28.700
story of his wife, Lorena. At the time I didn't know this, but of course she's written a book about
00:45:32.800
this. So again, nothing I'm saying here is not already publicly known. Lorena was having some
00:45:37.480
medical issues and had kind of the big workup. And sure enough, they found that she had a mass on her
00:45:42.120
adrenal gland. So the adrenal gland for listeners is a small but incredibly important gland that sits
00:45:47.380
on top of the kidney. So you have two kidneys and therefore you have two adrenal glands, one on top
00:45:51.360
of each. The adrenal gland produces all sorts of relevant hormones, but certainly the ones that we
00:45:55.900
think of the most would be cortisol, epinephrine, norepinephrine. And she had this tumor on her adrenal
00:46:01.840
gland. And obviously the treatment for this is to have it removed. And you can live with one adrenal
00:46:05.200
gland. So this is a relatively straightforward operation. So she had the operation. This was here in
00:46:10.800
Austin. And in the weeks that followed, she went from bad to worse. She just felt horrible. She
00:46:17.620
couldn't understand what was wrong. To make a long story short, she ended up eventually going back to
00:46:23.160
see the doctor only to discover Marty that he had taken out the wrong adrenal gland. He had removed her
00:46:27.940
healthy adrenal gland. And the one with the tumor was still there. So now they needed to go back and have
00:46:33.280
that one removed. And so now she is a person who has no adrenal glands, which creates a lifelong
00:46:38.480
challenge. You can't live without your adrenal glands. So now you are dependent on exogenous forms
00:46:45.260
of glucocorticoids. The story gets even more difficult because there were more surgical
00:46:50.040
complications and things like that. Lorena is about the sweetest person you'll ever meet,
00:46:54.880
not a negative vindictive bone in her body. And she only wanted one thing. It wasn't money. She just
00:47:02.520
wanted an apology. And the surgeon wouldn't give it. You hear these stories and you understand
00:47:09.620
the reputation that the field of surgery can sometimes bring on itself. He had a million
00:47:14.880
excuses. The egos. The arrogance, the hubris. He could not bring himself to apologize to this woman.
00:47:23.080
And the lawyers don't help either because many times the hospital lawyers who make a lot of policy
00:47:28.560
for doctors, they've made a lot of COVID policies that have been driven by hospital lawyers,
00:47:32.600
general counsels of businesses. They tell you oftentimes, don't talk to them at all.
00:47:39.180
That's right. Don't talk to the patient. Don't talk to the family.
00:47:41.520
We're going to quickly negotiate a settlement and we're going to gag everybody. In the settlement,
00:47:46.620
they rush to the family. Oftentimes before the family even thinks about a claim,
00:47:50.720
they realize what happened. They rush to the families and say, you know, we feel for you and
00:47:54.720
your family. They won't apologize. And we would like to provide some compensation. Here,
00:48:00.360
just sign these documents and then everyone's gagged for life. And we have not had an honest
00:48:04.360
conversation about patient safety in America because of that. And that's why I wrote when I
00:48:08.600
wrote the book Unaccountable about this issue of patient safety and how we can do better.
00:48:14.780
I wrote in there that we should ban all gagging in medicine. This should be an honest profession,
00:48:20.040
no gagging. Lorena and Eddie are incredibly successful, well-off. She said out of the
00:48:25.680
gate, we're not here to sue. There's no amount of money you can give us that's going to change our
00:48:30.360
lives. We just want to make sure this never happens to anybody again. And that's an honest
00:48:35.340
request. I mean, it's reasonable. If you look at other industries, they've achieved high levels of
00:48:41.600
reliability. I'm too practical. I'm a clinician as you are. So I don't subscribe to the zero harm
00:48:49.460
approach. I mean, sure, it might be a goal, but we have to be honest and look for reasonable
00:48:54.340
improvements in this problem. But look at aviation. In the last 25 years, how many plane
00:49:01.420
accidents have we seen? In 2009, there was the flight going to Buffalo where 50 people died.
00:49:07.900
That's about it in the last 20 years. In 2018, there was a woman partially ejected through a window
00:49:13.520
who died. But say in the interim nine years, 2009. In the US. In the US, in the nine years from 2009
00:49:22.480
to 2018, 6 billion passengers flew without a single fatality. Today, about 2 million people fly each day.
00:49:31.820
Pilots are not just jumping in the cockpit and barking orders at each other. They have a systematic
00:49:37.220
way to use checklists and pathways and have safety nets. And they've created what they call crew resource
00:49:44.020
management that encourages people as part of the discipline to voice any concern about safety and
00:49:50.640
not to ridicule anyone who brings up that concern. That's a life lesson that can be used in any setting
00:49:56.380
that you want people to ask questions and even challenge some deeply held assumptions you may have
00:50:02.880
without ridiculing them. If you make fun of them once, I found, if you mock a nurse once or yell at
00:50:09.380
them for bringing something up because you're busy, they will never feel as comfortable voicing a concern
00:50:14.840
to you. And your patients suffer. You suffer from that lack of safety culture. So Marty, we wouldn't be
00:50:21.240
probably even having this discussion if it weren't for a case that has gained a lot of notoriety slash
00:50:27.060
awareness over the past year and certainly in the past couple of months since the verdict on
00:50:32.800
this trial. But let's assume that a lot of people aren't familiar with this case from Vanderbilt.
00:50:37.120
Can you walk us through in detail the timeline of events? If my memory serves me correctly...
00:50:45.040
All right. So take us back to that fateful day in 2017.
00:50:48.220
So this is an amazing story on so many different levels. RaDonda Vaught is a 36-year-old nurse
00:50:57.040
who was hired at Vanderbilt in 2015. Now, Christmas Eve, 2017, she was taking care of a patient named
00:51:07.040
Charlene Murphey, a 75-year-old woman who was admitted for a subdural hematoma, or a brain bleed,
00:51:14.080
actually improved quickly. And two days later, she was ready to leave and they ordered, the doctors
00:51:21.000
ordered sort of one last scan while she was in the hospital. The nurse, RaDonda, took her
00:51:26.620
to the scanner, and there was some Versed ordered, which is a sedative to help...
00:51:36.700
This was ordered in the ICU before she came down or who ordered the Versed?
00:51:41.820
Presumably the physician who was caring for her. And the nurses will often say, look,
00:51:45.820
can I have an open order for Versed if I need to use it in the CAT scanner? And every now and then,
00:51:50.180
the radiologist will order it while they're down there. They'll say, hey, this person's having
00:51:54.360
trouble staying steady. Can we get a Versed order? I mean, how many times have you and I said yes to
00:51:59.660
that request? So it's a commonly used medication in that scanner. She goes into a system, relatively new
00:52:07.740
system that's got automated dispensing. There have been many complaints that there are too many alerts
00:52:14.340
and you often have to override the system because there was not good coordination between the electronic
00:52:21.720
health records and the pharmacy. So in this system, you frequently had to override alerts.
00:52:28.380
Well, she types in VE to order the Versed and up comes, maybe through a default, Vecuronium,
00:52:37.680
which is a paralyzing agent. It's a potent paralyzing agent and gives it to the patient.
00:52:44.240
So just to be clear, so she's typing in VE, it auto-populates VEC, Vecuronium instead of VE-R,
00:52:51.400
Versed. She doesn't realize this. She clicks on it. What is this? Like a little mobile Pyxis device
00:52:57.520
that she's traveling with? I'm assuming she's in the radiology suite when this happens, or is she doing
00:53:02.180
this back in the ICU? I'm not sure the location, but it's clear that she typed in VE,
00:53:07.640
Vecuronium comes out. Now, she had to override the alert. There was an alert and the Vecuronium
00:53:15.940
came up as a powder when most people would know Versed is a liquid. But there are other things that
00:53:23.480
come up as powders and you just have to inject some saline to suspend the powder. That was a
00:53:28.980
warning flag. You know, we talk about the Swiss cheese model. She reportedly was distracted
00:53:34.160
and she suspends the powder into a solution. The cap should routinely... And by the way,
00:53:41.600
just for folks to understand, why would Vecuronium even be there? It's really only something that can
00:53:47.240
be used when a patient is either in surgery and they're fully anesthetized and on a ventilator
00:53:53.700
or in the ICU under the same conditions. There's no other need for Vecuronium other than in a patient
00:54:00.260
who is on a ventilator. That's right. So presumably because this patient was in the ICU, I mean,
00:54:05.280
because otherwise you shouldn't even have Vecuronium in the Pyxis system, right?
00:54:08.800
That's right. And the cap routinely has on the cap emergency drug warning. It has a little warning
00:54:15.020
on the cap. So there are a bunch of these sort of... And it had that? Did it have that?
00:54:18.500
It routinely has that. I want to be very careful with my words. I didn't see documentation that that
00:54:24.320
one had it, but it routinely has that as a standard thing. Now, because this is a
00:54:30.100
potent paralytic agent, it immediately paralyzed the patient and she died. Now, they were not in the ICU to
00:54:34.920
immediately resuscitate the patient. It's a tragedy. The woman was 75, otherwise going to go home.
00:54:42.820
Vanderbilt had documentation where two neurologists listed the cause of death as basically the brain
00:54:51.220
bleed. And it was deemed essentially a natural cause of death. This was reported to the medical
00:54:57.960
examiner. And... How is that even possible? The woman was presumably wide awake when she went down
00:55:04.060
to have one more scan before leaving the hospital? That's right. She dies on the scanner and the cause
00:55:12.040
of death was stated as cerebral hemorrhage or subdural hematoma? That's right. So the family was told what?
00:55:20.340
She just died on the scanner? The family has been gagged and basically is not speaking about the
00:55:26.900
case. Although one family member told the media that they want to see the maximum penalty for her.
00:55:33.380
And the grandson said that the woman who died would have forgiven the nurse. Now, the nurse
00:55:40.620
immediately feels horrible, says exactly what she did, recognized her mistake. As the patient was
00:55:48.080
deteriorating, felt this, I may have caused this, and admitted, reported this whole thing, was 100%
00:55:56.760
honest. I mean, in an incredible way, has even said subsequently that her life will never be the same,
00:56:03.120
that she feels that a piece of her has died. I think we've all been a part of medical mistakes
00:56:08.900
where we still think about that. The medical examiner does not investigate the case because the
00:56:14.600
report is a brain bleed. So in other words, the death certificate, which is usually filled out
00:56:20.680
at that moment. Let's walk people through how this works, Marty. Because again, you and I take
00:56:24.120
this for granted because we've done it a thousand times. I don't think people understand how this
00:56:28.460
works. So this woman stops breathing in the MRI scanner. I assume it was an MRI, not a CT if they
00:56:35.500
were trying to sedate her, but whatever it was. So they declare her dead there, or maybe they would
00:56:41.640
take her up to the ICU and try to resuscitate her further. But certainly within minutes,
00:56:45.680
she's going to be declared dead. Do we know if they intubated her and kept her alive for a little
00:56:50.440
while longer until they declared her brain dead? I don't know the timing of the death, but they
00:56:56.000
attempted a full resuscitation, right? As soon as you recognize this, anybody who's found
00:57:01.780
to be unresponsive like this and desaturating would immediately be resuscitated.
00:57:06.100
The point is at some point, when they cease to resuscitate her and/or when they declare end of
00:57:14.120
life, usually a resident fills out a death certificate at that point. Now, I don't know
00:57:19.040
how many of those things I've filled out in my life, but it's a very painful process.
00:57:25.500
No, no, no. Nobody wants to be the one to have to sign the death certificate and fill it out.
00:57:29.080
You have to be very careful in what you write on it because it wants a primary cause of death,
00:57:33.740
a secondary cause of death, and you have to write on the right line. And it has to be,
00:57:38.120
I remember it has to be, I don't know, maybe this is done electronically now, but
00:57:41.640
certainly at the time, I remember having to redo many of these things. They'd get sent back to me
00:57:46.800
for months and months and months until I got it just right. But somebody had to write on that
00:57:50.680
death certificate, subdural hematoma is the proximate cause of death.
00:57:55.200
Well, two Vanderbilt neurologists issued this report and this came up later. The medical examiner
00:58:01.880
down the road, two years later, changes the cause of death to accidental. They get tipped off.
00:58:10.200
By a report that comes out. It'll become clear as I progress. Basically, within a month,
00:58:16.680
Vanderbilt, this is per investigative reporting by the Tennessean, quote unquote,
00:58:21.340
the Tennessean reports that Vanderbilt takes several actions to obscure the fatal error from the public.
00:58:29.180
Okay. It was not reported to state or federal officials. That's required by law. You've got
00:58:34.300
to report it to the state and you've got to report it to CMS, Center for Medicare and Medicaid Services,
00:58:41.400
Report any death or any death that is deemed accidental?
00:58:44.360
Any, what we call a sentinel event, which is a death related to a clearly preventable adverse
00:58:50.680
event. They've got to be reported. This was also not reported to the Joint
00:58:55.240
Commission. You could argue that's an accrediting body. It's private. You can break their rules.
00:58:59.440
It's not a violation of the law. But Vanderbilt basically takes these actions to hide or obscure
00:59:06.340
the error, according to the Tennessean, from their investigation. They fire the nurse, and
00:59:17.120
Vanderbilt immediately negotiates an out-of-court settlement with the family,
00:59:21.740
gags the family from saying anything about it. Everybody is gagged in the family,
00:59:26.180
except for the grandson, who is legally not included in the gagging. He ends up speaking
00:59:30.700
up later. The hospital, when they're asked subsequently about the case, say, oh, we can't
00:59:35.940
discuss it because of this legal settlement that we have. By the way, they don't say anything...
00:59:44.080
So just to make sure I understand, we're a month after this woman dies. The death certificate
00:59:49.060
and the neurologists all agree she died of subdural hematoma. But clearly, the family has been told
00:59:56.000
the truth, which is why they're receiving a large settlement and asked to sign a gag order.
01:00:03.620
That's right. The nurse, RaDonda Vaught, gets a job at another hospital as a bed coordinator,
01:00:11.580
which all hospitals have bed coordinators. It's a hospital called Centennial in Nashville.
01:00:15.920
And then you go all the way to the fall in October. Remember, this happened Christmas Eve. So
01:00:21.980
you're almost a year out. An anonymous person reports to the state and CMS that there was an
01:00:30.380
unreported medical error. Okay. They basically got tipped off by some anonymous whistleblower.
01:00:36.260
Then the Tennessee Health Department, having received the tip, formally states that they're deciding not
01:00:43.080
to pursue any action on this tip-off. The agency actually said in a letter that the event did not
01:00:51.000
constitute a violation and therefore they're not going to do anything. Now, just as an interesting
01:00:57.900
side note, many of these state medical boards are basically sleepy organizations. If you know the
01:01:03.700
story of Dr. Death. Tell the story. Neurosurgeon in Texas, multiple horrific catastrophic outcomes,
01:01:09.920
all believed to be sentinel events, catastrophic, avoidable medical mistakes, negligence, and people
01:01:16.040
dying in his practice over many years, documented by Laura Beil in the famous podcast. Maybe at one
01:01:23.880
point it was the most popular single podcast in the entire world of podcasting titled Dr. Death.
01:01:30.920
And basically everyone knew of this doctor's problems. The residency program knew, but they just
01:01:37.240
graduated him to sort of get rid of him. He had problems at numerous hospitals. Nobody would say
01:01:42.120
anything. And this kind of speaks to this problem of the old culture of patient safety. Finally,
01:01:47.640
there's something so egregious that happens that he gets arrested and goes to jail. Now, the state
01:01:52.600
medical board didn't want to touch it for the longest time. That's typical of state medical boards.
01:01:57.360
They generally don't want to touch anything. Now, ironically, you can be Dr. Death and kill people
01:02:03.060
and they don't touch it. But if you prescribe ivermectin, all of heaven and earth is coming
01:02:08.840
down on you. Now, I don't believe ivermectin has any activity against COVID. I should just state that.
01:02:14.440
But it has no downside. And I don't recommend it. But I mean, right now they're going after
01:02:21.220
people who prescribe ivermectin with warnings. And they want your hide if you prescribe ivermectin.
01:02:29.180
Just an irony. So they basically say that she does not violate the statutes and/or the rules
01:02:36.100
governing the profession. They put out a statement, Tennessee Department of Health.
01:02:41.180
Yep. This is almost a year after the event. And can you imagine what she's thinking? Like,
01:02:45.960
you just want it to be over. Things are escalating.
01:02:49.480
Did she ever speak to the family? Was she permitted to apologize to the family?
01:02:53.560
She was under orders by Vanderbilt never to speak to the family. But she had said through
01:03:00.200
the media several times that she takes full responsibility. She even said in her trial that
01:03:06.280
she was 100% at fault. Which is, I think, beating herself up over something that was probably a
01:03:11.960
combination of her mistake and a system failure. But I'll leave my commentary out of this.
01:03:17.140
So CMS starts investigating. Medicare, when they take this seriously, this whistleblower complaint,
01:03:25.520
they do a surprise investigation at Vanderbilt. End of October, early November,
01:03:31.640
they spend about a week investigating Vanderbilt. They are pissed. Vanderbilt clearly did not report
01:03:38.760
this. Clearly a violation. And they get so serious about this, they basically conclude that the
01:03:47.660
medical error was not reported, in violation of the rules. And they threaten all Medicare payments to
01:03:55.820
the institution, to Vanderbilt. They are serious. And that's when this becomes public, about a year
01:04:02.760
after the event. Because Vanderbilt would not discuss it. But a journalist was able to get a
01:04:09.220
hold of this report from Medicare. Because it's a public document. It's a public agency.
01:04:14.820
Did they have to request a FOIA? Or did they just get it on their own?
01:04:18.740
That was not through a FOIA. That was public information. But no names were in there. So
01:04:22.520
RaDonda Vaught is still basically not a name in the United States at this point. Now, CMS told Vanderbilt,
01:04:30.060
if you cannot show that you have taken system-wide actions to prevent this in the future,
01:04:36.900
we are going to suspend all Medicare payments to Vanderbilt University Medical Center.
01:04:41.380
If you talk about a threat, it's maybe the biggest threat in healthcare in the modern era.
01:04:45.460
Vanderbilt gives CMS a so-called plan of correction. You know, here's what we're doing.
01:04:50.480
We're taking this seriously. And they don't release that to the public. A journalist then
01:04:56.340
got that plan of correction through a FOIA request, Freedom of Information Act request.
01:05:02.340
Tried to get it from Vanderbilt directly, but they were denied. Then on February 19th,
01:05:07.520
the name RaDonda Vaught became public information when she was arrested for reckless homicide
01:05:15.320
and abuse of an impaired adult. Now, that's when people found out about what happened.
01:05:22.600
Just to make sure I understand that, she was arrested?
01:05:26.780
Tell me how that happened. So a DA saw the case and said, we're going to press charges?
01:05:33.280
That's right. The district attorney in Davidson County basically said, we're going to go after her.
01:05:41.040
Let's stop there for a second. How often does that happen, that a medical mistake happens
01:05:46.700
and a district attorney presses criminal charges against the doctor or the nurse or technician or
01:05:53.480
whoever is involved? I have been in this field of patient safety my entire career. I've never heard
01:05:59.320
of it with a nurse. I have heard of outright fraud resulting in arrests. For example, the doctor
01:06:07.120
death story. There was a doctor in Michigan who was giving chemotherapy to people who didn't have
01:06:12.020
cancer. I mean, that's sort of cold-blooded fraud. If you exclude two types of cases, if you exclude
01:06:17.800
fraud, so all financial crimes, and if you exclude doctors who are raping patients,
01:06:24.960
where they're just outright breaking the law, I'm talking about a medical error that was not made deliberately.
01:06:32.080
Never heard of it. Never heard of an arrest for an honest medical mistake. In fact, one of the
01:06:37.540
principles of patient safety that we have been advocating throughout the entire 23 years
01:06:44.380
of the patient safety movement in America has been the concept of just culture, which is a doctrine
01:06:50.720
which says that honest mistakes should not be penalized. They should be penalized if there was
01:06:58.720
malintent or substance abuse or somebody should be suspended from their role if they are an ongoing
01:07:04.800
threat. But honest mistakes should not be penalized. And that is a doctrine that has enabled people to
01:07:10.800
speak up about this epidemic of medical mistakes in the United States. And that has been celebrated as
01:07:17.520
the sort of giant milestone of the American patient safety movement. And it's a worldwide concept.
01:07:22.820
I've traveled the world and people believe in the just culture doctrine. The arrest of RaDonda
01:07:30.020
Vaught undid, in my opinion, 23 years of advancement in patient safety; it undermined the very fundamental
01:07:38.680
doctrine of just culture. She was arrested. By the way, in documents that
01:07:46.400
subsequently came out, she had immediately admitted what happened at the moment this woman died, and throughout
01:07:53.240
and ever since and to this day. And I've had a recent interaction with her, I can touch on that.
01:07:58.840
But basically, the victim's family, one of the members of the family, had basically said the
01:08:04.380
patient would have forgiven her. So the trial started when?
01:08:07.680
Right now, we're about a year after. But because of COVID, the trial doesn't start until...
01:08:14.120
Three months ago, March 21st to 25th, about a four-day period. So in the interim,
01:08:20.300
there was a meeting of the Tennessee Board of Licensure, basically the Department of Health.
01:08:24.500
Remember, they had said they're not going to pursue this. They then flip. The executive at
01:08:30.180
Vanderbilt University, C. Wright Pinson, who's actually a pancreatic biliary surgeon, I know him.
01:08:36.360
He sort of admits to this board that looks into Vanderbilt and says, yes, the death was not
01:08:43.460
reported, essentially, I'm paraphrasing, and that our response at Vanderbilt was too limited.
01:08:48.740
Now, at this point, RaDonda Vaught is getting a lot of national attention, and she's got big
01:08:53.940
legal bills, and she goes on a GoFundMe campaign, raises over $100,000, and basically says in the
01:09:00.740
GoFundMe campaign that, look, she made a mistake, and she needs legal costs. I mean, this woman could
01:09:05.100
not have been more honest about what happened. Also, around that time, nurses nationwide take
01:09:11.600
notice. There's millions of nurses in the United States. They start getting very emotionally
01:09:17.380
connected to this. They start showing up at some of these hearings in front of the Department of
01:09:22.480
Health, and they say, I am RaDonda. That becomes a slogan that nurses around the country take on.
01:09:29.500
They put it on social media. They stand outside, hundreds of them, around the time of her trial
01:09:35.040
with signs, I am RaDonda. Basically saying what you and I were saying. Every doctor, every nurse I
01:09:42.820
talked to, I was talking with Zubin Damania, same reaction. I see exactly what may have happened.
01:09:50.060
Gosh, that could have been me. Look at the study from Mayo Clinic. 10.5% of people admit to a major
01:09:55.700
medical mistake in the last three months. People connect with RaDonda Vaught. Several dozen people
01:10:02.120
are out at every appearance. She makes her court plea in February of 2019, just about a year after the
01:10:09.360
incident, a year and a month. She pleads not guilty. Now, her lawyers argue that Vanderbilt
01:10:14.500
shares part of the blame. Now, several months later, the Tennessee Department of Health, which said
01:10:20.700
they're not going to pursue action against her, they flip. They reverse their position, and they go after
01:10:26.720
her. And they use the argument that they must immediately investigate what they describe as a
01:10:33.840
threat to the public. Her lawyer, knowing that they're going to go to trial for the criminal case for
01:10:38.840
murder or homicide, he asks the judge to postpone the Tennessee Department of Health hearing
01:10:45.860
because he sees... Wait, I'm sorry, Marty. I just missed something. I don't think I was paying
01:10:50.600
attention. This was homicide, not manslaughter? Homicide. This is homicide. Reckless homicide and
01:10:58.300
abuse. Now, she has two legal proceedings ahead of her about a year after the
01:11:05.680
incident, a year and a half out. She's got the Tennessee Health Board, and she's got the criminal
01:11:10.980
case to go. So her lawyer says, look, Tennessee Health Board, they're acting like a bunch of clowns.
01:11:16.420
I'm paraphrasing. They said they're not going to take any action. And then over a year later, they
01:11:21.560
suddenly reverse their position. What's going on? So he makes this argument, and the Tennessee
01:11:27.900
Department of Health says, and this is very fishy, no, we must do this immediately. We cannot postpone it
01:11:33.800
until after the criminal trial because she may pose an, quote unquote, urgent threat to the public.
01:11:39.100
You can't believe what you're hearing here. The administrative judge, Elizabeth Cambron,
01:11:44.500
decides not to delay her Department of Health hearing, and it goes ahead of her criminal hearing.
01:11:50.660
And she ends up going in front of this board. At the same time, Vanderbilt is just hanging out,
01:11:58.820
arguing they can't say anything about the case. This Tennessee investigation says that they've
01:12:04.440
obscured the circumstances of her death. And this grandson is so frustrated, he makes a statement
01:12:12.040
around then saying that Vanderbilt is engaged in a cover-up. And remember, he's not under the gag order. He says,
01:12:18.440
quote unquote, that there's a cover-up that screams.
01:12:24.180
COVID comes and hits this country. In case you don't remember, that's the coronavirus that resulted in
01:12:32.060
two pandemics, a tragic pandemic which killed about a million Americans, and then a subsequent
01:12:38.160
pandemic that followed called a pandemic of lunacy. But in July, finally, they get their trial.
01:12:45.280
The first one is the Department of Health. She says at the Department of Health hearing,
01:12:49.360
this is completely my fault. Her license is revoked, even though the board says things that we would
01:12:56.940
sympathize with. They say, the vice chair of the board says, we all make mistakes. And there have
01:13:04.360
been many mistakes and failures in this case, suggesting basically that Vanderbilt has part of
01:13:09.780
the blame. But they say, our role is just to evaluate the role of the nurse here, and they revoke her
01:13:15.540
license. Kind of ridiculous what their statements are. Then three months ago, it goes to the criminal
01:13:21.600
trial. And the Davidson County DA, Glenn Funk, has his three assistant DAs go to the mat in court. And
01:13:31.420
they aggressively and viciously went after her. These three assistant DAs, Debbie Housel, Chad Jackson,
01:13:39.120
and Brittany Flatt recently became assistant DAs. It's kind of a new job for them. And they go
01:13:46.720
viciously after her and argue that there was negligent homicide. Now, she does everything she
01:13:55.620
can to try to defend herself. Now, what's their argument? Their argument is this was such an egregious
01:14:01.400
error. I guess I'm just trying to understand how this is homicide. Maybe I just don't understand the
01:14:06.040
law well enough. But if you kill somebody while you're driving, let's assume you're not under the
01:14:11.880
influence of alcohol or anything like that, and you're not driving recklessly. You're driving safely
01:14:16.440
and you kill a cyclist. I'm not aware of a driver in that situation having... I certainly know this was
01:14:24.740
the case in California when I lived there, but I know that there was no instance in which a driver who
01:14:29.100
killed a cyclist faced criminal charges unless there was reckless behavior involved or alcohol.
01:14:35.360
So what rises to the level of even manslaughter, vehicular manslaughter is presumably what? Is
01:14:42.120
that when you're driving recklessly and another person dies as a result of it? I guess I'm just
01:14:47.340
trying to understand what the DA's argument was here, legally, and then separately, politically.
01:14:53.420
I don't know if you can speak to either of those, of course. These are broader questions.
01:14:56.720
Those are the same questions I had. I'll tell you what I know, and that is that she was charged with
01:15:01.560
quote-unquote negligent homicide and abuse of an impaired adult and found guilty of both of those
01:15:08.100
charges. Now, in the arguments that they made, they had cited 10 mistakes that she had made,
01:15:14.120
and it was kind of the Swiss cheese model that we had talked about with patient safety. This is
01:15:18.620
the perfect storm, if you will. She was distracted. She overrode the warning alert,
01:15:25.820
even though nurses at that hospital say that they do that every day. Nurses said every day they
01:15:31.980
override alerts, that it was a powder and not a liquid, that the cap should have said it was a
01:15:38.420
paralyzing agent. There are so many things they point to. You can frame somebody, you can make
01:15:43.580
somebody look like they are doing something egregious. Can you imagine if they had the insights that we
01:15:49.080
have at our M&M conferences? It would just look really bad from the outside. They did everything they
01:15:55.180
could to paint her that way. These are aggressive young lawyers. Now, Glenn Funk, who's the DA, was getting a lot
01:16:01.560
of attention around this time because this is his office that was bringing the charges against
01:16:05.940
a Vanderbilt nurse for a medical mistake that was an honest mistake that she admitted to immediately.
01:16:12.140
He had two other DA candidates who were running against him condemn this, saying, you know, this is a farce.
01:16:18.060
What's going on? Something is fishy here. There are rumors, conspiracy theories in Nashville that
01:16:24.840
maybe there is some entity behind this oddly aggressive action against this nurse, a competing
01:16:32.160
health system, Vanderbilt University itself to bring attention away from its error and not reporting and
01:16:38.640
other errors related to this case. I don't know. I have no opinion on any of those, but those are
01:16:43.880
definitely circulating ideas because to have a DA so aggressively go after a nurse for an honest
01:16:50.020
mistake with such a significant charge. It is odd. It is odd. Now, she was found guilty and sentenced
01:16:58.880
very recently. And in the sentencing, she was convicted of homicide. That's right. Found guilty,
01:17:05.340
negligent homicide. And in the sentencing, what was the possible range of sentences she could receive?
01:17:13.160
I know what the sentence was, and we'll talk about that. But coming out of the trial, what was the
01:17:18.060
potential? The judge had considered three years of jail time, but of course, the judge could have said
01:17:23.680
whatever the judge wanted to say. They could have said 20 years or a lifetime. Negligent homicide
01:17:30.200
is not something where I think there's a ceiling on how many years you can give somebody.
01:17:36.420
Did legal experts have a point of view on what was expected?
01:17:40.640
I've not heard any experts comment on what was expected. I think at every stage in this entire case,
01:17:47.100
people expected the thing to end. The DA would say, she's been through the wringer now. We're going
01:17:53.880
to basically slap her on the wrist and do a settlement or something like that. Never happened.
01:17:59.240
And so as this grows, nurses around the country are finding they connect with her. A bunch of letters of support come in.
01:18:15.600
So the judge was merciful to give her three years of probation. And so there'll be no jail time for her.
01:18:23.100
But she's a convicted felon for the rest of her life.
01:18:25.660
Well, not for the rest of her life because she got something called judicial diversion, which means
01:18:30.520
that they can expunge her criminal record if she serves the probationary period on good behavior.
01:18:37.000
So, you know, an act of mercy from God there. And God, I'm not referring to the judge. I'm referring...
01:18:44.840
So the prosecution, I'm sure, was very upset with that sentence. It sort of undermined a lot of their...
01:18:50.460
Yep. Here is what one Vanderbilt physician, you know, these letters of support started coming out
01:18:56.080
to the public. Here's what a Vanderbilt physician wrote. And I think this Vanderbilt physician speaks
01:19:00.140
for many of us. He said, referring to the nurse he worked with, RaDonda Vaught: we cared
01:19:06.740
for so many patients together. What was notable was the consistent high level of
01:19:12.080
attention I saw her provide to so many of our patients and their families when we worked together;
01:19:18.840
she was very conscientious and aptly cared for many complex patients. All these letters of support
01:19:25.800
of people she worked with at Vanderbilt come out. Lots of Vanderbilt physicians pissed off at what's
01:19:30.580
happening. They're not happy that their impeccable medical care is getting characterized nationally by
01:19:36.840
the actions of their administration. Here's what the DA's office did in response to these letters. They put out a letter that read:
01:19:45.840
I am sickened by those who rallied around her as a hero. I thought she was a horrible anomaly,
01:19:52.880
but now I think there are hundreds of thousands of nurses who must also be dangerous practitioners
01:19:59.740
since they defended the indefensible so readily. That was Lisa Bergelko. She is an assistant professor
01:20:08.780
at Newman University. She wrote that letter in support of the DA's prosecution and the DA put that
01:20:14.120
letter out in the public domain almost as a... And who is this professor? She's a professor of what?
01:20:19.420
She's a professor of nursing. She's a nurse herself. So this is the saga that we live with now. And in
01:20:28.280
my opinion, we have had decades of progress in patient safety, about 23 healthy years of
01:20:34.320
significant improvements in the culture of safety and the way we approach safety, undone with a single
01:20:41.720
group of young assistant district attorneys who decided to go after one individual to the exclusion
01:20:49.780
of doing anything about a hospital that, unlike the nurse, did not admit to anything initially and
01:20:57.380
broke the law. What do you think is the fallout of this? Have you spoken with nurses or doctors in
01:21:05.580
the interval since the conviction explicitly about this? And do you have any anecdotal evidence that
01:21:14.040
that's going to change the culture of reporting and open and honest dialogue around medical mistakes?
01:21:20.980
There's a preliminary statistic that one in five nurses in the profession are quitting during the
01:21:27.320
pandemic. Now, some of that is pandemic burnout. Some of it's a number of factors. But a lot of nurses are
01:21:33.520
leaving the profession. And there's this feeling that they don't feel valued. And this has been a bit
01:21:40.960
of a smack in their face. And so hospitals around the country who are dealing with real critical
01:21:45.780
nursing staffing shortages are trying to pay attention to the concerns that nurses have about
01:21:51.680
this case. They're trying to make it clear that this is not their approach. I have talked to lawmakers
01:21:58.220
at the state level in different states who are thinking about passing protections for nurses.
01:22:03.520
to try to encourage people in nursing. If you look at the protection that police officers have,
01:22:09.860
they have an immunity intrinsic to their jobs. And should that immunity be extended to people like
01:22:16.140
RaDonda Vaught? It's delicate, but this is now a new conversation that has surfaced. I also had an
01:22:22.900
interaction with her. And that is that she had reached out for help to our friend Zubin. And Zubin had
01:22:30.200
passed that information on to me. Now, I get so many of these requests, you know, I've been unfairly
01:22:36.120
sued, or I'm going to court, or I have a deposition. Can you help? I honestly just did not see the email.
01:22:43.540
I felt horrible once I saw that this blew up. And Zubin had pointed out to me that he had sent me this
01:22:49.120
email. And I reached out to her and just basically told her, like, if I can be an expert witness or help,
01:22:55.440
I'm happy to do so on your behalf. So I found her to have an incredible spirit, good attitude.
01:23:03.660
I feel bad for her. She was crying at the trial when she was found guilty. But that is my interaction
01:23:10.400
that I've had with her. Maybe to put a bow on all of this, if you're someone listening to this,
01:23:15.140
and you're thinking about how you can interact with the healthcare system, it seems that the majority of
01:23:20.280
medical errors take place inside of hospitals. Is that a fair assessment? Yes. It makes for a
01:23:26.280
frightening experience when you're going to a hospital. Because usually if you're going to a
01:23:29.760
hospital, even if it's electively, you're going to have an elective surgery. Or, you know, you're
01:23:34.280
going there non-electively, which is even more frightening. The medical side of it is bad enough
01:23:38.860
in terms of what you're worried about and what could happen. But I think this discussion we've had
01:23:43.620
over the last, you know, whatever 90 minutes speaks to another threat that might even rival that
01:23:49.960
threat. My personal view is it's less than that, but we'll never know that answer potentially.
01:23:54.640
What can a person do if they or their loved ones are going to be admitted to the hospital,
01:24:00.160
either electively or emergently, to reduce the odds of any of these medical errors? They run the
01:24:07.520
gamut from incorrect medicine administration to unnecessary procedures. There's no end to what
01:24:14.820
these mistakes look like. Is there anything that the patient or the patient's family can do to reduce
01:24:18.620
the risk of that? There's a lot. So first of all, things are much better, in my opinion. Hospitals are
01:24:24.620
safer. There's more awareness. When you bring up these questions or issues, there's attention to them.
01:24:29.380
Every hospital has a patient relations department. And if things just don't seem right, if you feel that
01:24:35.820
you're not communicating effectively with your care team, you feel care is not coordinated,
01:24:41.540
you have a concern or there was an error, you can call the patient relations department. They've got
01:24:46.800
somebody on call 24-7. That's basically a standard thing in the hospitals now. It's important to have
01:24:53.180
an advocate with you anytime you get medical care. You've got a loved one in the hospital. It's amazing
01:24:59.500
how it seems that the care is just overall much better, holistic, comprehensive, and coordinated
01:25:06.520
when there's a family member or loved one there. Could be a friend, but they're there taking notes.
01:25:12.680
They're asking questions. When the team comes in for rounds, they're asking to talk to the doctor who's
01:25:18.360
in charge of their care at least once a day. They sometimes set an appointment where they say,
01:25:23.820
look, I'd love to talk to the doctor. And you can communicate this often through the nurse or the
01:25:28.540
nursing assistant to say, is there a time I can plan to be here where I can speak with the doctor
01:25:33.760
caring for so-and-so? It's important to ask about alternatives. We've generally had this sort of
01:25:40.240
burnout-mode response to questions in medicine as residents, where if a patient asks any question,
01:25:47.720
you just tell them they could die if they don't have something done and you don't get into the details.
01:25:53.380
And it's like, look, we've got to just load and unload the trucks. If we're told this
01:25:58.420
person needs a CAT scan on rounds, we're supposed to see that it gets done. And we may not have a good
01:26:05.040
breadth of knowledge as a young trainee of the alternatives. Some people do a good job. Other
01:26:11.260
people may not be able to present those options. Well, it helps if you ask the right questions and ask about
01:26:17.820
alternatives. So for example, you're supposed to go down to have a filter put in your large vein in
01:26:25.740
your body called a vena cava. And you might say, wait a minute, you know, typically we decide on
01:26:31.360
rounds, this gets done. The intern explains, hey, the doctors want to put a filter in. They may or may
01:26:37.040
not even explain it. Sometimes patient transport just shows up to take you down there and the
01:26:42.300
patient's not really in the loop. You know, they're getting medications. They don't even know what
01:26:46.720
they're getting, what's getting infused, what they're taking by mouth. The more they can be aware of
01:26:51.760
what's happening, ask about the reason for those and the alternatives, the better the care is going
01:26:58.660
to be. And that's a hard ask though, Marty. Medicine really is a foreign language. And I
01:27:04.060
think back to when I was a trainee, I'd like to think I was pretty good at explaining things, but
01:27:10.420
you're laying there in a bed, you've got an IV, a nurse is coming in and usually putting something
01:27:16.080
into an IV or giving you a little cap with pills in it. Patients are really intimidated to say,
01:27:21.800
can you tell me what each of these pills is, and what each one of them does?
01:27:26.000
We now have a protocol, if you will, that our nurses are supposed to explain to the patient
01:27:32.660
every medication that they give them. So let's say it's time for your twice a day
01:27:38.720
Lasix medication. Lasix is a medication that's given to move body fluid from the third
01:27:46.620
spaces in your body into your urinary system. So if you've got too much fluid in your body,
01:27:52.120
it'll cause you to urinate some of that fluid out. So the nurses will actually explain to the patients,
01:27:59.080
I'm injecting some Lasix. This is a medication to address the swelling in your body
01:28:05.960
and it will cause you to urinate more. And so this is actually a big effort right now in patient
01:28:12.220
safety. And we actually had a protocol for a while where we had one of our doctors, actually Peter
01:28:18.480
say on the closed circuit television in the patient rooms, ask us questions, ask about the medication
01:28:25.180
that's being given to you. You should know what it is and what it's for. And you should ask your
01:28:32.060
doctor, nurse, or whoever walks in the room if they've washed their hands. And it became this
01:28:37.400
sort of partnership where we want you to ask, hey, have you washed your hands? Before it was kind of
01:28:42.040
like, how dare you ask me? Of course I washed my hands. Of course we didn't always do it. But this
01:28:48.100
is the sort of new dialogue that we are trying to promote to make the patient a participant in their
01:28:53.760
care and not just a bystander. And when you do it, what I've noticed, the more educated they are
01:28:59.780
or their surrogate, the better the care is. Many times they just say, wait, wait, wait, this does
01:29:05.660
not make sense, what we're doing here. They were supposed to have this and this, why not do it at
01:29:10.840
the same time? This doctor wants to do this and this. And so I do see improvements, a change in the
01:29:17.080
culture, an awareness, and this effort to educate people. And the more people can do it, I mean,
01:29:23.320
you are in the middle of a very complicated system of care when you're in the hospital.
01:29:27.720
I mean, the more you can be aware of what's happening, the safer the care.
01:29:31.960
What's the biggest thing that has to change or biggest three things that have to change
01:29:36.220
to be sitting here in 10 years and say, we've cut that medical error death rate down by 50%?
01:29:43.820
Payment reform, number one. So there's not really a great financial incentive for better safety.
01:29:48.900
If a hospital is safer, what is their financial reward? We know there's an altruistic
01:29:54.760
moral reward. And we know people generally like that, but a lot of times the CFOs are making the
01:30:00.100
decisions. They want to see an ROI and you bring in something to a hospital that say is going to reduce
01:30:05.620
the number of misses in radiology. Let's say there's a software program that will take a second
01:30:13.100
look at chest X-rays and chest CAT scans to look for lesions that the radiologist missed. And if
01:30:20.040
it's identified by the AI, which can pick up lesions pretty sensitively now, but that lesion is not
01:30:26.500
noted in the report by the radiologist, that's a discrepancy. This is all digital; computers are doing
01:30:33.320
all of this. They run the AI, then scan the reports for the keywords that there's a tumor, lesion,
01:30:38.260
coin size lesion, and they can reconcile in our systems whether or not there's a discrepancy.
01:30:45.160
And by the way, what I'm describing is a real product out there that's used at Sutter Health.
01:30:50.240
AI is used to look at the scans as sort of a second check. The same thing has been done with EKGs.
01:30:56.060
And then they look for discrepancies in the reports. And if the AI picks it up and the report doesn't,
01:31:02.180
then that patient goes on a list of unreconciled differences that's sent to the radiologist, and
01:31:07.920
the radiologist reviews that list of non-reconciled differences between the AI and the report.
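The reconciliation loop described here can be sketched as a simple batch job. This is a hedged illustration only: the study fields, keyword list, and function names below are hypothetical, not the real Sutter Health product or its API, and the keyword match stands in for whatever report analysis a real system would use.

```python
# Hypothetical sketch of an AI "second read" reconciliation for imaging studies.
# Finds studies where the AI flagged a lesion but the radiologist's report
# did not note one; that worklist goes back to the radiologist for review.

KEYWORDS = ("tumor", "lesion", "nodule", "mass")  # illustrative finding terms

def report_mentions_finding(report_text: str) -> bool:
    """Crude keyword search standing in for real NLP on the report text."""
    text = report_text.lower()
    return any(k in text for k in KEYWORDS)

def unreconciled_studies(studies):
    """Return IDs of studies with an AI/report discrepancy.

    Each study is a dict: {"id": str, "ai_flagged_lesion": bool, "report": str}.
    """
    return [
        s["id"]
        for s in studies
        if s["ai_flagged_lesion"] and not report_mentions_finding(s["report"])
    ]

studies = [
    {"id": "CXR-001", "ai_flagged_lesion": True,
     "report": "No acute cardiopulmonary abnormality."},
    {"id": "CXR-002", "ai_flagged_lesion": True,
     "report": "Coin-sized lesion in the right upper lobe."},
    {"id": "CXR-003", "ai_flagged_lesion": False,
     "report": "Clear lungs."},
]

print(unreconciled_studies(studies))  # → ['CXR-001']
```

Only CXR-001 lands on the review worklist: the AI flagged it, but the report language doesn't mention a finding, which is exactly the discrepancy the workflow is meant to surface.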
01:31:14.100
Now, what is the ROI to the hospital on adopting that technology? Zero. Negative. It's a cost that's
01:31:22.840
not rewarded. And so what we've done is we've relied on the values of executives to adopt technologies
01:31:31.300
that they believe in. Many times the doctors are the champions for this. The head of radiology says,
01:31:35.920
you know, I know this is not going to be great for our bottom line, but we're doing well financially.
01:31:40.440
Let's adopt it. Let's be honest. Many hospitals had their most profitable year last year. And some
01:31:46.600
hospitals' cash reserves are so great that they have investment arms. And they're
01:31:52.280
basically hedge funds with hospitals on the side. At this point, some of these medical centers,
01:31:57.840
they have so much money in cash. So we rely on individuals and innovators to say,
01:32:03.680
there's no formal ROI that you're going to see on the bottom line immediately,
01:32:08.360
but we believe this is better care. And you're seeing that adoption very sporadically and very
01:32:15.420
haphazardly. There's a lot of the patient safety innovations that make sense, but they have a tough
01:32:21.180
time getting in. So we've got to change the payment model. That I think is the number one.
01:32:25.740
But I thought you said that CMS was already saying, we're not going to reimburse for
01:32:30.920
cases where there are errors. That's a stick more than it is a carrot. But has that changed the
01:32:38.040
culture? That has changed the culture, but it's only not reimbursing three specific types of errors,
01:32:45.700
three of what we call never events: death of an ultra-low-risk person
01:32:53.220
in the operating room at the time of surgery; a retained sponge or other retained foreign object;
01:32:59.600
and an airway never event, because nobody should die from the lack of an airway.
01:33:05.000
So these are very narrow events. They're rare events. And so yeah, CMS is not paying for it,
01:33:10.580
which has put a ton of attention on these issues. And the reporting to the state on these issues
01:33:16.080
has created a ton of scrutiny around these events. And those events are, I mean,
01:33:20.460
the counting process we do now coming out of surgery is intense. It started off when we were
01:33:26.420
residents like, yeah, I think we got all the sponges and instruments out. Okay. And then it
01:33:31.460
went to the nurse. Do we have all the sponges and instruments out on the set? Yeah, we've got them
01:33:37.140
all. Then it was count them to make sure it's the same number we started with. And then it was a formal
01:33:42.620
count that was recorded. And now it's an RFID or barcode scan system. And so we've
01:33:50.440
matured a lot with that. That's because of this intense scrutiny around this particular type of
01:33:56.640
mistake. Now, if you overprescribe opioids after an uncomplicated vaginal delivery, I mean, OB doctors
01:34:03.860
will tell you, you should not be giving opioids after an uncomplicated vaginal delivery. And yet
01:34:08.360
women will go home with a bottle or other minor procedures. And so if you prescribe 30 opioids,
01:34:15.320
when we know best practices would never allow more than 10 opioid pills in a narcotic naive adult,
01:34:22.580
that's an error, but are we even measuring it? Now, Hopkins just began the measurement and data
01:34:29.960
feedback process for that type of error. I wish there was more attention. And if I could say one more
01:34:35.620
thing, I probably shouldn't, but what the heck? During COVID, we saw this intense bias towards
01:34:44.880
laboratory research. That the only real serious type of research is laboratory research done under a
01:34:52.000
hood in a laboratory at places like the NIH. And that's how we solve disease. That all this other
01:34:58.580
stuff, the stuff Marty's interested in, systems change, standardizing processes, that's soft stuff,
01:35:05.080
culture speaking up. And that's not really science. And so what you have is all of our health agencies
01:35:11.680
really entirely focused on laboratory medicine. And what happens is you get young investigators,
01:35:18.240
faculty, they can't get grants to do research on this stuff. They're not rewarded. They don't get
01:35:24.840
promoted. They're told by their department directors they've got to have a lab or do something lab-related.
01:35:30.000
There's one small government agency that funds this kind of stuff called the Agency for Healthcare
01:35:34.940
Research and Quality, massively underfunded, fair amount of cronyism in how they fund their grants as well.
01:35:40.300
But during COVID, we wanted to know the behavioral aspects. How does it spread? When are people the most
01:35:47.560
contagious? Do masks work? None of those questions were answered with good evidence. Instead, we had
01:35:53.760
massive efforts going on in the lab. Appropriately, it's not downplaying that. We need that, but we need
01:35:58.880
both. And so the NIH and CDC never did a study to say, is it airborne or surface transmission? Instead,
01:36:07.920
they let that debate linger in the public domain for months from January until April, letting people
01:36:14.500
argue, opine on TV. They could have done that study in 24 hours. Natural immunity, cloth masks,
01:36:21.560
N95 masks, the reduction in transmission. All those studies could have been done immediately.
01:36:26.900
They didn't do them because they were entirely focused on laboratory pathways and blocking and medications
01:36:34.040
and pharmaceutical solutions. We need those, but you saw it come at the complete exclusion
01:36:40.160
of basic clinical research. And we see the same thing with patient safety. That bias towards laboratory
01:36:47.240
research is hurting us badly. And as you know better than anyone in the United States and the world,
01:36:53.720
where's the NIH research for food as medicine and the inflammatory state and environmental exposures that
01:36:59.780
cause cancer and school lunch programs? Instead, we're talking about bariatric surgery and throwing
01:37:05.180
insulin at people and second-line antihypertensives. Where's the science of sleep medicine at the NIH?
01:37:12.180
So these are the giant blind spots in our current national funding mechanism. And patient safety is one
01:37:18.500
of those blind spot areas. Still, I'm surprised, I guess, based on the recent reports over the past five
01:37:24.420
years that it still remains kind of in a blind spot. Because if you just looked at it through the lens of,
01:37:28.240
even if it's the eighth leading cause of death and not the fourth leading cause of death,
01:37:32.880
that would still be enough presumably to justify a more systems-based approach to
01:37:36.840
the problem solving. Now, I guess I will say this, it's a very different type of research.
01:37:41.880
And it's not really the type of thing that they've mastered. There's a well-understood playbook for how
01:37:48.060
you go from idea to grant, funding cycle, results, publications, et cetera, within the sphere of
01:37:55.300
the type of research that they're currently funding, both translational and basic and clinical
01:38:00.280
for that matter. But this is different. I don't know. I got to be honest with you, Marty. I don't
01:38:03.300
come away from this discussion particularly optimistic that either the system is going to
01:38:08.840
get that much better or that an individual can do much to protect themselves. I feel like you or I,
01:38:14.900
if we're in the hospital with a friend or family member, I think we're lucky because we really know
01:38:19.020
what questions to ask and we can probably reduce the damage potential by a little bit. Not entirely.
01:38:25.980
I think back of the case of this woman who died at Vanderbilt. I mean, even if that was my grandmother,
01:38:30.820
it's unlikely I would have been in the scanner with her. I would have been waiting back in the ICU.
01:38:34.940
There's nothing I could have done to have prevented that mistake. And so that's what I'm kind of curious
01:38:39.580
about is like, where is the innovation there? What makes it impossible to give vecuronium to a person
01:38:48.180
who is not intubated? That's kind of what I want to understand. And you might say, Peter,
01:38:52.560
that's not the mistake worth creating an enormous system around because that only occurred 10 times
01:38:56.400
last year in the United States. We got to worry about the one that killed 50,000 people last year.
01:39:01.540
The movement is formalized into a group called the Institute for Healthcare Improvement,
01:39:06.760
which was started by Don Berwick. And he is a hero of patient safety. He has spoken at every
01:39:12.860
major medical center, probably in the United States, talking about the culture of safety and
01:39:16.860
all these issues. We talk about safety on rounds. And now almost every hospital has a chief quality
01:39:23.880
officer. And their job is to oversee these root cause analyses. That's routine now for any sentinel
01:39:31.420
event. If the hospital is honest, which most are, our hospital doesn't let things slip just because they
01:39:38.720
settled with a family who had a 75-year-old parent die in a scanner. It doesn't matter where or when,
01:39:45.960
if there's a catastrophic or sentinel event, it's going to get a root cause analysis at Johns Hopkins.
01:39:51.400
I think that's the case at most hospitals. But to have a C-suite level executive focused only on
01:39:57.020
quality and safety within an institution, I think that's progress. I mean, we're seeing now
01:40:01.400
safety used in a constructive way when we decide, hey, there's too many patients hanging out in the
01:40:09.020
hallway and broom closet in the emergency room. That's not good for patient safety. It is now
01:40:14.980
part of that conversation. So I am a bit optimistic at the direction. Hospitals are also sitting on
01:40:21.360
tens of millions of dollars of surplus now every year, many of them, you know, not half of the
01:40:26.900
rural hospitals and not all hospitals. But what do you do with that money? When you're a non-profit,
01:40:31.800
you've got to reinvest it into something. And so you're seeing more willingness now to invest in
01:40:36.900
safer technology. And patients love it when they come into a hospital and they hear, hey, we do this,
01:40:42.520
this, and this for safety. The fundamental problem in healthcare is that we have non-competitive
01:40:46.600
markets. And the hospitals are competing basically on billboards and NFL advertisements and not on
01:40:54.220
quality and safety. And so now with more public reporting, that is starting to change. When I wrote
01:41:00.120
Unaccountable, gosh, 10 years ago (it's since turned into the TV show The Resident), I called for public
01:41:06.380
reporting of sentinel events and other infection rates and complication rates and readmission rates.
01:41:12.340
And much of the medical establishment said, no way, this absolutely will never and should never
01:41:19.600
happen. Now we accept it. Nobody challenges or questions it. We have public reporting of those
01:41:25.100
adverse events. And when readmission rates became publicly reported, guess what happened to them?
01:41:30.660
They plummeted across the board because hospitals went to their doctors and nurses and said,
01:41:36.360
what do you need to ensure that your patients don't bounce back after you discharge them?
01:41:42.340
And we started having discharge coordinators and clear instruction sheets written at a sixth grade
01:41:48.000
English level. So it's mixed. In some areas, we haven't made much improvement. In other areas,
01:41:54.360
we do see an army of people now dedicated to quality and safety that we never saw before.
01:42:02.100
Well, Marty, I guess we'll be cautiously optimistic here. But I really am, as are, I think,
01:42:07.880
many people in the medical community, deeply troubled by what took place in Tennessee at all
01:42:13.900
levels, at the level of the nursing board, at the level of the hospital, and certainly at the level
01:42:16.920
of the DA. I think it's all a bad precedent, if your objective function is to improve outcomes.
01:42:25.540
But yeah, it was a tragedy. The silver lining is the groundswell of opposition to what happened
01:42:33.300
to her is encouraging. And I hope people keep speaking up about this case.
01:42:38.120
Yeah, likewise. All right, Marty. Well, thank you very much for this very last minute,
01:42:42.060
quick turnaround podcast that I thought was quite timely.
01:42:47.580
Thank you for listening to this week's episode of The Drive. If you're interested in diving deeper
01:42:51.720
into any topics we discuss, we've created a membership program that allows us to bring
01:42:55.980
you more in-depth, exclusive content without relying on paid ads. It's our goal to ensure
01:43:01.060
members get back much more than the price of the subscription. Now, to that end, membership
01:43:05.840
benefits include a bunch of things. One, totally kick-ass comprehensive podcast show notes that
01:43:11.240
detail every topic, paper, person, thing we discuss on each episode. The word on the street
01:43:16.160
is nobody's show notes rival these. Monthly AMA episodes, or ask me anything episodes, hearing
01:43:22.240
these episodes completely. Access to our private podcast feed that allows you to hear everything
01:43:27.980
without having to listen to spiels like this. The qualies, which are a super short podcast that we
01:43:33.600
release every Tuesday through Friday, highlighting the best questions, topics, and tactics discussed
01:43:38.220
on previous episodes of The Drive. This is a great way to catch up on previous episodes without
01:43:43.520
having to go back and necessarily listen to every one. Steep discounts on products that I
01:43:48.860
believe in, but for which I'm not getting paid to endorse, and a whole bunch of other benefits that
01:43:53.420
we continue to trickle in as time goes on. If you want to learn more and access these member-only
01:43:58.040
benefits, you can head over to peteratiamd.com forward slash subscribe. You can find me on Twitter,
01:44:04.800
Instagram, and Facebook, all with the ID peteratiamd. You can also leave us a review on Apple Podcasts
01:44:11.600
or whatever podcast player you listen on. This podcast is for general informational purposes
01:44:16.980
only and does not constitute the practice of medicine, nursing, or other professional healthcare
01:44:21.360
services, including the giving of medical advice. No doctor-patient relationship is formed. The use
01:44:28.040
of this information and the materials linked to this podcast is at the user's own risk. The content on
01:44:34.160
this podcast is not intended to be a substitute for professional medical advice, diagnosis, or treatment.
01:44:40.060
Users should not disregard or delay in obtaining medical advice for any medical condition they
01:44:46.840
have, and they should seek the assistance of their healthcare professionals for any such conditions.
01:44:52.840
Finally, I take conflicts of interest very seriously. For all of my disclosures and the companies I invest
01:44:58.080
in or advise, please visit peteratiamd.com forward slash about where I keep an up-to-date and active list