#113: Normative errors—a conversation with my daughter about current events
Episode Stats
Words per Minute
185.3
Summary
A special episode that deals with the aftermath and some of the questions surrounding us in the wake of the brutal death of George Floyd, and how we can begin to address them. In this episode, I sit down with my daughter, Olivia, to talk about racism and what it means to be black in America.
Transcript
00:00:00.000
Hey folks, welcome to a special episode that deals with kind of the aftermath and some of
00:00:04.260
the questions that are surrounding us in the wake of the brutal death of George Floyd.
00:00:09.320
A lot of people have been speaking about this in their own way, choosing to post various
00:00:14.140
things on social media. I've never really thought I had much to offer with respect to doing stuff like that.
00:00:19.040
For me, it's been a very difficult couple of days kind of just thinking about this stuff
00:00:22.200
and not really knowing how to explain it to my daughter who has been asking a lot of questions
00:00:28.940
about it. And I kind of wrestled with this idea of doing a podcast or not. And in the end,
00:00:33.540
I sort of decided this evening after dinner that, you know, I was going to do this. So
00:00:36.940
Olivia and I sat down for maybe 20 minutes or something to talk about stuff that we have been
00:00:41.580
talking about all week. And I really wanted to talk more in terms of something that I had some
00:00:48.720
knowledge of. I don't consider myself an expert in law enforcement, racism, the history of police
00:00:54.340
brutality, all of the things that, you know, might factor into this. But there is one area
00:00:59.000
that I do think I have some knowledge in that I can speak to. And it has to do with the training of
00:01:04.300
physicians. And there's something I've thought a lot about. So I approach this conversation with
00:01:09.980
Olivia as I often approach conversations with her, which is sort of touching on a framework
00:01:14.200
and an understanding that maybe is broader than this problem. As with all podcasts,
00:01:19.340
of course, I can't address all things. And that's more true in this podcast than
00:01:22.900
probably any podcast I'll ever produce. But I do talk about one very particular way that I
00:01:29.240
think about what's going on and perhaps more importantly, how we can start to get a handle
00:01:34.160
on it. This podcast might not be for you. If it doesn't resonate with you, I apologize. But
00:01:38.520
I think that for some folks, especially folks with kids who are struggling to kind of explain to them
00:01:43.480
what's going on, perhaps you'll find this helpful.
00:01:50.100
Hi, Olivia. Thanks for sitting down to talk for a few minutes tonight.
00:01:52.900
Yeah. So there's been a lot going on for the last few days. And I know you're in the middle of
00:01:57.860
finals. And I know that if that weren't enough, you've got a quarantine going on that you're
00:02:02.680
probably kind of at your wits end with. But tell me a little bit about how you feel about the last
00:02:07.760
few days and how you've heard about it and what you and your friends are talking about.
00:02:12.660
At first, I was seeing it on everyone's Instagram, Justice for George. And I was confused. I was like,
00:02:17.700
what is that? And then I started looking it up and I was like, whoa, this just happened.
00:02:22.660
And then I started reading articles about it to see what happened. And this is the first time that
00:02:28.680
I've actually heard about something like this that's happened recently. Because the only other times I've
00:02:34.320
heard about it is when I was younger and I don't remember it. But everyone that I text is sending like
00:02:40.600
Black Lives Matter and they're all doing different color fists. And I think that's great that they're
00:02:45.240
noticing what's happening. And I just don't understand how this could happen. Because like,
00:02:50.400
would people judge you about the color of your eyes? Like, what's the difference with skin color?
00:02:54.440
I mean, it's a very complicated question, Olivia. And it goes back hundreds of years. And you know how
00:03:00.400
the other day we were talking about all the different things in school that I'm really looking
00:03:04.460
forward to you learning in greater detail? Well, one of those things is history. And I don't think
00:03:11.120
you've yet really fully understood the history of how Black people came to this country and what terms
00:03:19.460
that was under and what conditions of racism produced it and allowed it to continue for literally
00:03:28.900
hundreds of years. And I hope that when you get to studying history, you will understand that it's not
00:03:36.300
just about taking tests and passing tests and getting answers or writing essays, but it's about
00:03:40.700
really understanding the systemic nature of racism. I want to tell you an interesting story that I've
00:03:47.200
never told you before. I don't think I've ever told mom this story either. So I kind of have slightly
00:03:52.240
different skin color, right? Like if you look at me, you know, I'm not perfectly white. I'm not black.
00:03:57.220
I mean, I kind of look like I'm something, right? You also have slightly darker skin
00:04:03.080
as a result of that. So when I was in college, which was in an area that was pretty much exclusively
00:04:10.440
white people. One day, it was 1995, I remember this very well, I was riding my bike
00:04:17.520
to the gym. I used to go to this gym a couple of miles from my house and I'm riding my bike
00:04:22.140
and you're supposed to ride your bike on the right side of the road. But like, I don't know,
00:04:27.660
I forget. I think there was like a lot of water and snow on the ground. So I rode up on the sidewalk
00:04:31.980
and a police officer pulled his car right in front of me and yanked me off my bike and threw
00:04:40.460
me onto the grass right in front of where I went to the gym. This all took place right in front of
00:04:46.680
the gym that I went to. And I basically stood up and said, what the F man? And he said, were you in
00:04:54.360
such and such a place at such and such a time? And I'm like, what are you talking about? He goes,
00:04:58.480
you can't ride your bike here. And I was like, totally confused by what was going on. He kept
00:05:02.620
going back and forth between, I'm not allowed to ride my bike on the sidewalk. And then sort of
00:05:06.760
making these accusations. And he said, an older woman was mugged near here like yesterday or something
00:05:14.960
like that. And it was really clear at that point, what he was basically saying, which was you did this
00:05:20.860
and I can't prove it, but that's my suspicion. I don't like the way you look. And the fact that
00:05:27.240
you're riding your bike on a sidewalk where you shouldn't be riding it is my chance to make that
00:05:32.880
case. Eventually a whole bunch of people came out from the gym, apparently including somebody who was
00:05:39.480
kind of related to him. It was a small town that I went to college in and he basically let me walk
00:05:45.240
away. And I was pretty shaken up about that. That was super upsetting to me. And of course, to put
00:05:52.040
that in the perspective of what we're talking about today, that's like one, one millionth of the type
00:05:57.760
of racial profiling and racial discrimination that is experienced by black people in the United States
00:06:04.100
and elsewhere, but certainly in the United States on a daily basis. Something that, I mean, you just
00:06:09.780
won't understand, nor will I. Why do you think he did that? I think it's a combination of factors
00:06:14.840
in my example. I think certainly racism played a role. I don't believe that he would have done
00:06:20.740
that to anybody else necessarily. But I also think in that case, there was some indication that he
00:06:26.920
legitimately felt like, hey, this is a guy, there's a kid who's riding on the wrong side of the road or
00:06:32.660
up on the sidewalk and he shouldn't be. And oh, by the way, I have reason to suspect that there's
00:06:37.280
some bad apple somewhere out here and this is what I'm going to do. And it's hard to imagine
00:06:44.560
that times a thousand. And that's what's going on in the case of someone like George Floyd or
00:06:52.860
any one of the other people whose names we don't know. Remember, the reason we know about this,
00:06:57.880
Olivia, is it's caught on video. Yeah. There can be so many other people just like this that we
00:07:02.540
don't know about. Well, that's exactly right. And you have to think about, imagine an era when you
00:07:06.520
didn't have cell phones that we could capture anything and everything by video. How many times do
00:07:10.820
you think this has already happened? Probably more than we can count. Yeah. You would need
00:07:15.220
scientific notation to count the number of times this has happened. So when I was reading the article
00:07:21.580
where I started to learn about this, I saw, so when George Floyd was in the deli and he gave a check
00:07:29.820
that people thought was fake, I saw a comment in the article that said, one time I went to Home Depot
00:07:36.520
and I used a fake $20 bill and they knew it was fake and they gave it back to me and let me just
00:07:41.540
walk out because I was white. Yeah. There's absolutely no question that a white person and
00:07:47.040
a black person have a completely different interaction with, frankly, law enforcement,
00:07:53.040
the criminal justice system, all of these things. I'm glad to see how much this is upsetting you.
00:07:58.780
As much as I hate to see a kid upset,
00:08:03.500
it should upset you. It should upset you greatly. Well, why do you think that police officers do
00:08:08.640
this? I mean, the only difference is that some people have darker skin, some people have lighter
00:08:13.480
skin. Why, why does that make them more likely to get killed or hurt? I mean, it comes down to
00:08:19.400
racism. This is what we call systemic racism. And so the real question is, why is it tolerated
00:08:27.280
in the police force? Yeah. Shouldn't there be some sort of, I don't know how to put this,
00:08:33.200
but like a test to see if you're not racist before they let you on to be a police officer? Because it's
00:08:39.740
not fair that innocent people are getting killed because of this. Yeah. I mean, I've been thinking
00:08:43.760
about this a lot, Olivia. I've seen some people who are very thoughtful make comments like,
00:08:48.060
hey, look, there are very few protesters that are actually looting and there are very few police
00:08:53.720
officers that are going to do what this guy did. It's mostly good police officers and mostly good
00:08:59.300
protesters. And they, you know, we're focusing most of our attention on the worst of both groups.
00:09:04.740
There may be some truth to that, but a counter argument to that is, isn't law enforcement
00:09:10.760
a profession in which we can't tolerate any of this? Don't we need a zero tolerance policy for this?
00:09:17.020
Would you want to fly on airplanes? I think Chris Rock used this analogy. Would you want to fly on
00:09:23.280
airplanes where 99% of the pilots were good and 1% were horrible? No, you wouldn't because that 1%
00:09:30.260
could make a difference that they're horrible. Yeah. It's a totally unacceptable paradigm. So
00:09:35.880
I've been thinking about this a little bit and I, and I actually realized that I don't know what the
00:09:40.920
analogy is and I don't know if that, you know, airlines are the right analogy, but I think a good
00:09:45.220
analogy is actually doctors. Comparing doctors to police officers is better than trying to compare
00:09:50.240
police officers to pilots because, well, one, I think there are probably more interactions that
00:10:01.240
are interpersonal that are the ones that we're really talking about. You know, I think that's
00:10:05.140
probably the most different thing. And when you, when you talk about a pilot being ethical or not
00:10:09.820
ethical, moral or amoral, and of course, by the way, there are examples of pilots that have done very
00:10:15.220
bad things. There are pilots that have crashed airplanes deliberately, but the frequency of that
00:10:19.760
is so incredibly low. But in medicine, there really are some bad doctors out there. And I don't mean
00:10:25.340
bad doctors that aren't smart. I mean, doctors that do horrible things that abuse patients. There's a very
00:10:33.520
famous doctor. He was the doctor of the U.S. gymnastics team, and he abused many of the gymnasts. I mean,
00:10:40.840
these are despicable doctors. And so I wonder if that's a better analogy. And in medicine,
00:10:46.020
there are basically three types of mistakes you can make. And I think of this as a very important
00:10:50.500
way to think about how you train people. There are technical mistakes. There are judgment mistakes.
00:10:57.600
There are normative mistakes. Do any of those words mean anything to you? Normative mistakes that can
00:11:03.820
mean like mistakes that you don't mean to cause? No, actually that's not what normative is. So let me
00:11:09.080
explain what these are. So a technical mistake is like, if you think about surgery, which is the
00:11:14.360
easiest place to explain it, a technical error is you mean to do something, but you make a mistake.
00:11:20.240
You cut too much, you cut too little, you poke too much, you poke too little. The patient is hurt as a
00:11:25.620
result of a technical error that you make. I've made many technical errors in my residency when I was
00:11:32.480
training. You know, I remember once doing an operation on a patient and I cut the bile duct by
00:11:37.800
accident, you know, when I didn't mean to. Do we forgive people who make
00:11:43.400
those kinds of mistakes? Yeah. Cause it wasn't, they didn't want to do that. That's true. But
00:11:48.400
there's a bigger and a more important point there. We forgive them provided that they own the
00:11:54.960
mistake immediately. They can never try to cover it up. They can't deny it. They have to own up to
00:12:00.260
their mistake and not hide the fact that they made a mistake. That's right. The other
00:12:05.420
condition to that is you must learn from your mistakes and not make the same mistakes over and
00:12:10.960
over again. So if I cut the bile duct every single week, well, that would be a problem,
00:12:15.600
right? If I'm putting a central line in a patient, which is a special IV that goes into their neck
00:12:21.160
and I constantly poke their lung, which causes a huge catastrophe of problems.
00:12:27.240
And even if I admit to it every time, but never get better, that is a problem.
00:12:30.660
That's also not helpful. Even if you do admit it, that's still not going to make it better.
00:12:34.480
Yeah. At some point you just have to decide you're, maybe you're not technically capable
00:12:37.480
of doing this job. So that's a technical error. Then there's an error of judgment.
00:12:42.940
Judgment errors are a little harder to understand, but it basically says, should I, or should I not
00:12:47.480
operate on this patient now? Should I, or should I not change this medication now? Should I, or should
00:12:53.640
I not admit this patient to the hospital now or send this patient home or call in another doctor to
00:12:58.380
help consult with this? Now, again, you can make lots of mistakes with respect to judgment. Are
00:13:04.340
they forgivable? They can be as long as you own up to what you did. And you don't repeat them. And you
00:13:10.180
don't repeat them. Okay. Now enter normative errors. Normative errors are the third class of error in
00:13:17.220
training physicians. These are errors of character. This is when you lie about something. Let me give you
00:13:27.060
examples that I witnessed firsthand. You go in to see a patient in the emergency room who comes in
00:13:33.280
with an infection and you give them an antibiotic. You don't ask them if they're allergic to the
00:13:39.040
antibiotic. They have an allergic reaction to the antibiotic. You go back and change the form where it
00:13:45.900
says, what was their allergy to the antibiotic? And you write in, oh, they had a penicillin allergy,
00:13:50.440
even though you didn't ask them. Seems like a little error, doesn't it?
00:13:57.400
Because when you lie about something that you did like that, that just makes it worse.
00:14:02.160
Well, what if in that case, the patient was fine? You gave them penicillin. They had an allergy to
00:14:06.860
the penicillin. You gave them the appropriate Benadryl or epinephrine or whatever they needed
00:14:11.280
to not die from it. And they're totally fine. But you went back and you changed your form.
00:14:19.680
Well, you have to live with the fact that you did that. And you could have hurt them if you didn't
00:14:24.120
That's true. And there's an even deeper issue, which is people who consistently make those kinds
00:14:30.300
of mistakes or frankly, make those kinds of mistakes at all, will go on to make bigger and
00:14:36.680
That's right. They make bigger and bigger versions of those mistakes. And so a good training system in
00:14:44.160
medicine is one that fosters and encourages open and honest discussion of technical errors
00:14:51.040
and judgment errors, but it immediately identifies normative errors and kicks those
00:14:57.420
people out of programs. And not just kicks those people out of programs, kicks those people out
00:15:02.200
of medicine. If a person commits a normative error and they are kicked out of one program only to be
00:15:08.740
picked up by another program, that hasn't really solved the problem.
00:15:11.880
They can just still make more mistakes like that.
00:15:13.780
That's right. They're still going to make their way into the field and the practice of medicine.
00:15:17.200
And so as I think about these awful things that have happened and I start to think about what can
00:15:23.300
I do about it? Well, the short answer is I have no clue. I know what to do in terms of voting and
00:15:28.620
I know what to do in terms of protesting and I know what to do with those things. But the thing that
00:15:32.680
I think a lot about is how do you change the training? How do you identify systemic racism in
00:15:39.420
training? And to me, it comes down to basically identifying traits that are prevalent in ways
00:15:47.200
that are going to be amplified later on. Does that make sense? In other words, we should be
00:15:51.980
treating racism as a normative error. And when it shows up in even the smallest ways, there has to be
00:15:59.260
zero tolerance. Just as we wouldn't tolerate a doctor who says, oh, yeah, I did ask
00:16:07.740
that patient if they had an allergy and they said nothing. I don't know what they're talking about.
00:16:10.880
I can't believe they had a penicillin allergy. You have to be very careful. And I know it sounds
00:16:14.420
like a silly example, but Olivia, I've seen lots of people make those normative errors where they lie
00:16:20.580
about things. They put their own interests ahead of a patient's interest. Another example,
00:16:26.340
I've seen doctors do it financially. For example, there are two drugs that you could give a patient
00:16:34.040
and the doctor has a financial interest to use one instead of the other. And they use the more
00:16:39.780
expensive one because they have a financial interest in that. Again, it's another example
00:16:44.880
of a normative error. So I don't know if I answered some of your questions because you've
00:16:51.760
been asking me so many great questions in the last couple of days, but I think that somewhere down
00:16:57.180
the line, people a lot smarter than me have to understand what the normative equivalent looks
00:17:04.400
like to the type of racism that can lead to a police officer putting his knee on the neck of a
00:17:12.160
black man for nearly nine minutes for no apparent reason. Cause that can't have been the first time he has
00:17:20.980
acted in a racist way. That guy's been a racist for a long time. Why didn't we figure that out?
00:17:27.420
Yeah. What's the norm? That's the guy who is so far down the road that he's cutting off
00:17:32.460
the patient's wrong leg out of sheer sloppiness. Then he should be more than just fired for doing
00:17:40.020
that. Cause if this is repetitive, that's not right. Yeah. And that's my point. It's not even right
00:17:45.340
to do it at one time, but multiple times. Well, my point is how do we know what was seen the
00:17:50.660
first time? I mean, at some point, I mean, not to get too far down this path, you have to have a
00:17:55.820
self policing system. And this is not true in most systems. And so I'm not, I'm not singling out
00:18:02.180
police officers here because I don't think doctors like to police other doctors any more than police
00:18:08.520
officers like to police other, you know, police themselves or teachers want to police teachers.
00:18:13.340
But remember, racism exists in a lot of professions. It exists in medicine. It exists in justice. It
00:18:21.740
exists in education. It exists everywhere. It certainly exists in the military. It's just the
00:18:27.620
stakes are so high in law enforcement that that's why we're seeing it the way we're seeing it here.
00:18:34.880
And I think the point I want to make tonight, as we sit here and try to make sense of this is
00:18:39.520
I hope that there's a way that we can take this idea of normative errors as a tool that we can use
00:18:48.380
to identify people who are making mistakes that are acceptable because they can be learned from and
00:18:53.860
improved upon versus unacceptable mistakes and quickly getting those people out of the system.
00:19:00.460
It's one thing to say we may never completely eradicate racism. It's quite another thing to say
00:19:05.520
we're going to tolerate racist police officers. You see the difference? The stakes are much higher.
00:19:11.020
The biggest question is what can we even do about this?
00:19:14.480
There is a video going around on this, probably one of the best I've seen. I'm going to show it to
00:19:20.800
you later today. It's by this rapper and activist, Killer Mike. And he was at a press conference and he
00:19:27.980
gave a very eloquent, passionate speech about what exactly you can and can't do. And it was like, hey,
00:19:36.320
rioting, looting, that's not the thing to do. Yes, I know you're angry. Yes, you have every reason to be
00:19:42.560
upset, but that's not the productive thing to do. The productive thing to do is not just march and not
00:19:49.120
just send your fist emoji to your friend. It's actually do something political. It's change the
00:19:58.600
system, right? It's vote out the people who allow this type of racism to proliferate. That's one
00:20:06.040
example. I thought that was a very elegant example. Have you guys talked about this in school? Like
00:20:11.320
actually formally in any of your classes? For some reason, teachers don't talk about politics like this
00:20:17.480
in school, but I wish they did because I want to know more about it. That's so crazy to me.
00:20:22.440
Your teachers have not said anything about this? No. What do you guys think about that? As students,
00:20:28.580
do you talk about it amongst yourselves and say, why are they not saying anything? Yeah, all these group
00:20:33.620
chats in my grade are blowing up right now, but the teachers don't really do anything. Hmm. Well,
00:20:40.240
I don't know what to say about that, but I'm glad that you and I get to talk about it. Yeah.
00:20:43.280
Thank you. All right. Well, thanks for making time to sit down tonight. I know you got to get back to
00:20:49.120
studying. Bye. Thank you for listening to this week's episode of The Drive. This podcast is for
00:20:55.980
general informational purposes only and does not constitute the practice of medicine, nursing, or
00:21:00.720
other professional healthcare services, including the giving of medical advice. No doctor-patient
00:21:06.660
relationship is formed. The use of this information and the materials linked to this podcast
00:21:11.420
is at the user's own risk. The content on this podcast is not intended to be a substitute for
00:21:17.460
professional medical advice, diagnosis, or treatment. Users should not disregard or delay in
00:21:24.120
obtaining medical advice for any medical condition they have, and they should seek the assistance of
00:21:29.500
their healthcare professionals for any such conditions. Finally, I take conflicts of interest very
00:21:35.580
seriously. For all of my disclosures and the companies I invest in or advise, please visit
00:21:41.180
peteratiamd.com forward slash about where I keep an up-to-date and active list of such companies.